
Your Bi-Weekly Dose Of Everything Optimism

News Summary

A new study from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) demonstrates a significant advancement in making AI systems more energy-efficient. The research focuses on reducing the computational power required for large language models (LLMs) during the inference phase, which is when the trained model generates responses. The team developed a method that selectively activates only the necessary parts of a neural network for a given task, rather than running the entire model. This approach, tested on models with hundreds of billions of parameters, reportedly achieved similar performance to standard models while using substantially less energy. The findings suggest a pathway toward more sustainable and cost-effective deployment of powerful AI. Read the full article at https://technologyreview.com/2024/05/15/energy-efficient-ai-method/
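The article does not detail how the MIT team's selective-activation method works, but the general idea it describes — running only the parts of a network relevant to a given input, rather than the whole model — is commonly implemented with mixture-of-experts-style routing. The sketch below is a minimal, hypothetical illustration of that general technique, not the CSAIL method itself; the expert functions and gate scores are toy stand-ins for sub-networks and a learned router.

```python
# Hedged sketch: conditional computation via top-k expert gating.
# Only the highest-scoring "experts" (stand-ins for sub-networks) run,
# so compute scales with k rather than with the total number of experts.

def run_experts(x, experts, gate_scores, k=1):
    """Apply only the top-k scoring experts to input x, skipping the rest."""
    # Indices of the k highest gate scores (the router's selection).
    top = sorted(range(len(experts)),
                 key=lambda i: gate_scores[i], reverse=True)[:k]
    total = sum(gate_scores[i] for i in top)
    # Weighted combination of only the selected experts' outputs;
    # unselected experts are never evaluated.
    return sum(gate_scores[i] / total * experts[i](x) for i in top)

# Toy experts: simple scalar functions standing in for neural sub-modules.
experts = [lambda x: 2 * x, lambda x: x + 10, lambda x: -x]
gate_scores = [0.1, 0.8, 0.1]  # hypothetical router output for this input

y = run_experts(5.0, experts, gate_scores, k=1)
print(y)  # only the highest-scoring expert runs: 5 + 10 = 15.0
```

With hundreds of billions of parameters split across many such experts, routing each input to a small subset is one way the reported energy savings could arise, since most of the model's weights are never touched for any single request.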


Technology Review
