
Your Bi-Weekly Dose Of Everything Optimism

News Summary

A new study from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) demonstrates a significant advancement in making AI systems more energy-efficient. The research focuses on reducing the computational power required for large language models (LLMs) during the inference phase, which is when the trained model generates responses. The team developed a method that selectively uses only the necessary parts, or ‘layers,’ of a model for a given query, bypassing others. This approach, tested on models with up to 13 billion parameters, achieved performance comparable to standard models while reducing energy consumption by over 50% in some cases. The technique could make running powerful AI models more feasible on smaller devices and data centers, potentially lowering costs and environmental impact. Read the full article at https://technologyreview.com/2024/07/10/1094775/mit-ai-energy-efficiency-skip-layers.
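The article doesn't describe the gating rule the CSAIL team used to decide which layers to bypass, but the general idea of per-query layer skipping can be illustrated with a toy sketch. The model, dimensions, and threshold-based gate below are all hypothetical stand-ins, not the paper's method: each "layer" is a residual update, and a layer is skipped when its contribution to the hidden state looks negligible.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy stack of residual layers; the real study's models
# have billions of parameters and a learned skipping criterion.
DIM, N_LAYERS = 8, 6
layers = [rng.standard_normal((DIM, DIM)) / np.sqrt(DIM) for _ in range(N_LAYERS)]

def forward(x, skip_threshold=None):
    """Run the stack. If skip_threshold is set, bypass layers whose
    residual update is small -- a stand-in gating heuristic."""
    used = 0
    for W in layers:
        update = np.tanh(W @ x)
        # Gate: skip this layer when its contribution is negligible,
        # saving the compute the layer would otherwise cost.
        if skip_threshold is not None and np.linalg.norm(update) < skip_threshold:
            continue
        x = x + update  # residual connection, as in transformer blocks
        used += 1
    return x, used

x = rng.standard_normal(DIM)
full_out, n_full = forward(x)                      # every layer runs
gated_out, n_gated = forward(x, skip_threshold=1.5)  # some layers bypassed
print(n_full, n_gated)
```

The energy savings come from `used` being smaller than `N_LAYERS` for easy queries: fewer matrix multiplications per response means less computation during inference, which is where the study reports reductions of over 50% in some cases.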


Technology Review
