A new study from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) demonstrates a novel approach to training AI models that significantly reduces energy consumption. The method, called ‘Lazy Training,’ selectively updates only the most critical parts of a neural network during the learning process, rather than the entire model. Researchers found this technique could cut the computational cost of training large language models by up to 80% without sacrificing final performance. The breakthrough addresses growing concerns about the massive carbon footprint associated with developing advanced AI systems. The team suggests this could make AI research more sustainable and accessible to organizations with limited resources. For the full details, read the complete article at https://technologyreview.com/2024/05/15/1095671/mit-ai-training-energy-efficient-lazy-method/.