A new study from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) demonstrates a novel approach to training AI models that significantly reduces computational costs and environmental impact. The method, called “Lazy Training,” allows neural networks to learn effectively with far fewer parameters active during training, challenging the conventional wisdom that bigger, denser models are always better. The researchers found that by strategically keeping large portions of the network dormant, they could match the accuracy of fully dense models on several benchmark tasks while using a fraction of the energy. This breakthrough could pave the way for more efficient and sustainable AI development, especially as model sizes continue to grow exponentially. Full details are available in the published paper. Read the full article at https://technologyreview.com/2024/03/15/ai-lazy-training-breakthrough.
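The article does not spell out how the dormant parameters are handled, but the general idea of training with most of a network inactive can be sketched in a few lines. The snippet below is an illustrative assumption, not the paper’s actual method: it freezes a random 75% of each weight tensor by zeroing those entries’ gradients before every optimizer step, so only the remaining 25% ever updates. The model, mask ratio, and dummy data are all hypothetical.

```python
# Minimal sketch (assumed, not the CSAIL method): keep most weights
# "dormant" by masking their gradients so they never receive updates.
import torch
import torch.nn as nn

torch.manual_seed(0)

model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))

# Hypothetical choice: 25% of each weight tensor stays trainable,
# the other 75% is frozen at its initial value.
masks = {name: (torch.rand_like(p) < 0.25).float()
         for name, p in model.named_parameters()}

optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(16, 32)            # dummy input batch
y = torch.randint(0, 10, (16,))    # dummy labels

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    # Zero the gradients of dormant parameters before the update,
    # so the optimizer only touches the active subset.
    for name, p in model.named_parameters():
        p.grad.mul_(masks[name])
    optimizer.step()
```

Masking gradients like this leaves the frozen weights in place rather than pruning them, which is one plausible way a network could stay “dormant” during training while remaining fully dense at inference; whether the study uses masking, pruning, or another sparsity scheme is not stated in the summary above.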