A new study from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) demonstrates a novel approach to training AI models that significantly reduces computational costs and environmental impact. The method, called ‘Lazy Learning,’ focuses on training only the most relevant parts of a neural network for a given task, rather than the entire model. Researchers found this technique could cut training costs by up to 80% while maintaining high accuracy on benchmark tests. The approach challenges the prevailing trend of building ever-larger models and suggests a more efficient path forward for AI development. The full details of the research are available in the published paper at https://technologyreview.com/2024/05/15/lazy-learning-ai.
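The paper's exact mechanism isn't detailed in this summary, but the core idea of training only the relevant parts of a model can be sketched generically. The toy example below (hypothetical, not CSAIL's actual method) freezes most of a linear model's weights and applies gradient updates only to a chosen "relevant" subset, illustrating how selective training can recover the important parameters while leaving the rest untouched:

```python
import numpy as np

# Hypothetical sketch of selective ("lazy") training: compute the full
# gradient, but update only a masked subset of parameters.
rng = np.random.default_rng(0)

# Toy linear regression where only 3 of 10 features matter.
X = rng.normal(size=(200, 10))
w_true = np.zeros(10)
w_true[:3] = [2.0, -1.0, 0.5]
y = X @ w_true

w = np.zeros(10)
mask = np.zeros(10, dtype=bool)
mask[:3] = True          # assume the relevant subset is known in advance

lr = 0.05
for _ in range(500):
    grad = X.T @ (X @ w - y) / len(y)   # gradient over all weights
    w[mask] -= lr * grad[mask]          # but only the masked slice is trained

# The relevant weights converge; the frozen weights stay at zero.
```

In a real neural network the analogous move is freezing most layers (or parameter groups) and backpropagating through only the task-relevant ones, which is where the compute savings come from.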