A new study from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) demonstrates a novel approach to training AI models that significantly reduces energy consumption. The method, called ‘Lazy Training,’ allows neural networks to learn effectively while keeping many of their parameters in a dormant state, akin to a sparse model. This contrasts with traditional training, which activates the entire network. Early results show the technique can cut energy use by up to 80% during the training phase without sacrificing model accuracy on benchmark tasks. The research addresses growing concerns about the substantial carbon footprint associated with developing large-scale AI systems. For more details, read the full article at https://technologyreview.com/2024/10/15/example-article.
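The article does not spell out the algorithm, but the core idea of keeping most parameters dormant resembles masked sparse training. The following minimal PyTorch sketch illustrates that general idea only; the toy model, the fixed random mask, and the 80% sparsity level are illustrative assumptions, not CSAIL's actual method.

```python
# Hypothetical sketch of sparse ("lazy") training: a fixed binary mask
# keeps most weights dormant (zero and never updated), so only a small
# fraction of the network participates in learning.
import torch
import torch.nn as nn

torch.manual_seed(0)

model = nn.Linear(128, 10)       # toy model standing in for a real network
sparsity = 0.8                   # assumed: ~80% of weights stay dormant
mask = (torch.rand_like(model.weight) > sparsity).float()

# Zero out the dormant weights once before training begins.
with torch.no_grad():
    model.weight.mul_(mask)

optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()
x, y = torch.randn(32, 128), torch.randint(0, 10, (32,))  # dummy batch

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    # Mask the gradients so dormant weights receive no update and
    # remain exactly zero throughout training (with plain SGD).
    model.weight.grad.mul_(mask)
    optimizer.step()
```

In a sketch like this, the compute savings come from never updating the masked weights; real sparse-training systems go further by skipping the masked multiplications entirely on hardware that supports sparsity.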