
News Summary

A new study from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) demonstrates a novel approach to training AI models that significantly reduces energy consumption. The method, called ‘Lazy Training,’ allows neural networks to learn effectively while using only a fraction of the computational power typically required. Researchers found that by carefully initializing the model and selectively updating only the most critical parameters during training, they could achieve comparable accuracy to standard methods while cutting energy use by up to 80%. This advancement addresses growing concerns about the substantial carbon footprint associated with large-scale AI development and could make sophisticated AI tools more accessible by lowering the cost of training. The team’s findings were published in the journal Nature Machine Intelligence. Read the full article at https://technologyreview.com/2024/05/15/lazy-training-ai-energy-study.
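The summary describes two ingredients: careful initialization and selectively updating only the most critical parameters during training. The sketch below is an illustrative assumption, not the paper's actual algorithm: it approximates "selective updating" by applying the gradient step only to the fraction of parameters with the largest gradient magnitudes, leaving the rest untouched to save computation.

```python
# Illustrative sketch only -- NOT the method from the CSAIL paper.
# Selectively update the top `update_fraction` of parameters,
# ranked by gradient magnitude; skip the rest.

def sparse_update(params, grads, lr=0.1, update_fraction=0.2):
    """Apply a gradient step to only the most 'critical' parameters.

    Criticality here is assumed to mean largest absolute gradient.
    """
    k = max(1, int(len(params) * update_fraction))
    # Indices of the k largest-magnitude gradients.
    top = sorted(range(len(grads)), key=lambda i: abs(grads[i]), reverse=True)[:k]
    new_params = list(params)
    for i in top:
        new_params[i] -= lr * grads[i]  # standard SGD step, applied sparsely
    return new_params

params = [0.5, -1.2, 0.3, 2.0, -0.7]
grads = [0.01, -0.9, 0.05, 1.5, -0.02]
# With update_fraction=0.4, only the two largest-gradient parameters
# (indices 3 and 1) are updated; the other three are left as-is.
updated = sparse_update(params, grads, lr=0.1, update_fraction=0.4)
```

In a real training loop, skipping most parameter updates (and the associated gradient computation) is what would translate into the energy savings the study reports; this toy version only shows the selection logic.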


Technology Review
