
News Summary

A new study from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) demonstrates a novel approach to AI training that significantly reduces energy consumption. The method, called ‘Lazy Training,’ selectively updates only the most critical parts of a neural network during learning, rather than the entire model. The researchers found the technique could cut the computational cost, and the associated carbon footprint, of training large models by up to 80% without sacrificing final model performance. The result addresses growing concerns about the environmental impact of the massive data centers required for modern AI development. The full research paper, ‘Efficient Neural Network Training via Adaptive Parameter Update,’ is available for review. Read the full article at: https://technologyreview.com/2024/05/15/ai-training-energy-efficient-lazy-method
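To make the idea concrete, here is a minimal sketch of selective parameter updating in PyTorch. The summary above doesn't describe the paper's actual adaptive criterion, so this sketch assumes a simple stand-in: rank parameter tensors by mean gradient magnitude and apply a plain SGD step only to the top fraction. The function name `lazy_update` and the `update_fraction` knob are hypothetical, not taken from the paper.

```python
import torch
import torch.nn as nn

def lazy_update(model: nn.Module, loss: torch.Tensor,
                lr: float = 1e-3, update_fraction: float = 0.2) -> None:
    """Illustrative sketch: update only the parameter tensors whose
    gradients are largest, leaving the rest of the model untouched.
    The selection criterion here (mean absolute gradient) is an
    assumption for illustration, not the paper's method."""
    model.zero_grad()
    loss.backward()

    # Score each parameter tensor by the average magnitude of its gradient.
    params = [p for p in model.parameters() if p.grad is not None]
    scores = [p.grad.abs().mean().item() for p in params]

    # Keep only the highest-scoring fraction of tensors (at least one).
    k = max(1, int(len(params) * update_fraction))
    top = sorted(range(len(params)), key=lambda i: scores[i], reverse=True)[:k]

    # Apply a plain SGD step to the selected tensors only.
    with torch.no_grad():
        for i in top:
            params[i] -= lr * params[i].grad

# Example usage on a toy regression model:
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
x, y = torch.randn(64, 10), torch.randn(64, 1)
loss = nn.functional.mse_loss(model(x), y)
lazy_update(model, loss)
```

The potential saving comes from skipping the update (and, in a full implementation, the optimizer state and some of the backward pass) for the untouched parameters; how the critical subset is chosen is exactly what the paper's adaptive method would determine.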

