A new study from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) demonstrates a novel approach to training AI models that significantly reduces their energy consumption and carbon footprint. The method, called ‘Lazy Training,’ selectively activates only the necessary parts of a neural network for a given task, rather than running the entire model. Researchers found this technique could cut energy use by up to 80% during the inference phase without sacrificing accuracy on benchmark tests. The development addresses growing concerns about the environmental impact of large-scale AI, which requires vast amounts of computational power and electricity. The team suggests this could make powerful AI models more accessible and sustainable for widespread deployment. Read the full article at: https://technologyreview.com/2024/05/15/lazy-training-ai-energy-efficiency
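The idea of activating only the parts of a network relevant to an input is broadly known as conditional computation, used for example in mixture-of-experts models. The article does not describe the internals of 'Lazy Training', so the following is only a minimal illustrative sketch of the general principle: a gate scores a set of sub-networks ("experts") and only the top-scoring ones are evaluated, so compute cost scales with the active subset rather than the full model. All function names and the toy gate are hypothetical, not the MIT team's method.

```python
# Illustrative sketch of conditional computation (selective activation).
# Hypothetical names throughout; the gate here is a toy scoring function,
# not the actual mechanism used by the CSAIL researchers.

def gate_scores(x, num_experts):
    """Toy gate: assign each expert a relevance score for input x."""
    return [sum((xi * (e + 1)) % 7 for xi in x) for e in range(num_experts)]

def lazy_forward(x, experts, top_k=1):
    """Evaluate only the top_k highest-scoring experts.

    Experts that are not selected are never called, which is where
    the compute (and energy) savings come from.
    """
    scores = gate_scores(x, len(experts))
    # Indices of the top_k experts by gate score.
    active = sorted(range(len(experts)), key=lambda e: scores[e], reverse=True)[:top_k]
    # Only the active experts run; the rest cost nothing.
    outputs = [experts[e](x) for e in active]
    return sum(outputs) / len(outputs), active

# Usage: four toy experts, but only one is executed per input.
experts = [lambda x, s=s: s * sum(x) for s in range(1, 5)]
output, active = lazy_forward([1, 2, 3], experts, top_k=1)
```

With `top_k=1`, only one of the four experts runs for this input, a 75% reduction in expert evaluations; real systems report savings on the same principle, though the 80% figure in the article refers to the researchers' own benchmarks.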



