A new study from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) demonstrates a novel approach to training AI models that significantly reduces computational costs and energy consumption. The method, called ‘Lazy Training,’ selectively updates only the most critical parts of a neural network during the learning process, rather than the entire model. Researchers found this technique could cut training time and resource use by up to 80% for certain tasks without sacrificing model accuracy. The breakthrough addresses growing concerns about the environmental impact and financial cost of developing large-scale AI systems. The team suggests this approach could make advanced AI research more accessible to organizations with limited computing resources. Read the full article at https://technologyreview.com/2024/05/15/lazy-training-ai-mit.
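For readers curious what selective updating might look like in practice, here is a minimal PyTorch-style sketch. It illustrates the general idea only, not MIT's actual implementation: the function name selective_step, the keep_fraction parameter, and the top-k-by-gradient-magnitude selection criterion are all assumptions made for the sake of the example.

```python
import torch
import torch.nn as nn

def selective_step(model: nn.Module, optimizer: torch.optim.Optimizer,
                   keep_fraction: float = 0.2) -> None:
    """Hypothetical sketch: update only the parameters with the largest
    gradients, zeroing the rest so the optimizer leaves them untouched.
    The top-k criterion is an assumption, not a detail from the study."""
    with torch.no_grad():
        for param in model.parameters():
            if param.grad is None:
                continue
            grad = param.grad
            n = grad.numel()
            k = max(1, int(keep_fraction * n))
            # Threshold at the k-th largest absolute gradient value.
            threshold = grad.abs().flatten().kthvalue(n - k + 1).values
            # Mask out everything below the threshold before stepping.
            grad.mul_((grad.abs() >= threshold).to(grad.dtype))
    optimizer.step()
    optimizer.zero_grad()

# Usage (after the usual forward pass and loss.backward()):
#   selective_step(model, optimizer, keep_fraction=0.2)
```

Because only a fraction of the gradient entries survive the mask, the optimizer's work per step shrinks accordingly; any real savings would of course depend on how the sparsity is exploited by the hardware and framework.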