A new study from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) demonstrates a novel approach to training AI models that significantly reduces computational costs and energy consumption. The method, called ‘Lazy Training,’ selectively updates only the most critical neurons within a neural network during the learning process, rather than adjusting all parameters. This targeted approach maintains model accuracy while cutting training time and resource use by up to 50% in initial tests. The research, published in Science Advances, suggests this technique could make advanced AI development more accessible and environmentally sustainable. Read the full article at https://technologyreview.com/2024/05/15/lazy-training-ai-models.
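The study's details aside, the general idea of updating only the highest-impact parameters can be illustrated with a minimal sketch. The function below is a hypothetical example (not the CSAIL method itself, which is described in the paper): it applies a gradient step only to the fraction of parameters with the largest gradient magnitudes, leaving the rest untouched. The name `sparse_update` and the `top_frac` parameter are assumptions for illustration.

```python
import numpy as np

def sparse_update(params, grads, lr=0.1, top_frac=0.1):
    """Illustrative sketch of a selective update: step only the
    top_frac fraction of parameters with the largest gradient
    magnitudes; all other parameters are left unchanged."""
    flat = np.abs(grads).ravel()
    k = max(1, int(top_frac * flat.size))
    # Threshold = k-th largest gradient magnitude
    thresh = np.partition(flat, -k)[-k]
    mask = np.abs(grads) >= thresh
    return params - lr * grads * mask

# Toy usage: only the two largest-gradient entries move
params = np.zeros(10)
grads = np.arange(10, dtype=float)  # gradient magnitudes 0..9
updated = sparse_update(params, grads, lr=1.0, top_frac=0.2)
```

Skipping updates for low-gradient parameters is what saves compute: the optimizer touches a small slice of the weight matrix each step instead of the whole network.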