
Your Bi-Weekly Dose Of Everything Optimism

News Summary


A new study from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) demonstrates a significant advancement in making AI systems more energy-efficient. Researchers developed a method that allows large language models (LLMs) to generate their own training data, a process called ‘self-training,’ which reduces the need for vast, human-created datasets. This approach not only cuts down on the immense computational power and associated carbon emissions typically required for training but also maintains, and in some cases improves, model performance on reasoning tasks. The technique could pave the way for more sustainable development of powerful AI models without compromising their capabilities. Read the full article at: https://technologyreview.com/2024/07/15/1094750/ai-models-can-teach-themselves-to-be-more-energy-efficient/


Technology Review

