Vijay Gadepally, a senior staff member at MIT Lincoln Laboratory, leads a number of projects at the Lincoln Laboratory Supercomputing Center (LLSC) to increase the efficiency of computing platforms and of the artificial intelligence systems that run on them. Here, Gadepally discusses the increasing use of generative AI in everyday tools, its hidden environmental impacts, and some of the ways Lincoln Laboratory and the larger AI community can reduce emissions toward a greener future.
Q: What trends are you seeing in how generative AI is used in computing?
A: Generative AI uses machine learning (ML) to create new content, such as images and text, based on data that is input into an ML system. At the LLSC we design and build some of the world’s largest academic computing platforms, and over the past few years the number of projects requiring access to high-performance computing for generative AI has exploded. We’re also seeing how generative AI is changing all sorts of sectors and domains. For example, ChatGPT is already affecting classrooms and workplaces faster than regulations can keep up.
Over the next decade or so, we can imagine all sorts of uses for generative AI, including powering highly capable virtual assistants, developing new drugs and materials, and even improving our understanding of basic science. While we cannot predict everything that generative AI will be used for, we do know that its impact on computing, energy, and climate will continue to grow rapidly as algorithms become increasingly complex.
Q: What strategies does LLSC use to reduce this climate impact?
A: We’re always looking for ways to make computing more efficient. Doing so helps our data centers make the most of their resources and allows our scientific colleagues to push their fields forward as efficiently as possible.
As an example, we’ve reduced the amount of power our hardware consumes by making simple changes, similar to dimming or turning off lights when you leave a room. In one experiment, enforcing a power cap reduced the energy consumption of a group of graphics processing units by 20 to 30 percent, with minimal impact on performance. Capping power also lowers the hardware’s operating temperature, which helps the GPUs run cooler and last longer.
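As a rough illustration of power capping, the sketch below applies a limit to NVIDIA GPUs through the nvidia-smi command-line tool; the 250-watt value and the number of GPUs are hypothetical and not the settings used at the LLSC.

```python
import subprocess

def set_gpu_power_cap(watts: int, gpu_index: int = 0) -> None:
    """Apply a power cap to a single GPU via nvidia-smi (typically requires admin rights)."""
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "--power-limit", str(watts)],
        check=True,
    )

# Hypothetical example: cap four GPUs at 250 W each instead of their default limit.
for idx in range(4):
    set_gpu_power_cap(250, gpu_index=idx)
```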
Another strategy is changing our behavior to be more climate-aware. At home, some of us might choose to use renewable energy sources or intelligent scheduling. We use similar techniques at the LLSC, for example by scheduling AI model training for times of day when temperatures are cooler or when demand on the local power grid is low.
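As a sketch of this kind of scheduling, the loop below delays a training job until the grid’s reported carbon intensity drops below a threshold. The get_grid_carbon_intensity function and the 200 gCO2/kWh cutoff are hypothetical placeholders for whatever telemetry feed and policy a site actually uses.

```python
import time

CARBON_THRESHOLD = 200.0  # hypothetical cutoff, in grams of CO2 per kilowatt-hour

def get_grid_carbon_intensity() -> float:
    """Placeholder for a query to a local grid-telemetry service or regional API."""
    raise NotImplementedError("connect this to your grid operator's data feed")

def wait_for_clean_energy(poll_seconds: int = 900) -> None:
    """Block until the local grid's carbon intensity falls below the threshold."""
    while get_grid_carbon_intensity() > CARBON_THRESHOLD:
        time.sleep(poll_seconds)  # check again in 15 minutes

# Illustrative usage:
# wait_for_clean_energy()
# train_model()  # launch the actual training job once conditions are favorable
```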
We also realized that a lot of the energy spent on computing is often wasted, like a water leak that increases your bill without any benefit to your home. We developed several new techniques that allow us to monitor computing workloads as they run and to terminate those that are unlikely to yield good results. Surprisingly, in many cases we found that a large portion of the computation could be terminated early without compromising the final result.
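One simple way to act on that observation, sketched below with illustrative names, is to watch a run’s validation loss and stop the job once it has stopped improving for several evaluation rounds; the patience and threshold values are examples, not the exact criteria used in the LLSC tools.

```python
def should_terminate(val_losses: list[float], patience: int = 5, min_delta: float = 1e-3) -> bool:
    """Return True if validation loss has not improved by min_delta over the last `patience` evaluations."""
    if len(val_losses) <= patience:
        return False
    best_before = min(val_losses[:-patience])
    recent_best = min(val_losses[-patience:])
    return recent_best > best_before - min_delta

# Inside a training loop (illustrative usage):
# history.append(evaluate(model))
# if should_terminate(history):
#     save_checkpoint(model)
#     break  # stop early; further compute is unlikely to change the outcome
```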
Q: What are some examples of projects you’ve done to reduce the energy consumption of generative AI programs?
A: We recently built a climate-aware computer vision tool. Computer vision is a field focused on applying AI to images, for example to distinguish between a cat and a dog in an image, to correctly label objects within an image, or to find components of interest within an image.
Our tool incorporates real-time carbon telemetry, which provides information about how much carbon the local grid is emitting while a model runs. Based on this information, our system automatically switches to a more energy-efficient version of the model, which typically has fewer parameters, when carbon intensity is high, and to a higher-fidelity version of the model when carbon intensity is low.
Over a one-to-two-day period, this approach reduced carbon emissions by nearly 80 percent. We recently extended the idea to other generative AI tasks, such as text summarization, and saw the same results. Interestingly, performance sometimes improved after applying our technique.
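To make the switching logic concrete, here is a minimal sketch of the decision rule, assuming two pre-loaded model variants and a hypothetical carbon-intensity feed; it is not the laboratory’s actual implementation.

```python
CARBON_THRESHOLD = 300.0  # hypothetical cutoff, in grams of CO2 per kilowatt-hour

def pick_model(carbon_intensity: float, small_model, large_model):
    """Use the low-power model when the grid is carbon-intensive, the higher-fidelity one otherwise."""
    return small_model if carbon_intensity > CARBON_THRESHOLD else large_model

# Hypothetical serving loop:
# while True:
#     model = pick_model(get_grid_carbon_intensity(), small_model, large_model)
#     handle_requests(model)
```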
Q: As consumers of generative AI, what can we do to reduce our climate impact?
A: As consumers, we can ask our AI providers for greater transparency. For example, Google Flights shows a variety of flight options along with each flight’s carbon footprint. We need similar kinds of measurements from generative AI tools so that we can make conscious decisions about which products and platforms to use, based on our priorities.
We can also make an effort to be better educated about generative AI emissions in general. Many of us are familiar with vehicle emissions, so it can help to frame generative AI emissions in those terms. For example, people may be surprised to learn that one image-generation task is roughly equivalent to driving four miles in a gas-powered car, or that it takes about as much energy to fully charge an electric car as it does to generate around 1,500 text summaries.
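Using only the two rules of thumb above, the snippet below converts a day’s usage into those familiar terms; the usage counts are made-up inputs purely for illustration.

```python
# Rules of thumb from the comparison above (approximate, for intuition only).
MILES_PER_IMAGE = 4.0            # one image generation ~ driving 4 miles in a gas car
SUMMARIES_PER_EV_CHARGE = 1500   # ~1,500 text summaries ~ one full electric-car charge

def everyday_equivalents(images_generated: int, summaries_generated: int) -> tuple[float, float]:
    """Express generative AI usage as gas-car miles driven and electric-car charges."""
    miles = images_generated * MILES_PER_IMAGE
    ev_charges = summaries_generated / SUMMARIES_PER_EV_CHARGE
    return miles, ev_charges

# Hypothetical example: 50 generated images and 300 summaries correspond to
# roughly 200 miles of driving and about a fifth of a full EV charge.
print(everyday_equivalents(50, 300))
```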
There are many cases where customers would be happy to make a trade-off if they knew what that trade-off entailed.
Q: What are your future plans?
A: Mitigating the climate impact of generative AI is one of those problems that people all over the world are working on, with similar goals. We’re doing a lot of work here at Lincoln Laboratory, but we’re only scratching the surface. In the long term, data centers, AI developers, and energy grids will need to work together to provide “energy audits” and uncover other unique ways to improve computing efficiency. We need more partnerships and more collaboration moving forward.
If you would like more information or are interested in collaborating with Lincoln Laboratory on these efforts, please contact Vijay Gadepally.