Amazon (AMZN) is ubiquitous in today’s world, not only because it’s one of the largest and most established online marketplaces, but also because it’s one of the largest data center providers.
What few people would expect, though, is for an e-commerce giant to buy a nuclear-powered data center.
But the company’s cloud subsidiary, AWS, did just that in March, purchasing a $650 million nuclear-powered data center from Pennsylvania-based Talen Energy.
On the surface, the deal signals Amazon’s ambitious expansion plans. But dig deeper and the company’s nuclear power plant purchase speaks to a broader problem Amazon and other tech giants are grappling with: the insatiable demand for energy caused by artificial intelligence.
By purchasing Talen Energy's nuclear-powered data center campus in Pennsylvania, AWS placed its rapidly expanding AI data center operations next to the power source itself, positioning the company to meet the energy demands created by artificial intelligence.
The strategy reflects a rethinking of energy supply, a concern that has grown as AI becomes more prevalent in consumers’ daily lives, permeating everything from internet searches to smart devices and cars.
Companies like Google (GOOG, GOOGL), Apple (AAPL), and Tesla (TSLA) continue to enhance their AI capabilities with new products and services. Each AI task requires huge amounts of computing power, which translates into significant power consumption by energy-hungry data centers.
Estimates suggest that by 2027, global AI-related electricity consumption could grow by 64% to reach 134 terawatt-hours per year, which is comparable to the electricity usage of countries such as the Netherlands or Sweden.
This raises an important question: How will big tech companies address the energy demands of future AI innovation?
Increased AI energy consumption
According to Pew Research, more than half of Americans interact with AI at least once a day.
Sasha Luccioni, a renowned researcher and data scientist who serves as head of AI and climate at Hugging Face, a company that develops tools for AI applications, frequently discusses the energy consumption of AI.
Luccioni explained that training an AI model requires a great deal of energy (training the GPT-3 model, for example, used about 1,300 megawatt-hours of electricity) but is typically done only once. The inference phase, in which the model produces responses, can ultimately consume far more energy because of the sheer volume of queries.
For example, when a user asks an AI model like ChatGPT a question, a request is sent to a data center, where powerful processors generate a response — a process that’s fast but consumes about 10 times more energy than a typical Google search.
“These costs accumulate quickly because models are used over and over again,” said Luccioni, who noted that, depending on the size of the model, 50 million to 200 million queries can consume as much energy as training the model itself.
“ChatGPT is getting 10 million users a day,” Luccioni said, which means that within as little as 20 days of deployment, a model can burn through as much energy as its “massive” training run.
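Luccioni's figures are internally consistent, as a quick back-of-envelope check shows. This is a sketch using only the numbers quoted above; the per-query energy is derived from them, not independently measured:

```python
# Back-of-envelope check: GPT-3's reported training energy versus the
# 50M-200M query range at which inference energy matches it.
TRAINING_MWH = 1_300                      # reported energy to train GPT-3
TRAINING_WH = TRAINING_MWH * 1_000_000    # convert MWh -> Wh
QUERIES_PER_DAY = 10_000_000              # ChatGPT usage cited by Luccioni

for queries in (50_000_000, 200_000_000):
    wh_per_query = TRAINING_WH / queries  # implied energy per query
    days = queries / QUERIES_PER_DAY      # days until inference matches training
    print(f"{queries:>11,} queries -> ~{wh_per_query:.1f} Wh/query, "
          f"break-even in {days:.0f} days")
# ->  50,000,000 queries -> ~26.0 Wh/query, break-even in 5 days
# -> 200,000,000 queries -> ~6.5 Wh/query, break-even in 20 days
```

The 20-day figure matches Luccioni's claim: at 10 million queries a day, deployment overtakes training energy within 20 days for the largest models.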
The biggest consumers of this energy are the big tech companies known as hyperscalers, whose cloud services allow them to rapidly scale AI efforts. Microsoft (MSFT), Alphabet, Meta (META), and Amazon alone are projected to spend $189 billion on AI in 2024.
Increasing energy consumption from AI will put additional strain on already overburdened power grids: Goldman Sachs predicts that global data center electricity demand could grow 160% by 2030 and account for 8% of total U.S. electricity demand (up from 3% in 2022).
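The two Goldman Sachs figures can also be cross-checked against each other. A minimal sketch, assuming (as a simplification) that total U.S. electricity demand stays roughly flat over the period:

```python
# Cross-check the Goldman Sachs projection: if data centers' share of U.S.
# electricity demand rises from 3% to 8%, that alone implies roughly the
# same growth as the projected 160% rise in data center demand.
share_2022 = 0.03        # data centers' share of U.S. demand in 2022
share_2030 = 0.08        # projected share by 2030
projected_growth = 1.60  # projected growth in data center demand (+160%)

implied_growth = share_2030 / share_2022 - 1  # assumes flat total demand
print(f"share-implied growth: {implied_growth:+.0%} "
      f"vs projected {projected_growth:+.0%}")
# -> share-implied growth: +167% vs projected +160%
```

The two numbers line up to within a few percentage points, suggesting the projection assumes total U.S. demand grows only modestly over the period.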
This burden is exacerbated by aging infrastructure and the growing electrification of automobiles and manufacturing in the U.S. According to the U.S. Department of Energy, 70% of U.S. power transmission lines are nearing the end of their typical 50- to 80-year lifespan, increasing the risk of outages and cyberattacks.
Moreover, renewable energy sources are struggling to keep pace.
Luccioni noted that while renewable energy generation is expanding, grid operators are ramping up the use of coal-fired power plants to meet growing energy demand.
AI upends Big Tech’s sustainability pledges
Microsoft and Google have acknowledged in their sustainability reports that AI is impeding their ability to meet climate goals: Microsoft’s carbon footprint, for example, has increased 29% since 2020, largely due to AI-related data center construction.
Still, renewable energy will remain a key part of Big Tech’s strategy, even if it can’t meet all of AI’s energy needs.
In May 2024, Microsoft signed the largest corporate power purchase agreement on record with real estate and asset management giant Brookfield to deliver more than 10.5 gigawatts of new renewable power capacity around the world through wind, solar, and other carbon-free generation technologies. The company is also investing heavily in carbon removal efforts to offset an industry-record 8.2 million metric tons of emissions.
Amazon is also investing heavily in renewable energy, establishing itself as the world’s largest corporate renewable energy buyer for the fourth year in a row, and its portfolio currently includes enough wind and solar power to power 7.2 million U.S. homes for a year.
But as Yahoo Finance reporter Inés Ferré points out (video above), “The problem with renewable energy is that you might not use the energy at certain times of the day, so you also have to store it.”
Alongside sourcing cleaner energy, big tech companies are also investing in efficiency: Luccioni said companies like Google now develop AI-specific chips, such as tensor processing units (TPUs), that are optimized for AI tasks rather than the graphics processing units (GPUs) originally built for gaming.
Nvidia claims that its latest Blackwell GPUs can reduce the energy usage and cost of AI models by up to 25 times compared to previous versions.
For a glimpse of what awaits technology companies that don’t control their energy costs, look no further than Taiwan Semiconductor Manufacturing Company (TSM). TSMC makes more than 90% of the world’s most advanced AI chips, but CFO Wendell Huang says energy costs have doubled in the past year, cutting the company’s profit margins by almost a percentage point.
Transparency is key to better gauge energy demand and reduce future costs, experts say.
“We need more regulation, especially regarding transparency,” said Luccioni, who is working on an AI Energy Star rating project that aims to help developers and users choose more energy-efficient models by benchmarking energy consumption.
When it comes to tech companies’ priorities, follow the money, or in this case, the investment: utility companies and tech giants are expected to spend $1 trillion on AI over the next few years.
But Luccioni says AI isn’t just a problem — it could also be part of the solution to address this energy shortage.
“AI can definitely be part of the solution,” Luccioni said, “for example, knowing when hydroelectric dams need repairs, or fixing leaks in aging infrastructure like cables. In fact, a lot of energy is lost in transmission and storage, so AI can be used to predict that and repair it in real time.”