Amazon on Friday announced an additional $4 billion investment in AI startup Anthropic. The deal includes an agreement for Anthropic to use more of Amazon’s AI chips. The cloud giant is taking on Nvidia, trying to get AI developers to switch away from Nvidia’s GPUs.
Amazon’s Trainium chip is about to get even busier. At least, that’s what Amazon is hoping for after plowing another $4 billion into AI startup Anthropic.
The companies announced a huge new deal on Friday that brings Amazon’s total investment in Anthropic to $8 billion. A central aim of the funding is to get Amazon’s AI chips used more heavily to train and run large language models at scale.
In return for the cash infusion, Anthropic said it will use AWS as its “primary cloud and training partner.” It also said it will help Amazon design future Trainium chips and help build out AWS Neuron, Amazon’s software for developing and running AI models on those chips.
This is an all-out attack on Nvidia, which dominates the AI chip market with its GPUs, servers, and CUDA software platform. Nvidia stock fell more than 3% on Friday after the Amazon and Anthropic news broke.
The challenge is getting Anthropic to actually use the Trainium chip at scale. For AI model developers, switching away from Nvidia GPUs is complex, time-consuming, and risky. Amazon has struggled with this.
Earlier this week, Anthropic CEO Dario Amodei didn’t seem fully committed to Amazon’s Trainium chip, despite an additional $4 billion in funding.
“We use Nvidia, but we also use custom chips from Google and Amazon,” he said at the Cerebral Valley technology conference in San Francisco. “Different chips have different tradeoffs. I think we’re getting value from all of them.”
In 2023, Amazon made its first investment in Anthropic, agreeing to invest $4 billion. The deal came with similar terms. At the time, Anthropic said it would use Amazon’s Trainium and Inferentia chips to build, train, and deploy future AI models, and that the two companies would collaborate on developing the chip technology.
It’s unclear whether Anthropic followed through with that. The Information recently reported that Anthropic preferred to use Nvidia GPUs over Amazon AI chips. The investment talks were focused on Anthropic becoming more aggressive in using Amazon’s services, the publication said.
With an additional $4 billion from Amazon, there are signs Anthropic could commit further.
In an announcement Friday, Anthropic said it is collaborating with Amazon on its Neuron software, which provides the critical connective tissue between chips and AI models. That puts it up against Nvidia’s CUDA software stack, the real lock-in behind Nvidia’s GPUs, which makes them extremely difficult to swap out for other chips. Nvidia has had a roughly 10-year head start with CUDA, and competitors are finding the gap hard to close.
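To make the “connective tissue” point concrete, here is a minimal, illustrative sketch of what targeting Trainium looks like for a PyTorch developer using the Neuron SDK’s PyTorch/XLA integration. This is an assumption-laden toy example, not Anthropic’s or Amazon’s actual training code; the model, sizes, and hyperparameters are made up for illustration.

```python
# Minimal sketch: one training step aimed at a Trainium device via the
# Neuron SDK's PyTorch/XLA path. Illustrative only.
import torch
import torch.nn as nn
import torch_xla.core.xla_model as xm  # installed alongside the Neuron SDK's PyTorch support

device = xm.xla_device()               # on a Trn1 instance this resolves to a NeuronCore
model = nn.Linear(1024, 1024).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

inputs = torch.randn(8, 1024).to(device)
targets = torch.randn(8, 1024).to(device)

optimizer.zero_grad()
loss = loss_fn(model(inputs), targets)
loss.backward()
xm.optimizer_step(optimizer)           # XLA-aware step; compiles and runs the graph on the device
print(loss.item())
```

The same script would run on Nvidia hardware by pointing the tensors at a `"cuda"` device instead, which is part of why CUDA is so hard to displace: the switching cost lives below this level, in the kernels, libraries, performance tuning, and debugging tools that teams have built around Nvidia’s stack.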
Anthropic’s “deep technical collaboration” signals a new level of commitment to using and improving Amazon’s Trainium chip.
While several companies make chips that compete with, or even beat, Nvidia’s on certain measures of computing performance, none has made a dent in Nvidia’s market share or mind share.
Amazon’s AI chip history
Amazon joins a small list of cloud providers that are equipping their data centers with their own AI chips to avoid spending heavily on Nvidia GPUs, which often have profit margins exceeding 70%.
Amazon debuted its Trainium and Inferentia chips in 2020. The chips are named after the training and inference tasks they were built for.
The aim was to find a way to reduce dependence on Nvidia and make cloud computing cheaper in the AI era.
Amazon CEO Andy Jassy said on an October earnings call that “as customers move closer to larger-scale implementations, we quickly realize that the cost of AI can become higher,” adding, “That’s why we invested in our own custom silicon in Trainium for training and Inferentia for inference.”
But like many of its competitors, Amazon is finding it difficult to overcome the industry’s preference for Nvidia. Some say it’s because of CUDA, which offers a rich software stack with plenty of libraries, tools, and troubleshooting help. Others say it’s simply habit.
In May, Bernstein analyst Stacy Rasgon told Business Insider that he was not aware of any companies using Amazon’s AI chips at scale.
Friday’s announcement could change that.
Jassy said in October that the next-generation Trainium 2 chip has improved performance. “There’s been a lot of interest in these chips, and we’ve gone to our manufacturing partners multiple times to produce much larger quantities than we originally planned,” Jassy said.
Still, Anthropic’s Amodei sounded like he was hedging his bets this week.
“We believe our mission is best served as an independent company,” he said. “Given our position in the market and what we’ve been able to do, and our independent partnerships with Google, Amazon, and other companies, I think this is very achievable.”