Advanced Micro Devices (AMD) on Thursday (October 10) unveiled a new artificial intelligence (AI) chip aimed at disrupting Nvidia’s hold on the lucrative data center GPU market. The launch of AMD’s Instinct MI325X accelerator signals an intensification of the AI hardware arms race, with implications for companies investing in AI.
AMD’s announcement was made during the Advancing AI 2024 event, where the company unveiled a broad portfolio of data center solutions for AI, enterprise, cloud, and mixed workloads. The portfolio includes the new Instinct MI325X accelerator, 5th generation AMD EPYC server CPUs, AMD Pensando Salina DPUs, AMD Pensando Pollara 400 NICs, and AMD Ryzen AI PRO 300 series processors for enterprise AI PCs.
The generative AI boom, fueled by technologies such as large language models, has created high demand for powerful GPUs capable of training and running complex AI systems. Nvidia has been a major beneficiary of this trend, with its data center revenue surging in recent earnings reports.
“Nvidia’s dominant position in the AI chip market remains virtually unchallenged,” Max (Chong) Li, adjunct professor at Columbia University and founding CEO of Oort, a decentralized AI data provider, told PYMNTS. “AMD’s new chips should bring at least some competition, which could lead to price pressure in the long run. Some reports say Nvidia is earning as much as 75% profit margins on AI chips. Once AMD starts eating into market share, we would expect prices to start falling, which is common in most industries when companies compete for customers.”
Battle for AI chip supremacy
The CUDA ecosystem is Nvidia’s proprietary parallel computing platform and programming model that has become the standard for AI and high-performance computing tasks. AMD’s challenge extends beyond hardware performance to providing a compelling software ecosystem for developers and data scientists.
AMD has invested in its ROCm (Radeon Open Compute) software stack and reported at the event that it has doubled the inference and training performance of its AMD Instinct MI300X accelerator across popular AI models. The company says over 1 million models now work seamlessly on AMD Instinct, triple the number available at the time of the MI300X launch.
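Part of AMD’s case to developers is that switching vendors should not require rewriting their code. As a rough, hypothetical sketch (not drawn from AMD’s announcement): ROCm builds of PyTorch expose the same `torch.cuda` device API that Nvidia builds do, so an ordinary workload can select whichever GPU backend happens to be present.

```python
# Hypothetical illustration: the same PyTorch code can target Nvidia (CUDA) or
# AMD (ROCm/HIP) GPUs, because ROCm builds of PyTorch back the torch.cuda API
# with HIP for AMD accelerators such as the Instinct line.
import torch


def pick_device() -> torch.device:
    """Return a GPU device if one is visible, otherwise fall back to CPU."""
    if torch.cuda.is_available():
        # torch.version.hip is a version string on ROCm builds, None on CUDA builds.
        backend = "ROCm/HIP" if torch.version.hip else "CUDA"
        print(f"GPU backend in use: {backend}")
        return torch.device("cuda")
    return torch.device("cpu")


device = pick_device()

# A toy inference pass; a real workload would load a pretrained model instead.
model = torch.nn.Linear(1024, 1024).to(device)
batch = torch.randn(8, 1024, device=device)
with torch.no_grad():
    out = model(batch)
print(out.shape)  # torch.Size([8, 1024])
```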
“While the launch of AMD’s Instinct MI325X chip is an important step in challenging NVIDIA’s dominance in the data center GPU market, it is unlikely to dramatically change the competitive landscape in the near term,” Dev Nag, CEO of support automation company QueryPal, told PYMNTS. “NVIDIA’s 95% market share in AI chips is deeply rooted, primarily due to the company’s mature and dominant CUDA ecosystem.
“The success of AMD’s efforts will depend not only on the performance of the chips, but also on the software ecosystem,” he added. “NVIDIA spends approximately 30% of its R&D budget on software and has more software engineers than hardware engineers. That means NVIDIA will continue to aggressively extend its ecosystem lead.”
Impact on business and AI markets
AMD’s push could affect companies considering adopting AI technology, as increased competition can lead to more choices and lower prices over the long run. Over the next two to three years, Nag said, “As AMD improves its products and potentially gains market share, we may see more options at different price points. AI hardware could become more accessible to small and medium-sized businesses as prices come down.”
Prices are unlikely to come down anytime soon, Nag said.
“Current demand for AI chips far exceeds supply, and manufacturers have little incentive to lower prices,” he told PYMNTS. “Rather than drastically undercutting Nvidia on price, AMD appears to be positioning itself as a worthy alternative.”
AMD’s focus on open standards could have broader implications.
“If successful, it could lead to more cost-effective solutions that reduce reliance on proprietary ecosystems like CUDA,” Nag said. “This approach could foster greater interoperability and flexibility in AI development, making it easier for enterprises to adopt and integrate AI solutions.”
Industry partners responded positively to AMD’s announcement. The company showcased collaborations with major companies including Dell, Google Cloud, HPE, Lenovo, Meta, Microsoft, Oracle Cloud Infrastructure and Supermicro.
AMD Chair and CEO Lisa Su said in a statement that the data center AI accelerator market could grow to $500 billion by 2028. Even a small portion of that market could be hugely profitable for AMD, making its push into AI chips strategically important.
For companies in a variety of sectors, from retail to manufacturing, a more competitive AI chip market could accelerate the integration of AI into core operations and customer-facing services. More accessible and powerful AI hardware could make tasks like demand forecasting, process optimization, and personalized customer experiences achievable for a wider range of businesses.
“Lowering prices always lowers the barrier to entry, making new technology available to more companies and people,” Li said. “Take cell phones, for example. When they debuted, the public perception of cell phone users was that of wealthy people who drove luxury cars and made calls on the go. Now, most people in developed countries and many in emerging countries own at least a basic smartphone. We may soon see a similar adoption boom in access to AI.”