Nvidia (NVDA) has had the kind of year most companies can only dream of.
Thanks to visionary investments in artificial intelligence technology, the company's revenue and stock price have skyrocketed, with those bets paying off in a big way amid the wave of generative AI.
That's not all. The company has repeatedly traded places with Apple (AAPL) as the world's largest publicly traded company by market capitalization, exceeding $3 trillion. CEO Jensen Huang has become one of Silicon Valley's most sought-after executives, meeting with everyone from technology luminaries to world leaders.
There's more to come. The company is ramping up production of its high-performance Blackwell chips for AI applications and expects to ship billions of dollars' worth of the hardware in the fourth quarter alone, with more shipments to follow throughout the year.
"Nvidia really has the (hardware and software) for the AI computing era," Futurum Group CEO Daniel Newman told Yahoo Finance. "Everything is connected inside and outside the (server) rack, and the software is very well received within the developer community."
But the competition isn't sitting idle.
Companies like AMD (AMD) are trying to steal away Nvidia's customers and cut into its estimated 80%-90% market share. Even Nvidia's own customers are developing chips aimed at reducing their reliance on the graphics giant's semiconductors.
And Wall Street is getting in on the action.
Shares of Broadcom (AVGO), which designs AI chips in collaboration with companies like Google (GOOG, GOOGL), are up 113% since the beginning of the year, and soared 44% in just the last month after CEO Hock Tan said AI could represent a $60 billion to $90 billion opportunity for the company in 2027 alone.
Still, competing against Nvidia will be a difficult task for any company. And dethroning the AI king will be all but impossible through at least 2025.
Nvidia had a first-mover advantage in the AI market thanks to its early investments in AI software that enabled its graphics chips to be used as high-performance processors. And the company has been able to maintain its lead in the field thanks to continued advances in its hardware and its CUDA software, which allows developers to build apps for its chips.
That's why so-called hyperscalers, the large cloud computing providers including Microsoft (MSFT), Alphabet's Google, Amazon (AMZN), and Meta (META), keep pouring cash into buying as many Nvidia chips as they can. In its most recent quarter, Nvidia reported total revenue of $35.1 billion, with 87% of that, or $30.8 billion, coming from its data center business.
"Everyone wants to build and train these huge models, and the most efficient way to do it is with CUDA software and Nvidia hardware," Bob O'Donnell, president and lead analyst of TECHnalysis Research, told Yahoo Finance.
Nvidia is expected to continue to power much of the AI industry in 2025. The company's Blackwell chips, the successors to its popular Hopper series of processors used to power AI applications, are currently in production. And customers like Amazon are already adding new cooling features to their data centers to deal with the tremendous heat the processors generate.
“I don’t know what the current backlog is (for Nvidia chips), but it’s close to a year, if not a year,” O’Donnell said. “So most of what we’re probably going to make next year is already sold out.”
Hyperscalers are expected to increase capital spending in 2025 much as they did in 2024, with a large portion of that money likely going toward Blackwell chips.
While Nvidia remains the AI king, there are plenty of challengers to the throne. AMD and Intel (INTC) are the frontrunners among rival chipmakers, and both have products on the market: AMD's MI300X line is designed to compete with Nvidia's H100 Hopper chips, while Intel offers its Gaudi 3 processors.
And as Intel continues to struggle amid its turnaround efforts and search for a new CEO, AMD is well positioned to take market share from Nvidia. Yet even AMD has had trouble breaking Nvidia's lead.
"What AMD needs to do is make the software actually easier to use, work with developers to build more in-demand systems, and ultimately be able to generate more sales," Newman said. "Because these cloud providers are trying to sell what their customers want."
But it's not just AMD and Intel. Nvidia's customers are increasingly developing and promoting their own AI chips: Google has its Broadcom-based tensor processing units (TPUs), Amazon has its Trainium 2 processors, and Microsoft has its Maia 100 accelerators.
There are also concerns that a shift toward inference-focused AI workloads will reduce the need for Nvidia's high-performance chips.
Technology companies develop AI models by feeding them vast amounts of data, a process known as training. Training requires incredibly powerful chips and a lot of energy. Inference, or actually putting those trained models to work, is less resource- and power-intensive. As inference becomes a larger share of AI workloads, the thinking goes, companies will no longer need to buy such large quantities of Nvidia chips.
Huang said he is ready for this, explaining at various events that Nvidia’s chips are just as good at inference as they are at training.
Even if Nvidia's market share declines, that doesn't necessarily mean its business will suffer.
“This is definitely a case that raises all boats,” Newman said. “So even if the competition becomes more intense, which I definitely think it will be, that doesn’t mean they’re going to fail. This is people building a bigger pie.”
Email Daniel Howley at dhowley@yahoofinance.com. Follow @DanielHowley on Twitter.