According to an August Forbes article, Nvidia’s shares fell last month after the company forecast slower revenue growth.
The AI chip designer controls more than 80% of this fast-growing market, so its slowing growth, from 122% in the second quarter to a forecast of 80% in the third, is raising questions for investors.
Is the AI chip market slowing? Are rivals, attracted by the market’s size and growth and by Nvidia’s high profitability (a 71.8% gross margin in the second quarter), eating away at its market share? Or will Nvidia accelerate its growth by ironing out the kinks in the new chips developers are now evaluating and by building features that rivals can’t replicate?
The short answer is yes, no, and maybe.
Growth in the AI chip market is slowing. Gartner has predicted that the AI chip industry will grow 33% in 2024. Between 2024 and 2029, AI chip revenue is expected to grow more slowly, at an annual rate of 20.4%, to about $312 billion, according to MarketsandMarkets.

Rivals aren’t taking market share from Nvidia; the company is gaining share in the GPU market. Between the fourth quarter of 2023 and the first quarter of 2024, its market share rose from 80% to 88%, according to a Jon Peddie Research report featured in a Forbes article in June. Over the same period, AMD’s GPU market share fell from 19% to 12%, and Intel’s share fell from 1% to “negligible,” the report said.

And Nvidia’s growth may accelerate. The company is facing manufacturing difficulties with its latest chip, called Blackwell, while a competing product design from startup Cerebras Systems Inc. claims to be faster and cheaper than Nvidia’s. If Nvidia can solve Blackwell’s technical problems and keep releasing powerful new chips every year, its overwhelming software advantage could lead to faster revenue growth.
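As a quick arithmetic check on the MarketsandMarkets forecast above (assuming, since the quoted figures don’t spell it out, that the 20.4% rate compounds annually from a 2024 base), the implied starting size of the market in 2024 is roughly $123 billion:

```python
# Back-of-the-envelope check of the MarketsandMarkets forecast cited above.
# Assumption: the 20.4% annual rate compounds over the five years from 2024 to 2029.
cagr = 0.204
forecast_2029 = 312e9  # about $312 billion

implied_2024_base = forecast_2029 / (1 + cagr) ** 5
print(f"Implied 2024 AI chip market: ${implied_2024_base / 1e9:.0f} billion")  # ~$123 billion
```

In other words, the forecast still describes a market expanding briskly, just far below the triple-digit pace Nvidia has posted in recent quarters.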
Nvidia shares could rise if the company repeats the triple-digit growth rates it achieved in recent quarters. If investors believe the AI chip designer’s growth is likely to continue to slow, now could be a good time to sell the stock.
Why Nvidia’s growth will accelerate
Two factors, especially in combination, could fuel Nvidia’s growth: faster growth in demand for AI chips and the success of new Nvidia chips that offer customers a much higher return on investment than previous versions.
Increased demand for AI chips depends on the emergence of a killer app for generative AI
Growth in demand for AI chips hinges on whether a killer app for generative AI emerges. Companies initially feared missing out on the promise of AI chatbots, but they are increasingly afraid of being sued if the technology hallucinates.
That has made companies hesitant to put their AI investments into production. “Boards of directors are polarized,” Yoav Shoham, co-founder of AI21, an AI software maker that competes with OpenAI, said in an August interview.
Of the 200 to 300 generative AI experiments being conducted by a typical large company, only 10 to 15 are deployed in-house, and perhaps one or two are released to customers, according to a June interview with Liran Hasson, CEO of Aporia, a startup that provides guardrails to protect businesses from AI hallucinations.
To unlock further demand, a generative AI killer app needs to emerge: an app that gives many people an overwhelming reason to use the new technology because it saves them time and money. The killer app for the personal computer, for example, was the electronic spreadsheet, while the killer app for the iPod was the iTunes store.
Nvidia says its chips deliver the industry’s best return on investment
Customers are enjoying a quick return on their investment in the company’s chips. “People who invest in NVIDIA infrastructure are seeing immediate benefits,” Nvidia CEO Jensen Huang said on a conference call with analysts on Aug. 29. “This is the highest ROI infrastructure, computing infrastructure investment you can make today.”
What’s Nvidia’s return on investment? According to my new book, Brain Rush, Nvidia charges higher prices than its competitors, but its chips offer better performance and lower operating costs, giving customers the lowest total cost of ownership, which more than offsets those higher prices.
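A simple way to see how a pricier chip can still deliver the best return is to compare cost per unit of work rather than sticker price. The sketch below uses entirely hypothetical prices, power draws, lifespans, and throughput numbers, not Nvidia’s or any rival’s actual figures:

```python
# Hypothetical illustration of the total-cost-of-ownership argument: a chip that
# costs more up front can be cheaper per unit of work if it is faster and more
# power-efficient. Every number below is invented for illustration.

def cost_per_million_tokens(price_usd, lifespan_years, power_kw,
                            electricity_usd_per_kwh, tokens_per_second):
    hours = lifespan_years * 365 * 24
    total_cost = price_usd + power_kw * hours * electricity_usd_per_kwh
    total_tokens = tokens_per_second * hours * 3600
    return total_cost / (total_tokens / 1e6)

# A pricier but faster, more efficient chip vs. a cheaper, slower one.
premium = cost_per_million_tokens(35_000, 4, 1.0, 0.10, 4_000)
budget = cost_per_million_tokens(20_000, 4, 1.2, 0.10, 1_500)
print(f"Premium chip: ${premium:.3f} per million tokens")  # lower despite the higher price
print(f"Budget chip:  ${budget:.3f} per million tokens")
```

On those made-up numbers, the more expensive chip works out to roughly half the cost per token, which is the shape of the argument Huang is making.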
Demand for Nvidia’s Blackwell chips has been “incredible,” Huang said in a press release on Aug. 29. “Data centers around the world are working hard to modernize their entire computing stack with accelerated computing and generative AI,” he added.
Nvidia said it shipped samples of Blackwell chips during the quarter and made product changes to improve manufacturing efficiencies. “We expect to ship several billion dollars of Blackwell sales in the fourth quarter,” Nvidia CFO Colette Kress said in a statement on Aug. 29.
Why Nvidia is making higher profit margins than its rivals
Nvidia is growing faster than AMD and Intel and earning higher profits. Its latest revenue growth rate (122% in the second quarter) outpaced AMD’s (8.9%) and Intel’s (-1%), and its 71.8% gross margin topped AMD’s (49%) and Intel’s (35%), according to Yahoo! Finance.
Nvidia’s relatively high profit margins are a result of high prices and strong demand. “I was once at a meeting at an Nvidia competitor, and the chief financial officer said to a room full of C-level executives, ‘If your product is so good, why is GM’s valuation so low?'” Peddie wrote to me in an Aug. 30 email.
“And the answer, which no one dared to say, was, ‘We don’t think we’re as good as them, so our differentiation is price,'” he added.
To keep growing faster than its competitors in revenue, Nvidia needs two key capabilities:
A series of improved AI chip designs. Nvidia continues to develop new chip designs that outperform competitors. “They develop the full stack and only use external memory vendors to build their systems,” Peddie wrote in his email. “They have a large team of the best engineers in the industry. They develop and publish papers in time for patents. They also invest billions of dollars in R&D. Their R&D budgets are larger than startup revenue projections, which is a headwind that is very difficult or impossible for startups to overcome,” he added.

Great software for Nvidia developers. Brain Rush points out that no other chip company has software that rivals Nvidia’s CUDA. “Nvd works with customers, and has more customers than all others combined,” Peddie wrote in his email. “These customers develop their programs on Nvd chips, and most of them share developments with Nvd as a way to influence next-generation designs. So not only does NVD have tens of thousands of software and hardware engineers (probably 5:1 software engineers), more than all of its competitors except Intel, it also has a ghost force of thousands more at universities and customer sites,” he added.
Why Nvidia’s revenue could slow
Despite these advantages, Nvidia’s revenue growth has been noticeably slowing. Blackwell’s technology problems could be contributing to the slowdown. If rivals can offer faster, lower-cost chips, programmers could end up writing software for them, potentially costing Nvidia market share.
Product issues hampering Nvidia’s Blackwell chips
According to The Wall Street Journal, Nvidia said its gross margin declines in the first and second quarters of 2024 were primarily due to product issues. Specifically, the company’s second-quarter gross margin was 75.1%, down 3.3 percentage points from the previous quarter.
Nvidia hasn’t disclosed details about the product issues, but analysts and industry executives say Blackwell’s large size posed manufacturing challenges: the design required combining two advanced new Nvidia processors and numerous memory components into “a delicate mesh of silicon, metal and plastic,” The Wall Street Journal noted.
The chip’s complexity increases the chances of something going wrong: a defective component or excess heat could cause the entire chip to fail, reducing the percentage of the $40,000 Blackwell chips that emerge from manufacturing in usable form, The Wall Street Journal wrote.
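To make the yield point concrete: the effective cost of every sellable chip rises as the fraction of chips that survive manufacturing falls. The per-chip production cost and yield rates below are hypothetical; only the roughly $40,000 selling price comes from the article.

```python
# Hypothetical illustration of how lower manufacturing yield raises the effective
# cost of each usable chip. The $20,000 production cost and the yield rates are
# invented; the article cites only Blackwell's ~$40,000 price.
cost_per_chip_built = 20_000  # assumed cost to manufacture each chip, good or bad

for yield_rate in (0.9, 0.7, 0.5):
    cost_per_usable_chip = cost_per_chip_built / yield_rate
    print(f"Yield {yield_rate:.0%}: about ${cost_per_usable_chip:,.0f} per usable chip")
```

At a 50% yield, the cost of every failed chip is effectively folded into a working one, which is why defects show up in gross margin.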
Nvidia has plenty of experience dealing with such challenges. Technical challenges include “contention issues — it’s all about speed, and moving data from one part to another creates timing challenges,” Peddie’s email said.
“You spend countless hours in simulators trying to find and fix all the defects, but once the silicon is made, new defects will appear. NVD (and other semiconductor companies) know this and have workarounds built into their designs, building compensation and redundancy into the design in case something goes wrong,” he added.
It will take time to see whether Nvidia can overcome these technical challenges.
Startup Cerebras Systems says its chips are faster and cheaper than Nvidia’s
Cerebras Systems, a Sunnyvale, Calif.-based chip-making startup with 430 employees and a 2021 private valuation of $4.2 billion (according to PitchBook), competes with Nvidia. It takes a different approach to chip design than Nvidia took with Blackwell, one that it says allows its chips to process information faster and more cheaply than Nvidia’s.
“Doing meaningful work with AI requires a huge amount of computation, which translates into many more transistors than you can fit on a single chip,” Cerebras founder Andrew Feldman told The Wall Street Journal. “Developing the technology for two chips is hard, developing the technology for four chips is even harder, developing the technology for eight is even harder,” Feldman added.
Cerebras, whose clients include AstraZeneca and the Mayo Clinic, has figured out how to connect its circuitry and operate it “as one giant chip,” The Wall Street Journal noted. Additionally, the startup last week launched a cloud-computing service for AI inference (the step after a chatbot is trained, when it responds to users’ questions in natural language).
Cerebras, which privately filed for an IPO in April and could go public in October 2024, claims its inference service runs “20 times faster and at a fraction of the cost” of competing chips, according to AI News.
According to AI News, Cerebras claims that by using its Wafer-Scale Engine, its service is 20 times faster (achieving speeds of 1,800 output tokens per second) than typical hyperscale cloud offerings that use Nvidia GPUs. Additionally, Cerebras-powered services are “more cost-effective,” sources told AI News.
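Taken at face value, and assuming the 20-times figure is measured against the same workload as the 1,800-tokens-per-second result, the claim implies the GPU-backed cloud services Cerebras compares itself to were serving on the order of 90 tokens per second:

```python
# Implied baseline behind the Cerebras claim quoted above, assuming the "20 times
# faster" figure refers to the same workload as the 1,800 tokens-per-second result.
cerebras_tokens_per_second = 1_800
claimed_speedup = 20

implied_gpu_baseline = cerebras_tokens_per_second / claimed_speedup
print(f"Implied GPU-cloud baseline: {implied_gpu_baseline:.0f} tokens per second")  # ~90
```

Whether that comparison holds across models and batch sizes is something the AI News report doesn’t say.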
Cerebras’ faster speeds and lower costs may not be enough to challenge Nvidia
While large enterprises may choose Cerebras to save time and money, smaller companies are more likely to stick with Nvidia. “The key question is whether companies are willing to align their engineering processes with Cerebras’ system,” David Nicholson, an analyst at Futurum Group, told AI News.
Nicholson noted that because Nvidia offers an established solution, smaller companies, which lack the capital of larger ones, may not find it worthwhile to change their engineering processes.
While such systems may take years to work fully, generative AI may eventually act as an agent: a consumer could, for example, pick an itinerary and delegate the work of searching for flights and buying the best option to an AI agent, Brain Rush explained.
AI News noted that Cerebras’ technology has “16-bit precision and faster inference capabilities,” making it ideal for powering AI agents that “must operate quickly, repeatedly, and in real time.”
If Nvidia concludes that Cerebras makes a better mousetrap, the AI chip leader could build something like the startup’s WSE. “If a giant chip like Cerebras is the best answer, you’d think Nvidia could and would build it. What can a startup do that Nvidia doesn’t do or that could be easily replicated?” Peddie concluded.
Cerebras, which runs its own data centers to offer its inference services, is aiming to win over Microsoft and Amazon as customers, Bloomberg noted. Can Cerebras take market share from Nvidia? “It’s enough to piss them off,” Feldman told Bloomberg.