Roughly three decades ago, the advent of the internet changed everything for corporate America. Although it took many years for the internet to mature as a technology, and for companies to realize its full potential, it's positively altered the growth trajectory for businesses and opened new doors that previously hadn't existed.
Since the mid-1990s, Wall Street has been waiting for a new technology or game-changing innovation to replicate what the internet did for businesses. Artificial intelligence (AI) appears to have answered the call.
What makes AI such an intriguing technology is its broad-reaching utility. AI-driven software and systems have the capacity to grow smarter and become more efficient at their assigned tasks over time. What's more, they may be able to learn new tasks without human intervention.
To some, the sky is truly the limit for artificial intelligence. In Sizing the Prize, the analysts at PwC predict AI will increase global gross domestic product by $15.7 trillion come 2030, with these gains coming from a combination of increased productivity and consumption-side effects.
A $15.7 trillion addressable market leaves plenty of room for multiple companies to be winners. But over the past two years, there's been no clearer beneficiary of the AI revolution than semiconductor colossus Nvidia (NASDAQ: NVDA), which vaulted from a $360 billion market cap at the end of 2022 to $3.56 trillion as of the closing bell on Nov. 11.
Nvidia's operating expansion has been virtually flawless
Investors don't have to look far to discover why Nvidia has become Wall Street's most valuable publicly traded company. Its graphics processing units (GPUs) have become the preferred choice of businesses operating AI-accelerated data centers. Nvidia's H100 GPU, commonly known as the "Hopper," and its successor Blackwell GPU architecture are effectively the brains that allow for split-second decision-making by AI software and systems.
Based on a study from the analysts at TechInsights, Nvidia accounted for roughly 98% of the 2.67 million GPUs shipped to data centers in 2022, as well as the 3.85 million GPUs shipped in 2023. It's unlikely to lose its near-monopoly market share this year, either, with orders for the Hopper chip and Blackwell backlogged.
This is a good time to mention that when demand for a good or service outstrips supply, the price of that good or service often rises. Nvidia has been able to command $30,000 to $40,000 for the Hopper, which is 100% to 300% more than what competing AI-GPUs cost. Businesses have been willing to pay this premium given the computing advantages offered by Nvidia's hardware, and it's translated into a sizable uptick in the company's gross margin.
The company's CUDA platform also deserves credit for its virtually flawless operating expansion. CUDA is the software toolkit developers use to get the most out of their GPUs, including building large language models (LLMs). It's playing a key role in keeping Nvidia's clients loyal to the company's ecosystem of solutions.
Lastly, investors have been mesmerized by never-before-seen growth rates from a market-leading company. When fiscal 2023 came to a close on Jan. 29, 2023, Nvidia reported $27 billion in full-year sales. But based on Wall Street's forecast for fiscal 2026, Nvidia is on track to deliver $180 billion in revenue. That's a three-year compound annual sales growth rate of about 88%!
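That growth figure is easy to verify. Here's a quick sanity check of the math, using the two revenue figures cited above (the helper function is my own illustration, not anything from the article):

```python
# Sanity check of the three-year compound annual growth rate (CAGR)
# implied by the revenue figures above: $27 billion in fiscal 2023
# versus a forecast $180 billion in fiscal 2026.

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

growth = cagr(27e9, 180e9, 3)
print(f"{growth:.1%}")  # → 88.2%, matching the "about 88%" figure
```

In other words, revenue growing nearly 6.7-fold over three years works out to almost doubling every single year.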
Yet in spite of its nearly textbook operating expansion, one undeniable threat looms large for Nvidia -- and pretty much no one is talking about it.
This threat has the potential to knock Wall Street's AI darling off of its pedestal
As most investors are probably well aware, the jaw-dropping addressable market for artificial intelligence has businesses clamoring for their piece of the pie. In the coming quarters, Nvidia is going to face no shortage of external competition.
Even though Advanced Micro Devices (NASDAQ: AMD) entered the scene after Nvidia, it's had no trouble ramping up production of its MI300X AI-GPU. AMD also recently introduced its next-gen chip, the MI325X, which is expected to go into production before the end of this year. AMD's chips carry considerably lower price points than Nvidia's Hopper and Blackwell, which may make them attractive to businesses wanting to leverage AI in their data centers.
However, external competition isn't the likeliest force to dethrone Nvidia. Rather, the company's biggest immediate headwind is internal competition.
Many of America's most influential businesses have ordered and are using Nvidia's GPUs in their high-compute data centers. This includes Microsoft (NASDAQ: MSFT), Meta Platforms (NASDAQ: META), Amazon (NASDAQ: AMZN), Alphabet (NASDAQ: GOOGL)(NASDAQ: GOOG), Tesla, OpenAI, and AI infrastructure kingpin Super Micro Computer, which incorporates Nvidia's GPUs into its customizable rack servers.
For the three months ended July 28, 2024, four of Nvidia's customers -- it doesn't disclose their names -- accounted for 46% of its net sales, with a fifth contributing slightly less than 10%, likely in the high single digits. We can comfortably say that roughly half of Nvidia's revenue comes from just five customers. If I had to venture a guess, I'd expect those clients to be Microsoft, Meta, Amazon, Alphabet, and Super Micro (in no particular order).
The concern for Nvidia is that most of these top customers are developing AI-GPUs for use in their respective data centers.
Microsoft developed the Azure Maia AI chip, which can be used to train large language models and oversee generative AI solutions within the company's Azure cloud infrastructure platform.
Meta is developing its own "Meta Training and Inference Accelerator" chip in-house to maximize the potential of its AI data centers.
Amazon is developing a number of GPU chips, including Trainium2 and Inferentia, which will cater to clients of its world-leading cloud infrastructure service platform, Amazon Web Services (AWS).
Alphabet is relying on tensor processing units (TPUs) for training LLMs and inference. TPUs are application-specific integrated circuits custom-built for AI applications.
While various aspects of these internally developed chips might outperform Nvidia's Hopper and/or Blackwell, Nvidia's hardware should remain decisively superior in raw computing capability. However, that may not be enough to keep Nvidia from losing valuable data center real estate in the quarters to come.
Microsoft, Meta, Amazon, and Alphabet have very deep pockets and abundant operating cash flow. While they can all afford Nvidia's premium AI-GPUs, that doesn't mean they're happy paying a considerably higher price for this hardware. The order backlog for Nvidia's chips, coupled with the ease and lower cost of using internally developed AI-GPUs, will likely cause Nvidia to cede data center space in the future.
Remember, AI-GPU scarcity has been the leading catalyst that's driven Nvidia's exceptional pricing power. With its top customers developing hardware of their own, it's only a matter of time before this scarcity wanes and Nvidia's margins weaken.
Internal competition is, by far, the biggest threat to Nvidia's AI dominance.
Should you invest $1,000 in Nvidia right now?
Before you buy stock in Nvidia, consider this:
The Motley Fool Stock Advisor analyst team just identified what they believe are the 10 best stocks for investors to buy now… and Nvidia wasn’t one of them. The 10 stocks that made the cut could produce monster returns in the coming years.
Consider when Nvidia made this list on April 15, 2005... if you invested $1,000 at the time of our recommendation, you’d have $896,358!*
Stock Advisor provides investors with an easy-to-follow blueprint for success, including guidance on building a portfolio, regular updates from analysts, and two new stock picks each month. The Stock Advisor service has more than quadrupled the return of the S&P 500 since 2002*.
John Mackey, former CEO of Whole Foods Market, an Amazon subsidiary, is a member of The Motley Fool's board of directors. Suzanne Frey, an executive at Alphabet, is a member of The Motley Fool's board of directors. Randi Zuckerberg, a former director of market development and spokeswoman for Facebook and sister to Meta Platforms CEO Mark Zuckerberg, is a member of The Motley Fool's board of directors. Sean Williams has positions in Alphabet, Amazon, and Meta Platforms. The Motley Fool has positions in and recommends Advanced Micro Devices, Alphabet, Amazon, Meta Platforms, Microsoft, Nvidia, and Tesla. The Motley Fool recommends the following options: long January 2026 $395 calls on Microsoft and short January 2026 $405 calls on Microsoft. The Motley Fool has a disclosure policy.