46% of Nvidia's Revenue Came From 4 Mystery Customers Last Quarter

Nvidia (NASDAQ: NVDA) had a market capitalization of $360 billion at the start of 2023. Less than two years later, it's now worth over $3.4 trillion. Although the company supplies graphics processing units (GPUs) for personal computers and even cars, the data center segment has been the primary source of its growth over that period.

Nvidia's data center GPUs are the most powerful in the industry for developing and deploying artificial intelligence (AI) models. The company is struggling to keep up with demand from AI start-ups and the world's largest technology giants. While that's a great thing, there is a potential risk beneath the surface.

Nvidia's financial results for its fiscal 2025 second quarter (which ended on July 28) showed that the company increasingly relies on a handful of customers to generate sales. Here's why that could create vulnerabilities in the future.

GPU ownership is a rich company's game

According to a study by McKinsey & Company, 72% of organizations worldwide use AI in at least one business function. That number continues to grow, but most companies lack the financial resources (and the expertise) to build their own AI infrastructure. After all, one of Nvidia's leading GPUs can cost up to $40,000, and it often takes thousands of them to train an AI model.
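To see why that pricing shuts most companies out, here's a back-of-the-envelope sketch using the figures above. The cluster size is a hypothetical assumption for illustration, not a figure from the article:

```python
# Rough estimate of what a GPU training cluster costs, using the
# article's up-to-$40,000 per-GPU figure. The cluster size below is
# a hypothetical assumption chosen only to illustrate the scale.
GPU_PRICE_USD = 40_000   # upper-end price of a leading Nvidia GPU
GPU_COUNT = 10_000       # hypothetical cluster of "thousands" of GPUs

cluster_cost = GPU_PRICE_USD * GPU_COUNT
print(f"${cluster_cost:,}")  # prints $400,000,000
```

At that hypothetical scale, the hardware bill alone runs into the hundreds of millions of dollars, before power, networking, and staffing, which is why renting capacity from a cloud provider is the only realistic path for most businesses.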

Instead, tech giants like Microsoft (NASDAQ: MSFT), Amazon (NASDAQ: AMZN), and Alphabet (NASDAQ: GOOG) (NASDAQ: GOOGL) buy hundreds of thousands of GPUs and cluster them inside centralized data centers. Businesses can rent that computing capacity to deploy AI into their operations for a fraction of the cost of building their own infrastructure.

Cloud companies like DigitalOcean are now making AI accessible to even the smallest businesses using that same strategy. DigitalOcean allows developers to access clusters of between one and eight Nvidia H100 GPUs, enough for very basic AI workloads.

Affordability is improving. Nvidia's new Blackwell-based GB200 GPU systems are capable of performing AI inference at 30 times the pace of the older H100 systems. Each individual GB200 GPU is expected to sell for between $30,000 and $40,000, which is roughly the same price as the H100 when it was first released, so Blackwell offers an incredible improvement in cost efficiency.
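The cost-efficiency claim follows from simple arithmetic: roughly the same price per GPU, but about 30 times the inference throughput. A quick sketch, using the top of the quoted price range for both chips:

```python
# Cost-efficiency comparison implied by the figures above:
# similar per-GPU price, roughly 30x the inference throughput.
H100_PRICE = 40_000    # approximate H100 launch price, per the article
GB200_PRICE = 40_000   # top of the quoted $30,000-$40,000 range
SPEEDUP = 30           # GB200 inference pace relative to the H100

# Cost per unit of inference throughput (arbitrary units).
h100_cost_per_unit = H100_PRICE / 1
gb200_cost_per_unit = GB200_PRICE / SPEEDUP

improvement = h100_cost_per_unit / gb200_cost_per_unit
print(f"{improvement:.0f}x cheaper per unit of inference")  # prints 30x
```

In other words, if the price holds steady while throughput rises 30-fold, the effective cost of running a given inference workload falls by the same factor.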

That means the most advanced, trillion-parameter large language models (LLMs) -- which have so far been developed only by well-resourced tech giants and leading AI start-ups like OpenAI and Anthropic -- will become financially accessible to a broader range of developers. Still, it could be years before GPU prices fall enough for the average business to maintain its own AI infrastructure.