This Tech Giant Challenges Nvidia: Is the Artificial Intelligence (AI) Market About to Make a Sharp Turn?

When OpenAI released the ChatGPT chatbot about a year ago, it caught an unsuspecting world off guard and set a new zeitgeist in motion. Artificial intelligence (AI) is still the talk of the town, of Wall Street, and of Silicon Valley, and that's especially true for the generative AI technology that powers systems like ChatGPT.

Investors soon figured out that Nvidia (NASDAQ: NVDA) provided the AI acceleration hardware that made it all possible, with spectacular results. Nvidia's stock price has more than tripled in 2023, lifted by actual sales of AI-specific processors and the expectation of continued dominance in this red-hot market segment.

But Nvidia shouldn't rest on its laurels. It isn't the only chip designer in the market, and certainly not the only one with an interest in that lucrative AI opportunity.

The latest challenger to enter the ring and dispute Nvidia's AI mastery is Samsung Electronics (OTC: SSNL.F). The Korean tech titan has partnered with Naver (OTC: NHNC.F) -- an internet search and e-commerce giant from the same country -- to develop both hardware and software intended to match or beat the best tools available today.

Specifically, Samsung and Naver claim that the upcoming AI chip will be eight times as energy-efficient as Nvidia's H100 accelerator.

That's not the same thing as a straight-up performance record -- but a more power-efficient solution may actually pose an even greater threat to Nvidia's throne. Here's why.

The efficiency edge in AI computing

In the realm of high-performance AI computing, efficiency is key. Raw performance matters less than you might think, because you can always throw more hardware at the number-crunching problem.

The supercomputers that train ChatGPT-style AI systems are equipped with thousands of Nvidia A100 accelerators with nearly 7,000 processing cores each. The real challenge is to supply enough power to run these beasts and then cool down the resulting space heater. The OpenAI/Nvidia system draws 7.4 megawatts at full power, comparable to a packed cruise ship crossing the seas or a large steel mill.

Therefore, the AI giants are really looking for a power-sipping solution that can deliver better results per watt.

Samsung and Naver's claim of an AI chip that is eight times more energy-efficient than Nvidia's H100 could represent a paradigm shift. In a world increasingly conscious of energy consumption and cost, a more efficient chip doesn't just mean lower power bills; it means a smaller carbon footprint, a more compact physical installation, and the ability to deploy more powerful AI systems without prohibitive energy costs.
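To put those two figures together, here is a back-of-envelope sketch using only the numbers cited above: the 7.4-megawatt power draw of the OpenAI/Nvidia training system and Samsung and Naver's claimed 8x efficiency advantage. It simply asks how much power an equally capable system would draw if the efficiency claim holds; the result is illustrative, not a benchmark.

```python
# Back-of-envelope power comparison based on the figures in this article.
# Assumes equal AI throughput and that "8x more energy-efficient" means
# 8x the results per watt -- both are the claimants' numbers, not measurements.

NVIDIA_SYSTEM_POWER_MW = 7.4   # reported draw of the OpenAI/Nvidia supercomputer
CLAIMED_EFFICIENCY_GAIN = 8    # Samsung/Naver's claimed performance-per-watt edge

# Power needed to deliver the same throughput on the claimed hardware
equivalent_power_mw = NVIDIA_SYSTEM_POWER_MW / CLAIMED_EFFICIENCY_GAIN
print(f"Equivalent system power: {equivalent_power_mw:.3f} MW")  # 0.925 MW
```

If the claim pans out, a workload that once needed a cruise-ship-sized power budget could run on well under a megawatt, which is the crux of the threat to Nvidia described above.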