Compute advantage isn't everything when it comes to artificial intelligence (AI)-accelerated data centers.
Roughly 30 years ago, the arrival of the internet changed everything for corporate America. Although it took time for businesses to fully realize the potential that e-commerce brought to the table, it proved to be a game-changing innovation.
Since the mid-1990s, an assortment of next-big-thing technologies and trends have come and gone, seemingly all of which had the potential to change the long-term growth trajectory for businesses. Following three decades of trends struggling in the internet's shadow, artificial intelligence (AI) has emerged as the next great leap forward in innovation.
With AI, software and systems are tasked with handling jobs that humans would normally undertake or oversee. What gives the AI revolution such eye-popping potential is the capacity for AI-driven software and systems to learn without human intervention and evolve over time. This evolution can include becoming more proficient at assigned tasks, or perhaps learning new skills/jobs.
Although estimates vary wildly, which is perfectly normal for early-stage innovations, the analysts at PwC released a report last year that called for AI to add $15.7 trillion to the global economy by 2030. An addressable market this massive means multiple companies can be big-time winners.
Arguably no company has taken the bull by the horns more than semiconductor goliath Nvidia (NVDA).
Nvidia's operating expansion has been virtually flawless
Even factoring in Nvidia's nearly 20% pullback since hitting an all-time intraday high of more than $140 per share on June 20, the company has gained $2.45 trillion in market value in less than 19 months. No market leader has added this much value so quickly, which is likely why the company's board approved a historic 10-for-1 stock split in late May.
Nvidia's data center dominance explains why it's made history.
According to the analysts at TechInsights, Nvidia accounted for 3.76 million of the 3.85 million graphics processing units (GPUs) that were shipped to data centers last year. For those of you keeping score at home, this represents a 98% market share, which is effectively a monopoly on the "brains" powering decision-making in AI-accelerated data centers.
In addition to its first-mover advantages, Nvidia has seemingly set itself up for long-term success by maintaining its compute dominance. Currently, demand for its H100 GPU is off the charts. While competitors are busy trying to outpace the H100, Nvidia is preparing to launch its next-generation GPU platform, known as Blackwell, during the second half of 2024. Blackwell offers accelerated computing capabilities in a half-dozen arenas, including generative AI solutions and quantum computing.
In June, CEO Jensen Huang also teased the potential of Nvidia's upcoming Rubin architecture, which is expected to be released in 2026 and will run on a new processor, known as Vera.
The company's CUDA platform is playing a key role in its success, too. CUDA is the software platform developers use to program Nvidia's GPUs, including to build and train large language models. Though a lot of emphasis is (rightly) placed on Nvidia's industry-leading hardware in AI data centers, its software is helping to keep customers contained within its ecosystem of products and services.
The final piece of the puzzle for Nvidia has been its exceptional pricing power, which has been fueled by AI-GPU demand demonstrably outpacing supply. When demand for a good or service outstrips supply, it's perfectly normal for its price to head higher. Over the previous five quarters (through April 28, 2024), Nvidia's adjusted gross margin has expanded by close to 14 percentage points to 78.4%.
Nvidia is about to run headfirst into a data center "real estate" problem
There's absolutely no question that Wall Street's leading tech companies and most influential businesses are aggressively investing in AI-accelerated data centers. What remains to be seen is if Nvidia continues to win a monopoly-like share of the "real estate" devoted to GPUs in high-compute data centers.
This year marks the first real competition that Nvidia's hardware is going to contend with. Advanced Micro Devices is expanding production of its MI300X GPU, which sells for considerably less per chip than Nvidia's H100. Meanwhile, Intel is also rolling out its Gaudi 3 AI-accelerating GPU on a broader basis in the second half of 2024.
Although AMD's MI300X and Intel's Gaudi 3 offer advantages of their own over the H100, chiefly on price, Nvidia's chips are expected to retain their compute advantage.
But here's the thing that investors are overlooking: Compute advantage isn't everything in AI-driven data centers.
Nvidia is currently unable to fulfill the massive demand for its chips. With AMD and Intel entering the picture, impatient buyers are liable to flock to their considerably less expensive hardware. This means less real estate in high-compute data centers for Nvidia to claim.
And it's not just external competitors that Nvidia has to worry about. In fiscal 2024, Microsoft, Meta Platforms, Amazon, and Alphabet collectively accounted for around 40% of Nvidia's net sales. While it's great news that Wall Street's most influential businesses are using Nvidia's chips to run generative AI solutions and train large language models, all four of these companies are also developing AI chips for use in their data centers.
Microsoft's Azure Maia 100, Alphabet's Trillium, Amazon's Trainium2, and the Meta Training and Inference Accelerator are all expected to complement Nvidia's GPUs in each company's respective AI data center. Even with Nvidia maintaining its compute edge, these complementary chips will remove valuable data center real estate from the equation. In plainer English, there's going to be less of a need for Nvidia's AI-GPUs moving forward.
AI-GPU scarcity is the catalyst responsible for driving Nvidia's pricing power into the heavens for more than a year. As new chips flood the market and begin securing valuable data center real estate, it's a virtual given that Nvidia's pricing power, along with its adjusted gross margin, will fade.
I'll also add that most businesses currently lack a clear plan for how to monetize their AI investments and generate a positive return on them.
As enterprise data centers evolve and mature, Nvidia is highly likely to lose share. For its investors, this is a recipe for ample downside to come.
Suzanne Frey, an executive at Alphabet, is a member of The Motley Fool's board of directors. Randi Zuckerberg, a former director of market development and spokeswoman for Facebook and sister to Meta Platforms CEO Mark Zuckerberg, is a member of The Motley Fool's board of directors. John Mackey, former CEO of Whole Foods Market, an Amazon subsidiary, is a member of The Motley Fool's board of directors. Sean Williams has positions in Alphabet, Amazon, Intel, and Meta Platforms. The Motley Fool has positions in and recommends Advanced Micro Devices, Alphabet, Amazon, Meta Platforms, Microsoft, and Nvidia. The Motley Fool recommends Intel and recommends the following options: long January 2025 $45 calls on Intel, long January 2026 $395 calls on Microsoft, short August 2024 $35 calls on Intel, and short January 2026 $405 calls on Microsoft. The Motley Fool has a disclosure policy.