Semiconductor company Nvidia (NVDA -1.29%) has seen its share price increase nearly 1,000% since the launch of ChatGPT in late 2022. That event was the big-bang moment for the artificial intelligence (AI) boom, and the subsequent surge in Nvidia shares reflects its critical position in the burgeoning AI economy.
Specifically, Nvidia dominates the market for data center graphics processing units (GPUs) and accounted for 98% of shipments last year. GPUs speed up complex workloads, like training AI models and running AI applications. Consequently, Nvidia has secured a monopoly-like market share in AI accelerators.
Despite that dominance, Nvidia bears frequently highlight custom AI chips from big technology companies like Amazon (AMZN -0.45%) and Alphabet (GOOGL 1.63%) (GOOG 1.67%) as cause for alarm. While both companies have designed custom silicon for their data centers, recent commentary should put Nvidia investors at ease.
Nvidia shareholders have little to fear from Amazon’s custom AI chips
Amazon Web Services (AWS) is the largest public cloud. It accounted for 31% of cloud infrastructure and platform-services spending in the third quarter, which is nearly as much market share as Microsoft Azure and Alphabet’s Google Cloud Platform combined. To better monetize demand for artificial intelligence, AWS has developed two custom AI chips.
Specifically, AWS Trainium is designed for training machine learning models, and AWS Inferentia is purpose-built to accelerate inference workloads. But AWS recently provided reassuring context for Nvidia shareholders. “We want to be absolutely the best place to run Nvidia,” said Vice President Dave Brown. “At the same time, we think it’s healthy to have an alternative.”
While AWS is clearly trying to grab market share, the attempt is half-hearted because the company is also leaning into its relationship with Nvidia. For instance, AWS was the first major cloud provider to offer Nvidia H200 GPUs, and management rarely mentions its Trainium or Inferentia chips without also highlighting the importance of its Nvidia partnership.
Amazon CEO Andy Jassy made that clear on the latest earnings call. “We have a very deep partnership with Nvidia. We tend to be their lead partner on most of their new chips,” he told analysts. “I expect us to have a partnership for a very long time.”
Nvidia shareholders have little to fear from Google’s custom AI chips
Alphabet’s Google Cloud Platform is the third-largest public cloud. It accounted for 13% of cloud infrastructure and platform-services spending in the third quarter. But the company is well-positioned to gain market share due to long-running investments in AI. Forrester Research recently recognized Google as a leader in AI infrastructure solutions.
Importantly, Google has been developing custom AI accelerators called tensor processing units (TPUs) for the past decade, and the chips have been deployed in its data centers since 2015. Even so, Nvidia GPUs remain the gold standard. In fact, several analysts estimate the company has as much as 80% to 95% market share in AI accelerators. That means Google has yet to dent Nvidia's dominance, despite trying for a decade.
Meanwhile, Google Cloud is simultaneously leaning into its relationship with Nvidia. Alphabet CEO Sundar Pichai made the following comment during the third-quarter earnings call: "We have a wonderful partnership with Nvidia. We're excited for the GB200s, and we'll be one of the first to provide it at scale." For context, the GB200 is a superchip that combines Grace CPUs and Blackwell GPUs, both designed by Nvidia.
Nvidia has key competitive advantages in software and vertical integration
Ultimately, potential competitors like Amazon and Alphabet's Google will run into a major problem. Nvidia not only builds faster AI accelerators but has also created an unparalleled ecosystem of software development tools. The company has spent the better part of two decades adding code libraries and pre-trained models to its CUDA platform.
The CUDA software platform lets programmers write GPU-accelerated applications across domains that range from autonomous machines to scientific simulation. Competitors will need to overcome that hurdle to have any chance of challenging Nvidia’s dominance in the AI accelerator market, and several analysts see that outcome as highly improbable, at least for the foreseeable future.
For instance, Joseph Moore at Morgan Stanley recently wrote, “In general, the market tends to underestimate the difficulty of competing with Nvidia, especially as they have moved to an annual product cadence.” Similarly, Toshiya Hari at Goldman Sachs recently wrote, “We believe Nvidia will remain the de facto industry standard for the foreseeable future given its competitive advantage that spans hardware and software capabilities.”
Beyond software, Nvidia has another important competitive advantage in its vertical integration. The company complements its GPUs with adjacent data center hardware, including CPUs, interconnects, and networking equipment. That comprehensive approach lets Nvidia build data center systems with the “lowest total cost of ownership,” according to CEO Jensen Huang.
Here’s the bottom line: At the present time, Nvidia shareholders have little to fear from custom chips designed by Amazon and Alphabet’s Google. Instead, the company is well-positioned to maintain its market leadership in AI accelerators.
Suzanne Frey, an executive at Alphabet, is a member of The Motley Fool’s board of directors. John Mackey, former CEO of Whole Foods Market, an Amazon subsidiary, is a member of The Motley Fool’s board of directors. Trevor Jennewine has positions in Amazon and Nvidia. The Motley Fool has positions in and recommends Alphabet, Amazon, Goldman Sachs Group, Microsoft, and Nvidia. The Motley Fool recommends the following options: long January 2026 $395 calls on Microsoft and short January 2026 $405 calls on Microsoft. The Motley Fool has a disclosure policy.