Can Jensen Huang maintain Nvidia’s grip on AI as competitors rise amid geopolitical tensions?
Amazon, Google, and China are testing the limits of Nvidia’s grip on AI.

Nvidia has a chokehold on the AI GPU market, controlling 94% of it as of Q2 2025 — up 2% from the previous quarter. It’s also projected to generate $49 billion in AI-related revenue this year alone, nearly 40% higher than last year, which is a staggering increase by any measure.
If that’s not impressive enough, Nvidia recently hit a $4.6 trillion market cap, having already become the first company ever to cross the $4 trillion mark. Microsoft followed in July 2025, becoming the second to reach that milestone.
Nvidia’s GPUs power the overwhelming majority of today’s AI, but the company’s dominance is apparent in more subtle ways, too. Both Google and Amazon reportedly gave CEO Jensen Huang a courtesy call to brief him on their own chip plans before going public, showing just how much influence Nvidia still holds across the industry.
Why Amazon and Google still bow to Nvidia
It’s unusual to brief a potential future competitor on your plans, but I guess chivalry isn’t dead. Both Amazon and Google still rely heavily on Nvidia’s GPUs for most of their AI workloads, according to Tom's Hardware, yet both companies have reportedly informed the giant of their plans for custom chips.
CUDA, Nvidia’s software toolkit for building and running AI models on its GPUs, makes this reliance even stronger. Once developers build around CUDA, switching to something like Google’s tensor processing units (TPUs) or AMD’s GPUs isn’t straightforward. It would mean rewriting huge amounts of software from scratch, creating massive costs and delays. That reality all but secures Nvidia’s position in the AI space for years to come.
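For readers curious what that lock-in looks like in practice, here’s a minimal, illustrative CUDA sketch (not taken from any real AI framework): every call in it is Nvidia-specific, which is why porting a large CUDA codebase to TPUs or AMD GPUs means rewriting rather than simply recompiling.

```cuda
#include <cuda_runtime.h>
#include <cstdio>

// Trivial element-wise add kernel. This syntax only compiles with Nvidia's
// nvcc toolchain and only runs on Nvidia GPUs.
__global__ void add(const float* a, const float* b, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    float *a, *b, *out;
    // cudaMallocManaged, the <<<grid, block>>> launch syntax, and
    // cudaDeviceSynchronize are all CUDA-specific APIs with no drop-in
    // equivalent on TPUs or AMD hardware: this is the lock-in.
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&out, bytes);

    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    add<<<(n + 255) / 256, 256>>>(a, b, out, n);
    cudaDeviceSynchronize();

    printf("out[0] = %.1f\n", out[0]);  // expected: 3.0

    cudaFree(a);
    cudaFree(b);
    cudaFree(out);
    return 0;
}
```

Real AI frameworks sit many layers above code like this, but the dependency is the same: the GPU-facing pieces are written against Nvidia’s APIs, so moving off them is a rewrite, not a recompile.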
Letting Nvidia know ahead of time about new chips doesn’t erase that competitive tension, but it softens the blow. With Nvidia still dominating the market, keeping a good relationship with Jensen Huang and his company is probably the safest move for now. And whilst no specific chip details were shared, the calls do show a keen interest in moving away from Nvidia as the sole provider in the AI space.
Geopolitics and competition — China’s ban and rival chips
It feels like the AI race escalates every week lately, and even Nvidia’s CEO has admitted China is only “nanoseconds” behind. That race just got sharper, with China’s Cyberspace Administration ordering companies like ByteDance and Alibaba to stop buying and testing Nvidia’s RTX Pro 6000D chips.
The order amounts to a full ban. The RTX Pro 6000D was designed specifically to comply with U.S. export rules, yet regulators in China say domestic processors from Huawei and Cambricon now match, or even beat, what Nvidia can offer in that category.
The ban hasn’t stopped Alibaba from announcing a partnership with Nvidia, however.
It could still be a setback for Nvidia, though perhaps not a shocking one, given the ongoing tension between the U.S. and China over trade and tariffs. Even so, China had been seen as a $50 billion market opportunity, and losing that kind of growth potential is no small thing.
Competitors gaining ground?
Whilst no specific chips have been named in the report, we can still look at what Amazon and Google have been working on, or at least what’s publicly known.
Amazon’s Trainium 3, revealed in December 2024, is a custom chip designed for training large AI models, built in partnership with Anthropic to create supercomputers that don’t rely on Nvidia. It’s an impressive partnership, too: Anthropic is one of the companies competing with OpenAI and has also recently partnered with Microsoft to bring its Claude models to Microsoft’s suite of Office apps. Amazon has gone even further here, investing a total of $8 billion in Anthropic and securing it as a core partner, with Anthropic now calling AWS its primary cloud and training platform.
Amazon also controls AWS, the world’s largest cloud provider, meaning it can push Trainium 3, or whatever comes next, onto customers at lower prices and at massive scale.
On the Google side of things, its TPUs have been in development for nearly a decade and are now in their seventh generation, called Ironwood. The latest version brings big leaps in both training and inference performance over earlier TPUs. But again, it comes back to developers being locked into CUDA: despite Ironwood’s strength, adoption outside of Google is still limited, with many hesitant to move away from Nvidia’s ecosystem.
