Meta’s TPU deal talks push Google’s stock up and challenge Nvidia’s AI chip dominance
The discussions reveal that Nvidia’s largest customers are exploring alternatives, signaling a potential shift in the AI hardware landscape.
For months, you could sense a subtle tension around Nvidia’s relationship with the tech giants that rely on its chips. The company’s biggest customers keep expanding their GPU orders, yet several of them have simultaneously been developing in-house chips to reduce their dependence on Nvidia.
This week, that background tension finally stepped into full view when The Information reported that Meta is in talks with Google to spend billions on Google’s tensor processing units (TPUs) for its data centres from 2027.
The market reacted immediately. Nvidia’s shares fell more than 2%, while Alphabet climbed over 4% as investors saw Meta’s interest in TPUs as validation of Google’s chip strategy. Meanwhile, AMD dropped more than 4%, reflecting concerns that a Meta–Google deal could intensify competition for its AI accelerators.
A single report shifting the stock prices of Nvidia, Alphabet, and AMD at the same time showed just how significant the development was. The prospect of Meta, which plans up to $72 billion in infrastructure spending this year, turning to Google’s TPUs introduced a new dynamic into a market that has long revolved around Nvidia.
Nvidia addresses the situation publicly
Shortly after the Meta–Google report gained traction, Nvidia posted a message on X that directly addressed the attention surrounding its position in the AI hardware market. The company congratulated Google, writing: “We’re delighted by Google’s success, they’ve made great advances in AI and we continue to supply to Google.”
It then reinforced its own stance, stating: “NVIDIA is a generation ahead of the industry, it’s the only platform that runs every AI model and does it everywhere computing is done.”
Nvidia also highlighted a technical distinction. Its GPUs are general-purpose processors capable of supporting a wide range of AI workloads, while Google’s TPUs are application-specific integrated circuits (ASICs) built for more specialised functions. The company described its hardware as offering greater “performance, versatility and fungibility,” signalling that Nvidia’s strength lies in flexibility and an ecosystem designed for broad adoption.
This was the company's clearest public acknowledgement of the growing hardware alternatives surrounding it.
Why Google’s TPUs are getting serious attention
Google’s TPUs have been gaining momentum in ways that are hard to ignore. Gemini 3, its newest AI model, was trained entirely on TPUs and received strong reviews on release, adding credibility to Google’s chip ecosystem. Anthropic also expanded its earlier deal with Google, signalling plans to use up to one million of the company’s chips. OpenAI reportedly tested Google’s hardware over the summer. Together, these developments placed TPUs firmly in conversations that used to be dominated by Nvidia alone.
And investors have noticed too. Research firm DA Davidson estimated in September that Google’s TPU business and DeepMind could be valued at around $900 billion. The firm also noted that several leading AI labs had shown “considerable interest” in purchasing TPUs outright, which adds weight to the timing of the Meta discussions.
The trend extends beyond Google. Amazon already operates its Trainium and Inferentia chips and recently rented out half a million of them to Anthropic. Microsoft introduced its Maia processor for AI workloads. These companies still purchase large volumes of Nvidia GPUs, but they’re also building in alternatives that reduce dependence on a single supplier.
At the same time, Nvidia has been navigating a period of scrutiny. Investor Michael Burry, known for predicting the 2008 financial crisis, publicly placed a bet against the company, comparing the AI market to the dot-com era. To counter this narrative, Nvidia issued a memo over the weekend stressing its transparency and strong fundamentals, and distancing itself from historical accounting scandals such as Enron and WorldCom. The company argued that its strategic investments are healthy, with portfolio companies generating real revenue and demand.

Nvidia still commands more than 90% of the AI chip market, reinforced by its CUDA ecosystem used by millions of developers worldwide. None of that disappears quickly. But the events of this week made something clear: Google, Amazon and Microsoft are shaping their own hardware futures, and Meta exploring TPUs places Google directly in Nvidia’s competitive lane.
AI infrastructure is entering a phase where multiple hardware paths can coexist, and the companies that once relied heavily on Nvidia are now creating real alternatives. The Meta–Google discussions revealed how quickly the foundation of AI computing can shift when the biggest players decide to widen their options.