Google’s TPUs: A Decade-Long Investment Fueling Their AI Dominance

Nvidia dominates the AI chip market, but Google is emerging as a silicon contender with its Tensor Processing Units (TPUs). Ironwood, Google's seventh-generation TPU, delivers more than a fourfold performance increase over its predecessor and targets demanding AI workloads; AI startup Anthropic plans to deploy up to 1 million of the chips. TPUs give Google efficiency advantages and are helping drive its cloud growth. While Amazon and Microsoft are developing custom chips of their own, Google leads in deploying such silicon at scale, with potential for significant cloud market impact. The company is even exploring space-based solar power for TPUs.


Nvidia has cemented its position as the dominant force in the artificial intelligence chip market, commanding a $4.5 trillion market capitalization by supplying silicon to virtually every major technology firm.

Google, a significant consumer of Nvidia’s technology, relies heavily on the chipmaker’s graphics processing units (GPUs) to address the accelerating need for AI processing capacity in the cloud.

While Google’s reliance on Nvidia GPUs isn’t expected to diminish anytime soon, the tech giant is showcasing its dual role as a consumer and a developer of high-performance silicon.

Google recently unveiled its most advanced chip to date, Ironwood, slated for broad availability in the coming weeks. Ironwood represents the seventh generation of Google’s Tensor Processing Unit (TPU), a custom silicon project that has been underway for over a decade.

TPUs are application-specific integrated circuits (ASICs) that deliver highly specialized and efficient hardware for particular tasks in AI. Google emphasizes that Ironwood is engineered to tackle the most demanding AI workloads, encompassing the training of extensive models and the operation of real-time chatbots and AI agents, boasting a performance increase of over four times its predecessor. AI startup Anthropic intends to deploy up to 1 million Ironwood TPUs to power its Claude model.
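As a rough illustration of what targeting this kind of accelerator looks like in practice, the sketch below uses JAX, the framework Google pairs with Cloud TPUs. The same XLA-compiled program runs unchanged on CPU, GPU, or TPU, which is what makes the underlying hardware largely transparent to model code; the toy layer and shapes here are illustrative, not drawn from the article.

```python
# Minimal sketch: the same JAX program runs unchanged on CPU, GPU, or TPU.
# On a Cloud TPU VM, jax.devices() would report TpuDevice entries instead.
import jax
import jax.numpy as jnp

@jax.jit  # compile via XLA for whatever accelerator is attached
def predict(w, x):
    return jnp.tanh(x @ w)  # a toy dense layer

x = jnp.ones((8, 128))
w = jnp.zeros((128, 16))
y = predict(w, x)
print(y.shape)        # (8, 16)
print(jax.devices())  # e.g. [CpuDevice(id=0)] here; TpuDevice(...) on a TPU VM
```

The design point is that specialization lives in the compiler and hardware, not in the user's model code.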

For Google, TPUs provide a distinct advantage amidst the hyper-competitive landscape of hyperscale data center construction, where the production rate of AI processors struggles to keep pace with market demand. While other cloud providers are pursuing similar strategies, their efforts lag behind Google’s.

Amazon Web Services (AWS) released its initial cloud AI chip, Inferentia, to its customers in 2019, followed by Trainium three years later. Microsoft unveiled its custom AI chip, Maia, only recently, at the end of 2023.

“Among ASIC players, Google stands out as the only one that has truly deployed this technology at scale,” commented Stacy Rasgon, a semiconductor analyst at Bernstein. “For other major players, it entails a prolonged effort involving substantial resources and capital. They are the furthest along among the other hyperscalers.”

Google declined to comment for this story.

Originally designed for internal workloads, Google’s TPUs have been accessible to cloud customers since 2018. Nvidia has demonstrated concern over this development. Following OpenAI’s initial cloud agreement with Google earlier this year, Nvidia CEO Jensen Huang initiated additional discussions with the AI startup and its CEO, Sam Altman, according to reports.

Unlike Nvidia, Google doesn’t market its chips as standalone hardware. Instead, it provides access to TPUs as a service through its cloud platform, which has become a significant driver of the company’s growth. Alphabet, Google’s parent company, reported a 34% year-over-year increase in cloud revenue to $15.15 billion in its recent third-quarter earnings report, surpassing analyst expectations. The company closed the quarter with a business backlog of $155 billion.

“We are observing substantial demand for our AI infrastructure products, encompassing both TPU-based and GPU-based solutions,” stated CEO Sundar Pichai during the earnings call. “This demand has been a key factor in our growth trajectory over the past year, and we anticipate continued strong demand moving forward, prompting ongoing investment to meet it.”

Google does not provide a detailed breakdown of its TPU business within its cloud segment. However, analysts at D.A. Davidson estimated in September that a “standalone” business comprising TPUs and Google’s DeepMind AI division could be valued at approximately $900 billion, a significant increase from their January estimate of $717 billion. Alphabet’s current market capitalization exceeds $3.4 trillion.

‘Tightly targeted’ chips

Customization serves as a primary differentiator for Google. One critical advantage highlighted by analysts is the enhanced efficiency TPUs provide to customers in comparison to alternative products and services.

“They are truly crafting chips that are precisely tailored for their anticipated workloads,” noted James Sanders, an analyst at Tech Insights.

Rasgon said efficiency will become increasingly critical, because the scale of the infrastructure build-out suggests that "the likely bottleneck is not chip supply, but power." He emphasized that advances in chip design and cooling technologies are crucial to overcoming those limits.
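Rasgon's point can be made concrete with back-of-the-envelope arithmetic: under a fixed site power budget, total throughput scales with performance per watt rather than raw chip count. The numbers below are hypothetical, chosen only to show the relationship.

```python
# Back-of-the-envelope sketch with hypothetical numbers: under a fixed power
# budget, aggregate throughput scales with performance-per-watt, not chip count.
def total_throughput(budget_mw: float, chip_watts: float, chip_tflops: float) -> float:
    chips = (budget_mw * 1_000_000) / chip_watts  # how many chips the site can power
    return chips * chip_tflops                    # aggregate TFLOPs across the site

baseline  = total_throughput(budget_mw=100, chip_watts=400, chip_tflops=100)
efficient = total_throughput(budget_mw=100, chip_watts=400, chip_tflops=150)
print(efficient / baseline)  # 1.5: 50% more work from the same 100 MW site
```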

On Tuesday, Google announced Project Suncatcher, an initiative exploring “how an interconnected network of solar-powered satellites, equipped with our Tensor Processing Unit (TPU) AI chips, could harness the full power of the Sun.” This ambitious project aims to leverage space-based solar energy to power computationally intensive AI workloads, potentially reducing reliance on terrestrial resources.

As part of the project, Google intends to launch two prototype solar-powered satellites carrying TPUs by early 2027.

“This approach holds tremendous potential for scalability while minimizing the impact on terrestrial resources,” the company stated. “It will enable us to test our hardware in orbit, laying the groundwork for a future era of massively scaled computation in space.”

Dario Amodei, co-founder and chief executive officer of Anthropic, at the World Economic Forum in 2025.


Google’s largest TPU deal to date materialized last month, with the announcement of a substantial expansion of its agreement with OpenAI rival Anthropic, valued in the tens of billions of dollars. Through this partnership, Google is projected to bring well over a gigawatt of AI processing capacity online in 2026.

“Anthropic’s decision to significantly increase its utilization of TPUs reflects the strong price-performance and efficiency its teams have experienced with TPUs over several years,” Google Cloud CEO Thomas Kurian stated at the time of the announcement.

Google has invested $3 billion in Anthropic. While Amazon remains Anthropic’s primary cloud partner, Google is now offering the foundational infrastructure to support the next generation of Claude models.

“The demand for our models is so significant that a multi-chip strategy is essential to meeting our capacity demands,” Mike Krieger, Chief Product Officer at Anthropic, told CNBC.

Anthropic’s strategy incorporates TPUs, Amazon Trainium, and Nvidia GPUs to optimize for cost, performance, and redundancy. Krieger noted that Anthropic invested heavily upfront to ensure seamless interoperability across these different silicon providers.
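One common way to achieve that kind of interoperability, sketched here purely as a hypothetical illustration (this is not Anthropic's actual code), is a thin backend registry that keeps vendor-specific launch details out of model code:

```python
# Hypothetical illustration: a thin registry abstracting over silicon vendors,
# so model code calls run_job() without knowing which hardware serves it.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Backend:
    name: str
    launch: Callable[[str], str]  # takes a job spec, returns a status string

REGISTRY: Dict[str, Backend] = {}

def register(backend: Backend) -> None:
    REGISTRY[backend.name] = backend

def run_job(backend_name: str, job_spec: str) -> str:
    # Only the registry knows about vendors; callers stay hardware-agnostic.
    return REGISTRY[backend_name].launch(job_spec)

# Stub launchers standing in for TPU, Trainium, and GPU integrations.
for name in ("tpu", "trainium", "gpu"):
    register(Backend(name, lambda spec, n=name: f"{n}:{spec}:scheduled"))

print(run_job("tpu", "train-shard-0"))  # tpu:train-shard-0:scheduled
```

The upfront cost Krieger describes is precisely in building and maintaining this kind of layer so that capacity on any vendor's chips can be used interchangeably.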

“That investment is paying off now that we’re able to come online with these massive data centers and meet customers where they are,” Krieger said.

Hefty spending is coming

Prior to the Anthropic deal, Google established a six-year cloud agreement with Meta, worth more than $10 billion, although the precise allocation of TPU utilization within the arrangement remains unclear. While OpenAI has indicated that it will utilize Google’s cloud as part of its diversification away from Microsoft, the company has stated it is planning to use GPUs.

Alphabet CFO Anat Ashkenazi attributed Google’s cloud momentum in the latest quarter to escalating enterprise demand for Google’s comprehensive AI stack. The company reported that it secured more billion-dollar cloud deals in the first nine months of 2025 than in the preceding two years combined.

“Within GCP, we are witnessing robust demand for enterprise AI infrastructure, including TPUs and GPUs,” Ashkenazi stated, adding that users are also leveraging the company’s latest Gemini offerings, as well as services “such as cybersecurity and data analytics.”

Amazon, which recently reported 20% growth in its market-leading cloud infrastructure business, has echoed the sentiment.

AWS CEO Matt Garman said the company’s Trainium chip series is gaining momentum, noting that “every Trainium 2 chip we land in our data centers today is getting sold and used,” and promising further performance and efficiency gains with Trainium 3, underscoring Amazon’s commitment to custom silicon.

Shareholders appear willing to accept substantial investments in cloud infrastructure and AI development.

Google has adjusted its capital expenditures forecast for the year to $93 billion, an increase from the prior guidance of $85 billion, with a projected increase in 2026. The stock price grew 38% in the third quarter, the best performance for the period in two decades, and is currently up 17% in the fourth quarter.

Mizuho emphasizes Google’s distinct cost and performance advantages with TPUs, noting that while the chips were initially intended for internal use, Google is now attracting external customers and supporting larger workloads.

Morgan Stanley analysts suggested that while Nvidia is likely to remain the dominant AI chip provider, developers’ growing familiarity with TPUs could become a significant driver of Google Cloud growth.

Analysts at D.A. Davidson suggest that Google should consider offering TPUs “externally to customers,” including frontier AI labs, due to immense demand.

“We continue to believe that Google’s TPUs remain the best alternative to Nvidia, with the gap between the two closing significantly over the past 9-12 months,” they wrote. “During this time, we’ve seen growing positive sentiment around TPUs.”


Original article, Author: Tobias. If you wish to reprint this article, please indicate the source:https://aicnbc.com/12499.html
