
Broadcom CEO Hock Tan.
Meta and Broadcom have deepened their strategic alliance with an extension of their partnership focused on developing Meta’s proprietary in-house AI accelerators. The new agreement, running through 2029, underscores the central role custom silicon plays in hyperscalers’ aggressive pursuit of advanced artificial intelligence capabilities.
Adding a notable development to this expanded collaboration, Broadcom CEO Hock Tan announced his decision not to seek reelection to Meta’s board, a move disclosed by Meta last week. Tan, who joined Meta’s board in 2024, will depart after his current term, signaling a sharpened focus on his leadership at Broadcom amidst increasingly complex semiconductor supply chains and evolving industry demands.
The cornerstone of the renewed partnership is Meta’s commitment to deploy an initial 1 gigawatt of its Meta Training and Inference Accelerator (MTIA) chips. This marks a major step-up in Meta’s internal AI infrastructure, and the agreement paves the way for the eventual deployment of multiple gigawatts of chips powered by Broadcom’s technology. That scale is essential for the computationally intensive workloads involved in training and deploying large-scale AI models.
Crucially, the next generation of Meta’s MTIA chips will leverage a cutting-edge 2-nanometer process technology, as highlighted by Broadcom. The advanced node is a key enabler of higher performance, improved power efficiency, and greater transistor density, all paramount for modern AI workloads. Manufacturing on such an advanced process node is a testament to the depth of the collaboration between Meta and Broadcom.
Mark Zuckerberg, Meta’s co-founder and CEO, emphasized the comprehensive nature of the partnership, stating, “Meta is partnering with Broadcom across chip design, packaging, and networking to build out the massive computing foundation we need to deliver personal superintelligence to billions of people.” This statement points to a holistic approach, encompassing not only the core chip architecture but also the intricate processes of chip packaging and the high-speed networking required to interconnect these powerful AI systems. The ambition is to create a robust and scalable infrastructure capable of supporting a new era of AI-driven services.
Following the announcement, Broadcom shares saw a positive reaction, climbing 3% in after-hours trading. Meta’s stock remained relatively flat, reflecting the market’s existing expectations regarding its AI ambitions.
Addressing potential market speculation, Tan reiterated on Broadcom’s March earnings call, “Contrary to recent analyst reports, Meta’s custom accelerator, MTIA roadmap is alive and well. We’re shipping now and, in fact, for the next generation XPUs, we will scale to multiple gigawatts in 2027 and beyond.” This statement serves to reassure investors and the market about the continued viability and planned expansion of Meta’s custom AI silicon strategy.
Meta has been actively developing its own AI silicon, unveiling four new iterations of its MTIA chips in March. This initiative, first introduced in 2023, mirrors similar custom chip development programs at other major tech giants like Google and Amazon, highlighting a broader industry trend towards vertical integration in AI hardware. These custom ASICs (Application-Specific Integrated Circuits) are designed to be more power-efficient and cost-effective for specific AI tasks compared to general-purpose GPUs.
The hyperscale computing landscape is currently characterized by a significant demand for alternatives to the high-cost and supply-constrained graphics processing units (GPUs) from Nvidia and AMD. To meet the insatiable appetite of AI data centers, companies are increasingly investing in the design and deployment of their own ASICs. While these specialized chips offer tailored performance for specific AI workloads, they typically trade off the broad applicability of GPUs for optimized efficiency.
Google pioneered the custom ASIC race with its Tensor Processing Unit (TPU) in 2015, followed by Amazon’s custom chip announcements in 2018. Unlike Google and Amazon, which integrate their AI chips into their respective cloud computing platforms for customer access, Meta’s MTIA chips are primarily intended for internal use, fueling its vast social media and metaverse initiatives.
This extended partnership with Meta comes on the heels of Broadcom’s recent announcement of a long-term agreement with Google for the production of its TPUs, and a commitment to provide Anthropic with 3.5 gigawatts of these chips. This demonstrates Broadcom’s growing importance as a key manufacturing and development partner for leading AI players across the ecosystem.
Broadcom’s strategic positioning in the AI hardware market has been reflected in its stock performance. Year-to-date in 2026, Broadcom shares have surged by approximately 10%, significantly outperforming the broader S&P 500 index, which has seen a gain of around 2% over the same period.
In an unrelated board change, Tracey Travis, who retired as Estée Lauder’s finance chief last year, will be stepping down from Meta’s board. Travis had joined Meta’s board in 2020.
Meta’s aggressive investment in AI hardware, including this deal with Broadcom, is part of its broader strategy. Since announcing in January a commitment to spend up to $135 billion on AI this year, Meta has been actively securing AI computing resources. These include substantial commitments for AMD GPUs (up to 6 gigawatts), millions of Nvidia chips, and new custom silicon from Arm Holdings, signaling a diversified approach to building its AI infrastructure and staying competitive against rivals and emerging AI labs.
To support this ambitious AI build-out, Meta is expanding its data center footprint, with plans for 31 new facilities, 27 of which will be located within the United States. This expansion is critical for housing the immense computational power required for its AI-driven future.
Original article, Author: Tobias. If you wish to reprint this article, please indicate the source: https://aicnbc.com/20646.html