Meta locks Broadcom for custom AI chips through 2029 as Hock Tan exits board

Reviewed by Nidhi Govil


Meta and Broadcom announced a multi-year partnership extending through 2029 for custom AI accelerators, with initial deployment exceeding 1 gigawatt of computing capacity. Broadcom CEO Hock Tan is stepping down from Meta's board to avoid a conflict of interest but will continue advising on Meta's custom silicon roadmap as the social media giant pursues its AI infrastructure goals.

Meta and Broadcom Forge Long-Term AI Chips Partnership

Meta and Broadcom this week unveiled an expanded partnership that extends through 2029, positioning the chipmaker as a critical supplier for the social media giant's AI infrastructure goals [1]. Under the multi-year agreement, Broadcom will supply Meta with multiple generations of custom AI accelerators designed specifically for the Meta Training and Inference Accelerator program, commonly known as MTIA [3]. The partnership includes an initial commitment exceeding 1 gigawatt of computing capacity, with plans to scale to multiple gigawatts as Meta rapidly expands its data centers [2].

Source: SiliconANGLE

The deal represents a significant step in Meta's strategy to reduce reliance on expensive GPUs from Nvidia and AMD by developing in-house AI silicon tailored to specific workloads [5]. Mark Zuckerberg emphasized the scale of this commitment, stating that Meta is "partnering with Broadcom across chip design, packaging, and networking to build out the massive computing foundation we need to deliver personal superintelligence to billions of people" [1].

Hock Tan Departs Meta's Board Amid Expanded Deal

In a notable development tied to the partnership, Hock Tan announced he will step down from Meta's board of directors, which he joined in early 2024, to avoid a conflict of interest as he remains chief executive of Broadcom [2]. Tan will, however, transition into an advisory role in which he will continue to guide Meta's custom silicon roadmap and influence its future infrastructure investments [1]. Wolfe Research analyst Chris Caso called Tan's decision to step down significant, suggesting the partnership may run well beyond its stated duration [4].

Custom Silicon Strategy Accelerates

The custom AI accelerators will be built around Broadcom's foundational XPU platform, which combines custom differentiating silicon with standard logic, memory, and high-speed I/O to improve efficiency and lower costs [1]. Broadcom will also provide Ethernet networking solutions for scale-up, scale-out, and scale-across requirements, addressing Meta's comprehensive AI networking needs [1]. According to Broadcom, the chips will be the first AI silicon manufactured on a 2-nanometer process [5].

Source: Quartz

Meta introduced MTIA in 2023 and unveiled four new versions in March, demonstrating rapid iteration between chip generations [5]. The MTIA accelerators use modified RISC-V-based cores from Andes Technology for scheduling and orchestration, and the scale of this deal will likely benefit both Andes Technology and the broader RISC-V ISA ecosystem [1].

Source: Tom's Hardware

Wall Street Reacts to Strategic Positioning

Broadcom shares jumped nearly 4% in extended trading following the announcement, with analysts viewing the deal as further evidence of Broadcom's leading position in AI networking and custom accelerator platforms [4]. Goldman Sachs analyst James Schneider described Broadcom's XPU platform as "industry-leading" and noted that the deal gives the company a broader, more diverse customer base with exposure to both enterprise and consumer AI [4]. The partnership follows Broadcom's recent expanded deals with Google and Anthropic, positioning the company as a key enabler for hyperscalers pursuing custom chip strategies [4].

Implications for AI Infrastructure Competition

This silicon strategy sits within Meta's much larger capital plan: the company has said it could spend up to $135 billion on AI this year as it competes with Google, Amazon, Anthropic, and OpenAI [5]. Meta plans to deploy this mix of custom accelerators alongside GPUs from Nvidia and AMD across 31 data centers, including 27 in the US [5]. Unlike Google and Amazon, which expose their custom accelerators through cloud platforms, Meta uses its MTIA silicon entirely for internal workloads to power features across its platforms [5]. The shared roadmap to co-design and scale hardware aims to deliver real-time generative AI features to billions of users [3](https://www.reuters.com/business/meta-inks-deal-with-broadcom-custom-ai-chips-2026-04-14/).

© 2026 TheOutpost.AI All rights reserved