Meta Expands Chip Partnership with Broadcom to Realize AI Ambitions
Social media giant Meta is significantly expanding its collaboration with chip designer Broadcom to produce custom AI processors across multiple generations. The extended partnership, running through 2029, includes a commitment of over 1GW of computing power and aims to accelerate AI development across Meta's applications while reducing its reliance on NVIDIA.
Published: April 15, 2026
Social media giant Meta will partner with chip designer Broadcom Inc. to produce custom artificial intelligence (AI) processors for several generations. The expanded collaboration, announced today, extends the partnership through 2029 and includes an initial commitment of over 1GW (gigawatt, one million kilowatts) of computing power, enough to power approximately 750,000 average US homes. As part of the agreement, Broadcom CEO Hock Tan will step down from Meta's board of directors to serve in an advisory role for custom chip strategy, the companies said in a joint statement.

As demand for AI-driven computing surges, major technology companies such as Meta, Google, and Amazon are designing their own chips to reduce reliance on NVIDIA's expensive processors. This trend toward custom chip development has made Broadcom one of the biggest beneficiaries of generative AI: the company collaborates with clients to develop custom processors and also offers infrastructure software.

Meta CEO Mark Zuckerberg said the collaboration will help "build the massive compute foundation we need to deliver personalized superintelligence to billions of people." Meta, which unveiled the blueprint for four new chips just last month, stated that the initial computing power commitment with Broadcom is the "first phase of a sustained, multi-GW scale deployment." Broadcom's Ethernet technology will also be used to connect Meta's rapidly growing AI computing clusters.

The first chip in Meta's Training and Inference Accelerator (MTIA) program, named MTIA 300, is already powering Meta's ranking and recommendation systems, with three more generations expected by 2027. Subsequent generations of chips are designed for "inference," the process by which AI systems generate answers and perform tasks for users.

(Compiled by: Lee Pei-shan) 2026-04-15