Meta has expanded its partnership with Broadcom to co-develop multiple generations of its MTIA custom silicon, as the social media group accelerates the build-out of proprietary AI infrastructure to support its ambitions in personalised AI across its apps and services.
MTIA, Meta's custom accelerator family, is designed to handle inference, ranking and recommendation workloads across Facebook, Instagram and WhatsApp, as well as generative AI tasks, offloading those jobs from general-purpose graphics processing units to more cost-efficient dedicated hardware.
Meta said it plans to develop and deploy four new MTIA generations within the next two years, covering ranking, recommendation and generative AI workloads.
Under the expanded agreement, Broadcom will collaborate on chip design, advanced packaging and networking, working from its XPU platform to create custom accelerators optimised for Meta's infrastructure across successive silicon generations.
Broadcom's advanced Ethernet technologies will also underpin high-bandwidth networking across Meta's growing AI compute clusters.
The announcement includes an initial infrastructure commitment exceeding one gigawatt, described as the first phase of a sustained multi-gigawatt rollout, signalling the scale of investment Meta is committing to custom silicon over the coming years.
Mark Zuckerberg, Meta's founder and chief executive, said the company was partnering with Broadcom "across chip design, packaging, and networking to build out the massive computing foundation we need to deliver personal superintelligence to billions of people."
Broadcom president and chief executive Hock Tan said the expanded collaboration would help Meta "pioneer the next frontier of artificial intelligence."
The announcement also confirmed that Tan will step down from Meta's board of directors and move into an advisory role focused on guiding the custom silicon roadmap, formalising the increasingly operational relationship between the two companies.
Meta's approach to AI silicon reflects a portfolio strategy in which different accelerators are matched to specific workload types to optimise both performance and total cost of ownership, rather than relying on a single chip architecture across all tasks.
The company joins a growing list of large technology groups investing heavily in custom silicon to reduce dependence on Nvidia's graphics processing units and gain greater control over the performance and economics of their AI infrastructure.
The recap
- Meta expands Broadcom partnership to co-develop MTIA chips
- Agreement includes an initial commitment exceeding one gigawatt
- Meta will deploy four MTIA generations within the next two years