Samsung and AMD have signed a memorandum of understanding to deepen their collaboration on artificial intelligence memory and computing, with Samsung's Pyeongtaek facility in South Korea serving as the primary production site.
The agreement covers two distinct product lines.
Samsung will supply HBM4, the latest generation of high-bandwidth memory used in AI training systems, for AMD's upcoming Instinct MI455X graphics processing unit. It will also supply advanced DDR5 memory optimised for AMD's sixth-generation EPYC server processors, codenamed Venice, which are designed for the company's Helios rack-scale architecture.
HBM4, the fourth generation of high-bandwidth memory, is a chip-stacking technology that moves data between memory and processor at very high speeds, a critical requirement for the large matrix calculations involved in training and running AI models.
Samsung described its HBM4 as the industry's first such product entering mass production, built on a 10-nanometer-class manufacturing process with a 4-nanometer logic base die, delivering per-pin data rates of up to 13 gigabits per second and bandwidth of up to 3.3 terabytes per second.
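The two headline figures are consistent with each other. As a minimal sanity check, assuming the JEDEC HBM4 interface width of 2,048 bits per stack (an assumption not stated in the article), the per-pin rate multiplied by the bus width recovers the quoted bandwidth:

```python
# Sanity-check the quoted HBM4 figures.
# Assumption: 2048-bit interface per stack (JEDEC HBM4), not stated in the article.
pin_rate_gbps = 13        # quoted per-pin data rate, in gigabits per second
interface_bits = 2048     # assumed HBM4 bus width per stack

# Total bandwidth: bits per second across the bus, converted to bytes.
bandwidth_gb_s = pin_rate_gbps * interface_bits / 8   # gigabytes per second

print(f"{bandwidth_gb_s / 1000:.1f} TB/s")  # -> 3.3 TB/s, matching the article
```

The arithmetic (13 Gbps × 2,048 bits ÷ 8 bits per byte ≈ 3,328 GB/s) lines up with the 3.3 TB/s Samsung cites.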
Lisa Su, chief executive of AMD, said integration across the full computing stack from silicon to rack was essential to delivering AI performance at scale.
The two companies also said they would discuss foundry partnership opportunities, meaning Samsung could potentially manufacture future AMD chip designs at its own fabrication plants.
The announcement builds on nearly two decades of collaboration between the two companies, including Samsung's supply of HBM3E memory for AMD's existing Instinct MI350X and MI355X accelerators.
The deal arrives at a moment of intense competition in AI memory, with Samsung racing to close the gap on SK Hynix, which currently dominates high-bandwidth memory supply to Nvidia.
The recap
- Samsung and AMD sign MOU to expand AI memory collaboration
- Samsung HBM4 supports up to 13 Gbps and 3.3 TB/s
- Companies to align HBM4 supply and DDR5 for Helios platform