Samsung nears Nvidia approval for HBM4 chips, closing gap with SK Hynix in AI memory race


Samsung has entered the final qualification phase with Nvidia for its next-generation HBM4 AI memory chips, preparing for mass production in February. The Korean tech giant's HBM4 offers pin speeds of up to 11Gbps, exceeding the industry standard, and is set to power Nvidia's upcoming Vera Rubin AI platform, marking a significant comeback after trailing rivals SK Hynix and Micron.

Samsung Advances Toward Nvidia Supply Deal with HBM4 Breakthrough

Samsung has made substantial progress in securing certification from Nvidia for its next-generation HBM4 memory chips, entering the final qualification phase after supplying initial samples in September [1]. The Suwon-based company is preparing for mass production of HBM4 in February, positioning itself to join rivals SK Hynix and Micron in supplying components for advanced AI accelerators [1]. Samsung's shares gained as much as 3.2% in Seoul following the news, while SK Hynix's stock declined by a similar margin [1].

Source: Benzinga


The development represents a critical turning point for Samsung, which has struggled in the high-bandwidth memory market in recent years. According to multiple reports, Samsung has passed all verification stages for both Nvidia and Advanced Micro Devices, with shipments expected to begin as early as next month [3][5]. The Korea Economic Daily reported that Samsung has completed its qualification tests, an early foray into supplying HBM4 AI memory chips for next-generation platforms [5].

HBM4 Specifications Exceed Industry Standards for Vera Rubin AI Platform

Samsung's HBM4 modules deliver pin speeds of up to 11Gbps, significantly surpassing the JEDEC standard of 8Gbps [2][4]. Nvidia specifically requested these higher speeds to meet the demands of its Vera Rubin platform, which CEO Jensen Huang confirmed is already in full production [3]. The modules pair 6th-generation 10nm-class DRAM with a 4nm logic base die, providing superior bandwidth and energy efficiency for AI applications [2].
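To put the 11Gbps figure in context, a rough per-stack bandwidth estimate can be derived from the pin speed and the interface width. The sketch below assumes JEDEC's 2048-bit HBM4 data interface, which is not stated in the article itself:

```python
# Rough peak-bandwidth estimate for a single HBM4 stack.
# Assumption (not from the article): JEDEC HBM4 defines a
# 2048-bit-wide data interface, i.e. 2048 data pins per stack.
BUS_WIDTH_BITS = 2048

def stack_bandwidth_gbs(pin_speed_gbps: float) -> float:
    """Peak bandwidth of one HBM stack in GB/s.

    Each pin transfers `pin_speed_gbps` gigabits per second;
    dividing the total bit rate by 8 converts bits to bytes.
    """
    return pin_speed_gbps * BUS_WIDTH_BITS / 8

print(f"JEDEC 8 Gbps baseline: {stack_bandwidth_gbs(8.0):.0f} GB/s")
print(f"Samsung 11 Gbps:       {stack_bandwidth_gbs(11.0):.0f} GB/s")
```

Under that assumption, the jump from 8Gbps to 11Gbps per pin raises peak per-stack bandwidth from roughly 2TB/s to about 2.8TB/s, a gain of nearly 40%.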

Source: TweakTown


A key competitive advantage lies in Samsung's internal foundry sourcing of the logic base die, which lets the company guarantee delivery timelines; SK Hynix and Micron, by contrast, plan to source their logic dies from TSMC [4]. Samsung plans to unveil its HBM4 memory at GTC 2026 in March, where the Vera Rubin AI lineup will be on display [2]. Customer shipments of Vera Rubin are expected to start in August, with Samsung's modules potentially featured as soon as June [4].

Market Implications and Competitive Landscape in AI Memory

Samsung trails SK Hynix and Micron at the forefront of AI memory, but investor hopes are rising that the company will join its rivals in supplying components for Nvidia's flagship processors [1]. The three leading memory chip manufacturers have gained roughly $900 billion in combined market value since early September, driven by an AI-fueled memory shortage across the broader electronics industry [1]. SK Hynix currently dominates the high-bandwidth memory market and has been Nvidia's primary supplier of HBM3 and HBM3E chips; it has completed supply negotiations with major customers and is expanding production capacity at its M15X fabrication plant in Cheongju [3].

Source: Wccftech


Samsung has raised prices on its major memory chips by as much as 60% since September 2025, though the company quickly denied rumors of an 80% price increase across its entire memory lineup [3]. Both Samsung and SK Hynix are scheduled to report fourth-quarter earnings, where they are expected to discuss progress on their HBM4 chips and outline production timelines [1]. The timing is critical: demand for HBM chips continues to surge alongside the rapid expansion of generative and agentic AI, and Nvidia consumes vast quantities of high-bandwidth memory in its AI accelerators [1][4].
