Samsung HBM4 nears Nvidia approval as rivalry with SK Hynix intensifies for Rubin AI platform

Reviewed by Nidhi Govil


Samsung is close to securing Nvidia certification for its HBM4 AI memory chips, entering the final qualification phase after supplying initial samples in September. The company is preparing for mass production in February 2026, aiming to debut alongside Nvidia's Rubin AI platform at GTC 2026. Meanwhile, SK Hynix reports significant progress in its own HBM4 tests, setting up intense competition between the South Korean rivals.

Samsung enters final HBM4 qualification phase with Nvidia

Samsung Electronics is nearing a critical milestone in its effort to reclaim leadership in AI memory, entering the final qualification phase with Nvidia for its HBM4 chips after supplying initial samples in September [1]. The Suwon, South Korea-based company is preparing for mass production in February 2026, with shipments expected to follow shortly after, though exact timing remains undisclosed [1]. Samsung's HBM4 operates at 11.7 Gb/s, significantly exceeding Nvidia's stated requirement of 10 Gb/s and surpassing the JEDEC standard of 8 Gb/s [4][5]. The high-bandwidth memory passed verification without requiring a redesign, even after customers requested performance enhancements, demonstrating the maturity of Samsung's design [4].
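
As a rough back-of-envelope illustration (not taken from the article), these data rates can be translated into per-stack bandwidth. The sketch below assumes the quoted speeds are per-pin rates and that each HBM4 stack exposes a 2048-bit interface, in line with the JEDEC HBM4 direction; actual product bandwidth depends on the final configuration.

```python
# Back-of-envelope per-stack bandwidth estimate (illustrative sketch, not from the article).
# Assumptions: the quoted speeds are per-pin data rates and each HBM4 stack
# exposes a 2048-bit interface.

PINS_PER_STACK = 2048  # assumed interface width in bits

def stack_bandwidth_tbps(pin_speed_gbps: float) -> float:
    """Convert a per-pin data rate (Gb/s) into per-stack bandwidth (TB/s)."""
    return pin_speed_gbps * PINS_PER_STACK / 8 / 1000  # bits -> bytes, GB -> TB

for label, speed in [("JEDEC baseline", 8.0),
                     ("Nvidia requirement", 10.0),
                     ("Samsung HBM4", 11.7)]:
    print(f"{label}: {stack_bandwidth_tbps(speed):.2f} TB/s per stack")

# Under these assumptions, Samsung's 11.7 Gb/s works out to roughly 3.0 TB/s per stack,
# versus about 2.56 TB/s at Nvidia's 10 Gb/s requirement and 2.05 TB/s at the JEDEC baseline.
```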

Source: TechRadar

Samsung shares gained as much as 3.2% in Seoul following reports of the progress, while SK Hynix stock declined by approximately the same amount [1]. The company has reportedly cleared final quality evaluations for HBM4 from both Nvidia and AMD, positioning itself to supply components for Nvidia's upcoming flagship Rubin processors [4].

HBM4 integration targets Rubin AI platform debut at GTC 2026

Samsung's HBM4 is already integrated into Nvidia's Rubin demonstration platforms, with units mass-produced in February set to reach Nvidia for performance demonstrations at the GTC 2026 conference in March [4][2]. The collaboration emphasizes system-level AI performance rather than optimizing individual components in isolation, with Samsung and Nvidia coordinating memory supply with chip production to reduce timing uncertainty [2].

The modules incorporate a logic base die produced on Samsung's 4nm process, giving the company greater control over manufacturing and delivery schedules than suppliers that rely on external foundries [2]. Within Rubin-based servers, HBM4 is paired with high-speed SSD storage to handle large datasets and limit data-movement bottlenecks, reflecting the industry's focus on memory bandwidth as a primary constraint for advanced AI accelerators [2]. Full-scale, high-volume supply of HBM4 is projected for around June, with early customer shipments expected from August [2][4].

SK Hynix makes significant progress in qualification tests

While Samsung advances toward certification, SK Hynix has also made significant progress in Nvidia's extensive HBM4 qualification tests for Rubin AI GPUs [3]. Industry sources report that SK Hynix achieved meaningful results in Nvidia's HBM4 System-in-Package testing earlier this month, after starting the customer-sample certification process in October 2025 [3]. When defects were found in some circuits, SK Hynix made modifications and delivered improved HBM4 chips to Nvidia, and these optimized products are now reported to be very close to mass-production readiness [3].

Source: TweakTown

The new HBM4 memory chips from SK Hynix achieve 10 Gbps under general conditions and 9-10 Gbps under Nvidia's rigorous tests for temperature, humidity, and impact [3]. An industry official stated that SK Hynix still enjoys Nvidia's trust, with Nvidia allocating a large volume of HBM4 supply to the company, and noted that SK Hynix likely met requirements in various evaluation categories beyond bandwidth alone [3]. During its Q4 2025 earnings call, SK Hynix said it aims for an overwhelming market share in HBM4, as it achieved with the earlier HBM3 and HBM3E generations [3].

Memory chip manufacturers compete amid AI infrastructure demand

Samsung trails SK Hynix and Micron Technology at the forefront of AI memory, but all three memory chip manufacturers have seen shares surge dramatically in recent weeks as the AI rush produces a shortage of memory for the broader electronics industry [1]. Between them, the three leading companies have gained roughly $900 billion in market value since the start of September [1]. Nvidia has leaned most heavily on SK Hynix for the most sophisticated memory chips it pairs with its top-of-the-line AI accelerators, making Samsung's potential certification a notable shift in the supply chain [1].

Source: TweakTown

Industry observers say Samsung's memory technology has stabilized with this HBM4 shipment, addressing the technological gaps with competitors that were evident during the HBM3 and HBM3E generations [4]. Samsung completely overhauled its HBM and semiconductor division in recent years, and the results are now materializing [5]. The company uses sixth-generation 10nm-class DRAM and a 4nm logic base die in its HBM4 design, and plans to continue delivering next-generation memory solutions including HBM, GDDR, and SOCAMM, as well as foundry services [5]. Because HBM4 is integrated directly into AI accelerators such as Nvidia's Rubin, its availability is tied to customers' schedules for next-generation chip mass production, with Samsung adjusting shipment volumes to align with those timelines and the specified quantities [4]. With high bandwidth and energy efficiency, Samsung's advanced HBM solutions are expected to help accelerate the development of future AI applications and form a critical foundation for the infrastructure driven by these technologies [5].
