Samsung gains ground in HBM4 race as Nvidia production ignites AI memory battle with SK Hynix, Micron

Samsung Electronics says customers praised its HBM4 chips, declaring 'Samsung is back' as the chipmaker fights to reclaim AI memory leadership. Meanwhile, Nvidia's Vera Rubin platform enters full production, triggering an aggressive capacity race among memory suppliers targeting next-generation high-bandwidth memory dominance.

Samsung Electronics Gains Customer Praise for HBM4 Competitiveness

Samsung Electronics has signaled a comeback in the AI memory market after customers praised the differentiated competitiveness of its next-generation high-bandwidth memory chips, known as HBM4. In a New Year address, co-CEO and chip chief Jun Young-hyun revealed that some customers have stated "Samsung is back," marking a critical confidence boost for the South Korean chipmaker [1]. The company's shares rose as much as 3.8% in morning trade following the announcement, outpacing the benchmark KOSPI's 1.1% rise [5]. Samsung confirmed in October that it was in close discussion to supply HBM4 to Nvidia, the U.S. artificial intelligence leader, as it scrambles to catch rivals including compatriot SK Hynix in the AI chip race [1].

Source: Reuters

Nvidia's Vera Rubin Production Ignites AI Memory Battle

When Nvidia CEO Jensen Huang confirmed at CES 2026 that the company's next-generation AI processor, Vera Rubin, had entered full production, the message to the memory industry was immediate [3]. The move effectively ignited a new competitive cycle in sixth-generation high-bandwidth memory (HBM4), as memory suppliers race to lock in design wins for Nvidia's post-Blackwell platforms. Samsung, SK Hynix, and Micron are all preparing HBM4 for Vera Rubin, with all three suppliers' products currently under evaluation by Nvidia as supply schedules and deployment timelines are coordinated [3]. Market expectations point to February 2026 as the start of large-scale HBM4 co-supply for Nvidia platforms, marking a critical inflection point in the fight for AI memory dominance.

Micron Targets 30% HBM4 Capacity Share in Aggressive Push

Micron Technology is emerging as the most aggressive mover in the HBM4 race, with South Korean industry sources indicating the U.S. memory maker plans to lift HBM4 capacity to 15,000 wafers per month in 2026 [3]. Based on Korean securities estimates that Micron's total HBM output is about 55,000 wafers per month, HBM4 would represent roughly 30% of overall capacity, signaling a decisive shift toward next-generation products. During its December 17, 2025, earnings call, Micron CEO Sanjay Mehrotra said the company would begin ramping HBM4 output in the second quarter of 2026, adding that yield improvement is progressing faster than it did for HBM3E [3]. Crucially, Micron has indicated that near-term HBM supply is already fully contracted, with the company finalizing price and volume agreements for upcoming HBM shipments, including early HBM4. Micron's HBM4 runs at pin speeds above 11 gigabits per second (Gbps), outperforming baseline JEDEC specifications and Nvidia's operating targets while stabilizing yields faster than the previous generation [3].
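The "roughly 30%" figure can be checked directly from the reported wafer counts; this quick sketch uses only the estimates cited above (15,000 HBM4 wafers out of about 55,000 total per month):

```python
# Back-of-the-envelope check of Micron's reported HBM4 capacity share.
# Both wafer figures are the industry estimates cited above, not official data.
hbm4_wafers_per_month = 15_000        # planned HBM4 capacity in 2026
total_hbm_wafers_per_month = 55_000   # estimated total HBM output

share = hbm4_wafers_per_month / total_hbm_wafers_per_month
print(f"HBM4 share of total HBM capacity: {share:.1%}")  # ≈ 27.3%
```

The exact ratio comes to about 27%, consistent with the "roughly 30%" characterization attributed to industry sources.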

Source: DIGITIMES

SK Hynix Showcases 48GB HBM4 with Industry-Leading Bandwidth

SK Hynix, which currently dominates the AI memory market, unveiled advanced AI-focused memory solutions at CES 2026, including an ultra-fast 48GB 16-Hi HBM4 with 2TB/sec bandwidth [4]. The company demonstrated its 16-Hi HBM4 48GB modules running at the industry's fastest speed of 11.7Gbps, destined for Nvidia's Vera Rubin AI platform. Data from Counterpoint Research shows that in the third quarter of 2025, SK Hynix held a commanding 53 percent share of the HBM market, followed by Samsung at 35 percent and Micron at 11 percent [2]. However, SK Hynix CEO Kwak Noh-Jung warned employees that 2026's environment would be "tougher than last year," noting that AI growth is now a base assumption rather than a pleasant surprise, and that continued "bolder investment and effort" will be required [2]. SK Hynix also showcased new SOCAMM2 memory modules specialized for AI servers and LPDDR6 memory optimized for on-device AI, promising substantial gains in data processing speed and power efficiency [4].
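The 48GB module capacity is consistent with a 16-high stack of 24-gigabit DRAM dies (the 24Gbit die density is an assumption on my part, not a figure from the article):

```python
# Sanity check of the 48GB 16-Hi HBM4 capacity figure.
# The per-die density below is an assumption, not stated in the article.
stack_height = 16        # "16-Hi" = 16 stacked DRAM dies
die_capacity_gbit = 24   # assumed density per die, in gigabits

capacity_gb = stack_height * die_capacity_gbit / 8  # gigabits -> gigabytes
print(f"Stack capacity: {capacity_gb:.0f} GB")  # 48 GB, matching the report
```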

Source: Korea Times

Foundry Business and Supply Chain Challenges Ahead

Beyond memory, Samsung's foundry business is also gaining momentum. Jun Young-hyun said recent supply deals with major global customers had left the foundry business "primed for a great leap forward" [5]. In July, Samsung Electronics signed a $16.5 billion chip supply deal with Tesla [2]. However, Samsung co-CEO TM Roh, who heads the company's device experience division overseeing the mobile phone, TV, and home appliance businesses, warned that 2026 was expected to bring greater uncertainty and risk, citing rising component prices and global tariff barriers [5]. Roh emphasized the need for proactive supply chain diversification and optimization of global operations to address issues such as component sourcing and pricing.

What This Means for the AI Chip Market

The HBM4 race matters because high-bandwidth memory has become a critical bottleneck in AI system performance. As training and inference workloads scale, faster data transfer rates and improved power efficiency are essential for next-generation AI platforms. For Samsung, regaining customer confidence after lagging behind SK Hynix is a strategic imperative: losing ground in AI memory could have long-term implications for the chipmaker's position in the broader semiconductor industry. For Micron, the aggressive capacity expansion represents a strategic reversal, pairing scale with its long-standing strength in low-power memory design [3]. Whether Micron's mix of low-power design, early customer lock-in, and aggressive scale-up can truly challenge Samsung and SK Hynix will become clearer as HBM4 volumes ramp through 2026. What is already clear is that in the HBM4 era, capacity is no longer a background variable; it is a frontline competitive weapon [3].
