Next-Gen HBM Memory Race Heats Up: SK Hynix and Micron Prepare for HBM3E and HBM4 Production


SK Hynix and Micron are gearing up for the production of next-generation High Bandwidth Memory (HBM) technologies, with SK Hynix focusing on HBM3E for 2025 and Micron targeting HBM4 for 2026, driven by increasing demand in AI GPU components.


SK Hynix Prepares for HBM3E Mass Production

SK Hynix, a leading player in the High Bandwidth Memory (HBM) market, is ramping up preparations for the mass production of its 16-Hi HBM3E memory. The move comes shortly after the company unveiled the world's first 48GB version of the technology at the SK AI Summit in Seoul [1].

According to industry sources, SK Hynix is currently integrating new equipment and optimizing existing facilities for 16-Hi HBM3E production. Production tests are already underway, with supply expected to begin in the first half of 2025. An industry insider reported, "It is the initial line preparation process for 16-layer HBM3E mass production. I understand that the results of the major process tests are also coming out well" [1].

Micron's HBM4 and HBM4E Development

While SK Hynix focuses on HBM3E, Micron is making strides in the development of next-generation HBM4 and HBM4E technologies. Sanjay Mehrotra, President and CEO of Micron, stated, "Leveraging the strong foundation and continued investments in proven 1β process technology, we expect Micron's HBM4 will maintain time to market and power efficiency leadership while boosting performance by over 50% over HBM3E" [2].

Micron anticipates that HBM4 will ramp in high volume across the industry in 2026. The company is also working on HBM4E, which it says promises a paradigm shift in the memory business: HBM4E will offer customization capabilities, which Micron expects to improve its financial performance [3].

Technological Advancements and Industry Impact

HBM4 represents a significant leap in memory technology. It is expected to stack up to 16 DRAM dies, each with a capacity of 32 GB, behind a 2048-bit interface, twice the 1024-bit width used by earlier HBM generations [3].
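A quick back-of-the-envelope sketch of what those figures imply, using the die count, per-die capacity, and interface width reported above. The per-pin data rate is an illustrative assumption (8 Gb/s per pin, in line with HBM3E-class signaling), not a confirmed HBM4 specification:

```python
# Rough per-stack figures for HBM4 as described above:
# 16 dies x 32 GB per die, 2048-bit interface.
# The 8 Gb/s-per-pin rate is an assumption for illustration only.

def hbm_stack_capacity_gb(dies: int, gb_per_die: int) -> int:
    """Total capacity of one HBM stack in GB."""
    return dies * gb_per_die

def hbm_peak_bandwidth_gbps(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak theoretical bandwidth in GB/s: bus width x per-pin rate, divided by 8 bits/byte."""
    return bus_width_bits * pin_rate_gbps / 8

capacity = hbm_stack_capacity_gb(dies=16, gb_per_die=32)        # 512 GB per stack
bandwidth = hbm_peak_bandwidth_gbps(2048, pin_rate_gbps=8.0)    # 2048 GB/s at the assumed rate

print(f"{capacity} GB per stack, ~{bandwidth:.0f} GB/s peak")
```

Doubling the interface width is what makes the bandwidth jump possible without requiring a proportional increase in per-pin signaling speed.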

The development of HBM4 also involves integrating memory and logic semiconductors into a single package, eliminating the need for separate packaging steps. This approach is expected to yield significant performance improvements. Micron plans to use TSMC as its logic semiconductor supplier for this purpose [3].

Market Demand and Future Outlook

Demand for HBM is at an all-time high, driven primarily by the AI industry's appetite for high-performance memory. NVIDIA, a major player in AI GPUs, is reportedly the largest consumer of HBM chips, and its CEO, Jensen Huang, has even asked SK Hynix to accelerate the development of HBM4 memory by six months [1].

Looking ahead, HBM4 is expected to feature in NVIDIA's Rubin AI architecture and AMD's Instinct MI400 lineup, ensuring widespread market adoption. With production lines booked through 2025 and beyond, the future of HBM technology appears promising for manufacturers and consumers alike [3].

TheOutpost.ai


© 2025 Triveous Technologies Private Limited