Next-Gen HBM Memory Race Heats Up: SK Hynix and Micron Prepare for HBM3E and HBM4 Production

3 Sources

SK Hynix and Micron are gearing up to produce next-generation High Bandwidth Memory (HBM), with SK Hynix focusing on HBM3E for 2025 and Micron targeting HBM4 for 2026, driven by surging demand from AI GPUs.


SK Hynix Prepares for HBM3E Mass Production

SK Hynix, a leading player in the High Bandwidth Memory (HBM) market, is ramping up preparations for the mass production of its 16-Hi HBM3E memory. This move comes shortly after the company unveiled the world's first 48GB version of the technology at the SK AI Summit in Seoul [1].

According to industry sources, SK Hynix is currently integrating new equipment and optimizing existing facilities for 16-Hi HBM3E production. Production tests are already underway, with supply expected to begin in the first half of 2025. An industry insider reported, "It is the initial line preparation process for 16-layer HBM3E mass production. I understand that the results of the major process tests are also coming out well" [1].

Micron's HBM4 and HBM4E Development

While SK Hynix focuses on HBM3E, Micron is making strides in the development of next-generation HBM4 and HBM4E technologies. Sanjay Mehrotra, President and CEO of Micron, stated, "Leveraging the strong foundation and continued investments in proven 1β process technology, we expect Micron's HBM4 will maintain time to market and power efficiency leadership while boosting performance by over 50% over HBM3E" [2].

Micron anticipates that HBM4 will ramp in high volume across the industry in 2026. The company is also developing HBM4E, which it says will introduce a paradigm shift in the memory business: customization capabilities that Micron expects to translate into improved financial performance [3].

Technological Advancements and Industry Impact

HBM4 represents a significant leap forward in memory technology. It is expected to stack up to 16 DRAM dies, each with a capacity of 32 Gb (for up to 64 GB per stack), behind a 2048-bit interface, double the 1024-bit width used from HBM1 through HBM3E [3].
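The per-stack figures follow directly from those numbers. As a back-of-envelope sketch (the per-pin data rate below is an illustrative assumption, not a figure from this article; HBM die capacities are conventionally quoted in gigabits):

```python
# Back-of-envelope HBM4 stack arithmetic from the figures above.
DIES_PER_STACK = 16          # up to 16 DRAM dies per stack
DIE_CAPACITY_GBIT = 32       # 32 Gb per die (gigabits)
INTERFACE_WIDTH_BITS = 2048  # 2048-bit wide interface
ASSUMED_PIN_RATE_GBPS = 8    # Gb/s per pin -- assumption for illustration only

# Capacity: 16 dies x 32 Gb = 512 Gb = 64 GB per stack.
capacity_gb = DIES_PER_STACK * DIE_CAPACITY_GBIT / 8

# Peak bandwidth: width x per-pin rate, converted from Gb/s to TB/s.
bandwidth_tbs = INTERFACE_WIDTH_BITS * ASSUMED_PIN_RATE_GBPS / 8 / 1000

print(f"Per-stack capacity: {capacity_gb:.0f} GB")         # 64 GB
print(f"Peak bandwidth (assumed pin rate): {bandwidth_tbs:.1f} TB/s")
```

At the assumed 8 Gb/s pin rate this works out to roughly 2 TB/s per stack; the doubled interface width means HBM4 can reach a given bandwidth at half the per-pin speed of HBM3E, easing signaling and power constraints.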

The development of HBM4 also involves integrating memory and logic semiconductors into a single package, eliminating the need for separate packaging technology. This approach is expected to yield significant performance improvements. Micron plans to use TSMC as its logic semiconductor supplier for this purpose [3].

Market Demand and Future Outlook

Demand for HBM is at an all-time high, driven primarily by the AI industry's appetite for high-performance memory. NVIDIA, a major player in AI GPUs, is reportedly the largest consumer of HBM chips; its CEO, Jensen Huang, has even asked SK Hynix to accelerate HBM4 development by six months [1].

Looking ahead, HBM4 is expected to be featured in NVIDIA's Rubin AI architecture and AMD's Instinct MI400 lineup, ensuring widespread market adoption. With production lines booked through 2025 and beyond, the future of HBM technology appears promising for manufacturers and consumers alike [3].
