SK hynix Leads the Charge in Next-Gen AI Memory with World's First 12-Layer HBM4 Samples

SK hynix has begun sampling its groundbreaking 12-layer HBM4 memory, offering unprecedented capacity and bandwidth for AI acceleration. This development marks a significant leap in memory technology for AI applications.

SK hynix Unveils Groundbreaking 12-Layer HBM4 Memory

In a significant leap forward for AI memory technology, SK hynix has announced the successful sampling of the world's first 12-layer High Bandwidth Memory 4 (HBM4) to major customers [1]. This development marks a crucial milestone in the ongoing race to enhance AI acceleration capabilities, with SK hynix solidifying its position as a leader in the AI memory market.

Unprecedented Performance and Capacity

The new 12-layer HBM4 samples boast impressive specifications that set them apart in the industry:

  • Bandwidth: Processes over 2 terabytes (TB) of data per second, equivalent to more than 400 full-HD movies in a single second [2].
  • Speed: More than 60% faster than the previous-generation HBM3E [3].
  • Capacity: An industry-leading 36GB, the highest among 12-layer HBM products [4].
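
The "400 full-HD movies per second" comparison can be sanity-checked with quick arithmetic, assuming roughly 5 GB per full-HD movie (an assumed figure, not stated in the source):

```python
# Sanity check of the bandwidth-to-movies comparison.
# Assumption: a full-HD movie is ~5 GB (illustrative value, not from the article).
bandwidth_tb_per_s = 2          # HBM4 bandwidth: over 2 TB of data per second
gb_per_tb = 1000                # decimal terabytes, as marketing figures typically use
movie_size_gb = 5               # assumed full-HD movie size

movies_per_second = bandwidth_tb_per_s * gb_per_tb / movie_size_gb
print(f"{movies_per_second:.0f} movies per second")  # → 400 movies per second
```

With these assumptions the claim checks out: 2,000 GB/s divided by 5 GB per movie yields 400 movies per second, matching the "more than 400" figure for bandwidth above 2 TB/s.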

Advanced Manufacturing Process

SK hynix has implemented its Advanced MR-MUF (Mass Reflow-Molded Underfill) process in the production of HBM4. This innovative technique:

  • Prevents chip warpage
  • Maximizes product stability
  • Improves heat dissipation [2]

Accelerated Timeline and Market Impact

The company has expedited its HBM4 development and production schedule:

  • Samples delivered ahead of the initial plan
  • Mass-production preparations targeted for the second half of 2025
  • Original mass-production timeline of 2026 moved up by approximately six months [1]

Industry Collaboration and Future Prospects

SK hynix's HBM4 development is closely tied to the needs of major players in the AI industry:

  • NVIDIA is expected to use HBM4 in its upcoming Rubin series of GPUs [3].
  • SK hynix aims to strengthen its position in the next-generation AI memory market [2].

Broader Industry Trends

The development of HBM4 is part of a larger trend in the memory industry:

  • Other major players such as Samsung and Micron are also developing HBM4 solutions [5].
  • Future iterations such as HBM4e are already being planned, with potential for even higher capacities and speeds [5].

As AI workloads continue to demand more from memory systems, the race to develop faster and higher-capacity solutions like HBM4 is likely to intensify, driving innovation in the semiconductor industry and enabling more powerful AI applications in the coming years.

TheOutpost.ai


© 2025 Triveous Technologies Private Limited