SK Hynix Unveils World's First HBM4 Technology, Pushing Boundaries in AI Memory Solutions

SK Hynix showcases groundbreaking HBM4 memory technology at TSMC's North American Technology Symposium, featuring 16-Hi stacks and 2TB/sec bandwidth, positioning itself as a leader in AI memory solutions.

SK Hynix Leads the Charge in HBM4 Technology

SK Hynix, a South Korean memory leader, has unveiled the world's first HBM4 (High Bandwidth Memory) technology at TSMC's North American Technology Symposium, showcasing significant advancements in memory solutions for AI and data centers [1][2]. This breakthrough positions SK Hynix at the forefront of AI memory innovation, potentially reshaping the landscape of high-performance computing.

HBM4 Specifications and Performance

The new HBM4 technology boasts impressive specifications:

  • Capacity: Up to 48GB
  • Bandwidth: 2TB/sec per stack
  • I/O Speed: 8Gbps
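
These headline figures can be cross-checked with simple arithmetic. The sketch below assumes a 2048-bit per-stack interface width (in line with the JEDEC HBM4 direction of doubling HBM3's 1024 bits; the article does not state it); the 8Gbps I/O speed, 48GB capacity, and 16-Hi stack height come from the article.

```python
# Sanity-check of the HBM4 figures quoted above.
# ASSUMPTION: 2048-bit interface width per stack (not stated in the article).

io_speed_gbps = 8        # per-pin data rate from the article (Gb/s)
interface_bits = 2048    # assumed interface width per stack (bits)

# Total bandwidth per stack: width * per-pin rate, converted from bits to bytes.
bandwidth_gb_s = io_speed_gbps * interface_bits / 8
print(f"{bandwidth_gb_s} GB/s per stack")  # 2048.0 GB/s, i.e. ~2 TB/sec

# A 48 GB, 16-Hi stack implies the per-die capacity:
stack_capacity_gb = 48
stack_height = 16
die_density_gb = stack_capacity_gb / stack_height
print(f"{die_density_gb} GB ({die_density_gb * 8:.0f} Gb) per DRAM die")
```

Under that interface-width assumption, the 2TB/sec figure falls out exactly, and the 48GB capacity implies 3GB (24Gb) DRAM dies in the 16-Hi stack.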

SK Hynix demonstrated both 12-Hi and 16-Hi stacks, with the latter representing the pinnacle of current memory stacking technology [1]. The company has achieved this high layer count through the use of Advanced MR-MUF and TSV (Through-Silicon Via) technologies [2].

Production Timeline and Market Impact

SK Hynix has announced plans for mass production of HBM4 in the second half of 2025, with integration into AI GPUs expected by the end of the year [1]. This aggressive timeline suggests that HBM4-powered AI GPUs could reach the market in volume by 2026, potentially transforming AI computing capabilities.

NVIDIA Partnership and Future Applications

The new HBM4 memory chips are slated for NVIDIA's next-generation AI GPUs [1][2]. NVIDIA is planning a full transition to HBM4 memory with its upcoming Vera Rubin architecture, highlighting the critical role of SK Hynix's technology in future AI hardware developments.

Additional Memory Solutions

Alongside HBM4, SK Hynix showcased other memory products at the symposium:

  1. HBM3E: A 16-layer implementation with 1.2TB/s bandwidth, targeted at NVIDIA's GB300 "Blackwell Ultra" AI clusters [2].

  2. Server Memory Modules: New RDIMM and MRDIMM products built on the latest 1c DRAM process node, achieving speeds up to 12,500MB/sec [1][2].

  3. AI and Data Center Modules: A range of modules designed to enhance performance while reducing power consumption [1][2], including:

    • MRDIMM: 12.8Gbps speed, capacities of 64GB, 96GB, and 256GB
    • RDIMM: 8Gbps speed, capacities of 64GB and 96GB
    • 3DS RDIMM: 256GB capacity

Industry Implications

SK Hynix's advancements in HBM4 and other memory technologies demonstrate its growing dominance in the memory market. The company appears to be outpacing competitors like Micron and Samsung, particularly in HBM4 development [2]. This leadership position could have significant implications for the AI and high-performance computing industries, potentially influencing future hardware designs and capabilities.

TheOutpost.ai


© 2025 Triveous Technologies Private Limited