Micron Pioneers HBM4 Memory: A Leap Forward for AI and High-Performance Computing

Micron has begun shipping samples of its next-generation HBM4 memory to key customers, offering 36 GB capacity and 2 TB/s bandwidth, setting new standards for AI and HPC applications.

Micron's HBM4: A New Frontier in Memory Technology

Micron Technology has announced that it has begun shipping samples of its next-generation High Bandwidth Memory 4 (HBM4) to key customers. The development marks a substantial step forward for artificial intelligence (AI) and high-performance computing (HPC) 1.

Technical Specifications and Advancements

Source: Guru3D.com

The new HBM4 memory modules boast impressive specifications:

  • 36 GB capacity per module
  • 12-High device configuration
  • 2048-bit wide interface
  • Data transfer rate of approximately 7.85 GT/s
  • Peak bandwidth of up to 2 TB/s

These specifications represent a significant improvement over the current-generation HBM3E memory, offering over 60% higher bandwidth and up to 20% better energy efficiency 2.
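
As a sanity check, the 2 TB/s figure follows directly from the interface width and transfer rate listed above. The short sketch below reproduces it; the HBM3E baseline used for the "over 60%" comparison (a 1024-bit interface at 9.6 GT/s, roughly 1.2 TB/s per stack) is an assumption typical of current parts, not a number from the article.

```python
# Back-of-the-envelope check of the quoted HBM4 figures.
hbm4_bus_bits = 2048        # interface width per stack (bits)
hbm4_rate_gtps = 7.85       # data transfer rate (GT/s)

hbm4_bw_gbs = hbm4_bus_bits * hbm4_rate_gtps / 8      # bytes per second, in GB/s
print(f"HBM4 peak bandwidth: {hbm4_bw_gbs / 1000:.2f} TB/s")        # ~2.01 TB/s

# Assumed HBM3E reference point (not from the article): 1024 bits at 9.6 GT/s.
hbm3e_bw_gbs = 1024 * 9.6 / 8                          # ~1229 GB/s
print(f"Uplift over HBM3E: {hbm4_bw_gbs / hbm3e_bw_gbs - 1:.0%}")   # ~64%
```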

Manufacturing Process and Integration

Micron's HBM4 DRAM dies are built on the company's 1β (1-beta) process technology, which has been in production since late 2022. The logic base dies are produced by TSMC using its 12FFC+ (12nm-class) or N5 (5nm-class) logic process technology 3.

A notable feature of HBM4 is its built-in memory self-test capability, which simplifies bring-up and validation for partners. Coupled with Micron's advanced packaging technology, this is intended to ease integration for customers developing next-generation AI platforms 4.

Impact on AI and HPC

The wider interface and high-throughput design of HBM4 are expected to accelerate the inference performance of large language models and chain-of-thought reasoning systems, where generating each output token requires streaming model weights out of memory. That advantage matters for the growing field of generative AI, where serving inference efficiently is becoming increasingly important 5.
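
To make the link between bandwidth and inference concrete, here is a hedged, first-order estimate of memory-bound decode throughput. The accelerator and model figures below (eight HBM4 stacks per package, a roughly 70B-parameter model at FP16) are illustrative assumptions, not specifications from Micron or this article.

```python
# First-order, memory-bound estimate of LLM decode throughput.
# All accelerator/model numbers are illustrative assumptions, not
# figures from Micron or this article.
stack_bw_tbs = 2.0          # HBM4 peak bandwidth per stack (TB/s)
num_stacks = 8              # assumed stacks per accelerator package
model_size_gb = 140         # assumed ~70B-parameter model at FP16

total_bw_gbs = stack_bw_tbs * 1000 * num_stacks        # aggregate GB/s
# Each generated token streams the full weight set from memory once,
# so bandwidth / model size bounds single-stream decode throughput.
tokens_per_s = total_bw_gbs / model_size_gb
print(f"Upper-bound decode rate: ~{tokens_per_s:.0f} tokens/s per sequence")
```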

Industry Adoption and Future Outlook

Source: Benzinga

While Micron is the first to officially start sampling HBM4 memory modules, other manufacturers like Samsung and SK hynix are expected to follow suit. The industry anticipates volume production of HBM4 to commence in 2026, aligning with the production schedules of next-generation AI processors 1.

Nvidia's datacenter GPUs codenamed Vera Rubin are expected to be among the first products to adopt HBM4 in late 2026. AMD is also likely to integrate HBM4 into its next-generation Instinct MI400 series 2.

Challenges and Future Developments

As Micron moves towards large-scale production, the company will face challenges related to thermal management and proving real-world performance. The taller 12-high stacks and higher bandwidth generate more heat, making thermal performance a critical factor 2.

Micron's development of HBM4 is part of its broader strategy to accelerate AI innovation. The company plans to introduce an EUV-based 1γ (1-gamma) process for DDR5 memory later this year, further expanding its portfolio of AI memory and storage solutions 4.
