Micron Pioneers HBM4 Memory: A Leap Forward for AI and High-Performance Computing

Micron has begun shipping samples of its next-generation HBM4 memory to key customers, offering 36 GB capacity and 2 TB/s bandwidth, setting new standards for AI and HPC applications.

Micron's HBM4: A New Frontier in Memory Technology

Micron Technology has announced a significant milestone in memory technology: the company has begun shipping samples of its next-generation High Bandwidth Memory 4 (HBM4) to key customers. This development marks a substantial leap forward for artificial intelligence (AI) and high-performance computing (HPC) [1].

Technical Specifications and Advancements

Source: Guru3D.com

The new HBM4 memory modules boast impressive specifications:

  • 36 GB capacity per module
  • 12-High device configuration
  • 2048-bit wide interface
  • Data transfer rate of approximately 7.85 GT/s
  • Peak bandwidth of up to 2 TB/s
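
The headline bandwidth follows directly from the interface width and transfer rate. A quick back-of-the-envelope check (using only the figures quoted above):

```python
# Sanity-check the stated HBM4 peak bandwidth from the other two specs.
interface_bits = 2048       # stated interface width
transfer_rate_gtps = 7.85   # stated data rate, giga-transfers per second (approx.)

bytes_per_transfer = interface_bits / 8                    # 256 bytes moved per transfer
bandwidth_gbps = bytes_per_transfer * transfer_rate_gtps   # GB/s
print(f"Peak bandwidth ≈ {bandwidth_gbps / 1000:.2f} TB/s")  # ≈ 2.01 TB/s
```

The result lands right at the quoted "up to 2 TB/s" figure, which suggests the ~7.85 GT/s rate was derived from that bandwidth target.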

These specifications represent a significant improvement over the current-generation HBM3E memory, offering over 60% higher bandwidth and up to 20% better energy efficiency [2].
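
The "over 60%" claim checks out against HBM3E's per-stack bandwidth. As a rough sketch (assuming a reference figure of roughly 1.2 TB/s for HBM3E, which is not stated in this article):

```python
# Assumed reference point: current HBM3E stacks deliver roughly 1.2 TB/s.
hbm3e_tbps = 1.2   # assumption, per stack
hbm4_tbps = 2.0    # stated HBM4 peak, per stack

improvement_pct = (hbm4_tbps / hbm3e_tbps - 1) * 100
print(f"Bandwidth uplift ≈ {improvement_pct:.0f}%")
```

Under that assumption the uplift is about 67%, consistent with "over 60% higher bandwidth."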

Manufacturing Process and Integration

Micron's HBM4 is built on the company's 1β (1-beta) DRAM process technology, which has been in use since 2022. The logic base dies are produced by TSMC using its 12FFC+ (12nm-class) or N5 (5nm-class) logic process technology [3].

A notable feature of the HBM4 is its built-in memory test capability, which simplifies integration for partners. This, coupled with Micron's advanced packaging technology, ensures seamless integration for customers developing next-generation AI platforms [4].

Impact on AI and HPC

The expanded interface and high-throughput design of HBM4 are expected to accelerate inference for large language models and chain-of-thought reasoning systems. This is crucial for generative AI, where managing inference performance and cost efficiently is becoming increasingly important [5].

Industry Adoption and Future Outlook

Source: Benzinga

While Micron is the first to officially start sampling HBM4 memory modules, other manufacturers such as Samsung and SK hynix are expected to follow suit. The industry anticipates volume production of HBM4 to commence in 2026, aligning with the production schedules of next-generation AI processors [1].

Nvidia's datacenter GPUs, codenamed Vera Rubin, are expected to be among the first products to adopt HBM4 in late 2026. AMD is also likely to integrate HBM4 into its next-generation Instinct MI400 series [2].

Challenges and Future Developments

As Micron moves towards large-scale production, the company will face challenges related to thermal management and proving real-world performance. The increased stack count and bandwidth can generate more heat, making thermal performance a critical factor [2].

Micron's development of HBM4 is part of its broader strategy to accelerate AI innovation. The company plans to introduce an EUV-enhanced 1γ (1-gamma) process for DDR5 memory later this year, further expanding its portfolio of AI memory and storage solutions [4].
