5 Sources
[1]
HBM4 Memory by Micron: 36GB Capacity and 2TB/s Bandwidth Samples Shipping Soon
Micron is preparing to ship early samples of its new HBM4 memory modules, which stack 12 DRAM dies to offer 36 GB of capacity per module. These engineering samples will be sent to select partners soon, with mass production expected to start in early 2026. The HBM4 modules are built on Micron's 1β process, a DRAM manufacturing technology in use since 2022. Later this year, Micron plans to introduce an EUV-enhanced 1γ process for DDR5 memory, but HBM4 currently relies on the established 1β node.

One of the main improvements in HBM4 is the wider memory interface. The interface width doubles from 1,024 bits per stack in previous generations to 2,048 bits, allowing each module to deliver sustained bandwidth of 2 terabytes per second. Micron also claims more than a 20% improvement in power efficiency over the current HBM3E standard. The increased number of stacked dies combined with the wider interface makes data movement more efficient, which is especially important for multi-chip configurations and setups requiring memory-coherent interconnects.

NVIDIA and AMD are expected to be the first companies to adopt HBM4 memory in their products. NVIDIA plans to use the modules in its Rubin-Vera AI accelerators, which are set to launch in the second half of 2026, while AMD will likely integrate HBM4 into its next-generation Instinct MI400 series, with more details anticipated at its Advancing AI 2025 conference.

The added memory capacity and bandwidth of HBM4 address the increasing performance needs of generative AI, high-performance computing, and other data-heavy applications. As Micron moves toward large-scale production, the company will face challenges around thermal management and proving real-world performance: higher stack counts and increased bandwidth generate more heat, and real-world benchmarks will be essential to demonstrate how effectively HBM4 supports demanding AI workloads and HPC tasks.
[2]
Micron begins shipping HBM4 memory to key partners for next-gen AI systems
Micron has announced that it has started shipping its new HBM4 36GB 12-Hi memory to "multiple key customers," with the first likely being NVIDIA and its next-gen Rubin R100 AI GPU. In a press release, the US-based memory maker said it is now shipping HBM4 36GB 12-Hi memory to select customers, built on its well-established 1β DRAM process node, 12-high advanced packaging technology, and a highly capable memory built-in self-test (MBIST) feature. Micron says its new HBM4 provides seamless integration for customers and partners developing next-generation AI platforms. Micron's new HBM4 has a 2048-bit memory interface that pushes more than 2.0 TB/sec per memory stack, over 60% more performance than the previous generation. These speed gains are aimed squarely at AI, which is where HBM4 will live: alongside AI GPUs inside servers. Micron says its new HBM4 memory is over 20% more power efficient than its HBM3E memory, an improvement that provides maximum throughput at the lowest power consumption to maximize data center efficiency. Raj Narasimhan, senior vice president and general manager of Micron's Cloud Memory Business Unit, explains: "Micron HBM4's performance, higher bandwidth and industry-leading power efficiency are a testament to our memory technology and product leadership. Building on the remarkable milestones achieved with our HBM3E deployment, we continue to drive innovation with HBM4 and our robust portfolio of AI memory and storage solutions. Our HBM4 production milestones are aligned with our customers' next-generation AI platform readiness to ensure seamless integration and volume ramp".
[3]
Micron Ships Advanced HBM4 Memory To Fuel Next-Generation AI - Micron Technology (NASDAQ:MU)
Micron Technology (NASDAQ: MU) on Tuesday announced the shipment of HBM4 36GB 12-high samples to multiple key customers. Built on its 1β (1-beta) DRAM process, 12-high advanced packaging technology, and memory built-in self-test (MBIST) feature, Micron HBM4 provides seamless integration for customers and partners developing next-generation AI platforms. Micron HBM4 features a 2048-bit interface, achieving speeds greater than 2.0 TB/s per memory stack and more than 60% better performance than the previous generation. This expanded interface facilitates rapid communication and a high-throughput design that accelerates the inference performance of large language models and chain-of-thought reasoning systems. Additionally, Micron HBM4 features over 20% better power efficiency compared to Micron's previous-generation HBM3E products to maximize data center efficiency. HBM4 is a crucial enabler, driving quicker insights and discoveries to foster innovation in diverse fields such as healthcare, finance, and transportation, the company said in a press release. Micron plans to ramp HBM4 in 2026, aligned with the ramp of customers' next-generation AI platforms. Last week, Micron announced that it had started shipping samples of its new 1-gamma LPDDR5X memory, built for AI smartphone use. Micron Technology stock has been down by over 15% in the last 12 months, as the Trump administration's tariff policies have affected the company. In May, CNBC's Jim Cramer said Micron Technology stock was overpriced, making it unsuitable for investment. Price Action: MU stock is trading higher by 2.10% to $113.25 at last check Tuesday.
[4]
Micron Ships Next-Gen HBM4 Memory To Key Customers: Offering 36 GB Capacity & Over 2 TB/s Bandwidth
Micron is sampling next-gen HBM4 memory to key customers, bringing high-performance speeds and massive capacities to AI platforms.

Micron HBM4 Memory Sampling Commences To Key Customers: 12-Hi Solution First With 36 GB Capacity & Over 2 TB/s Speeds

Press Release: Micron Technology announced the shipment of HBM4 36GB 12-high samples to multiple key customers today. This milestone extends Micron's leadership in memory performance and power efficiency for AI applications. Built on its well-established 1β (1-beta) DRAM process, proven 12-high advanced packaging technology, and highly capable memory built-in self-test (MBIST) feature, Micron HBM4 provides seamless integration for customers and partners developing next-generation AI platforms.

A leap forward

As the use of generative AI continues to grow, the ability to effectively manage inference becomes more important. Micron HBM4 features a 2048-bit interface, achieving speeds of greater than 2.0 TB/s per memory stack and more than 60% better performance over the previous generation. This expanded interface facilitates rapid communication and a high-throughput design that accelerates the inference performance of large language models and chain-of-thought reasoning systems. Simply put, HBM4 will help AI accelerators respond faster and reason more effectively. Additionally, Micron HBM4 features over 20% better power efficiency compared to Micron's previous-generation HBM3E products, which first established new, unrivaled benchmarks in HBM power efficiency in the industry. This improvement provides maximum throughput with the lowest power consumption to maximize data center efficiency. Generative AI use cases continue to multiply, and this transformative technology is poised to deliver significant benefits to society. HBM4 is a crucial enabler, driving quicker insights and discoveries that will foster innovation in diverse fields such as healthcare, finance, and transportation.

Intelligence Accelerated: Micron's role in the AI revolution

For nearly five decades, Micron has pushed the boundaries of memory and storage innovation. Today, Micron continues to accelerate AI by delivering a broad portfolio of solutions that turn data into intelligence, fueling breakthroughs from the data center to the edge. With HBM4, Micron reinforces its position as a critical catalyst for AI innovation and a reliable partner for our customers' most demanding solutions. Micron plans to ramp HBM4 in calendar year 2026, aligned to the ramp of customers' next-generation AI platforms.
[5]
Micron Ships HBM4 to Key Customers to Power Next-Gen AI Platforms
Micron HBM4 36GB 12-high products lead the industry in power efficiency for data center and cloud AI acceleration

BOISE, Idaho, June 10, 2025 (GLOBE NEWSWIRE) -- The importance of high-performance memory has never been greater, fueled by its crucial role in supporting the growing demands of AI training and inference workloads in data centers. Micron Technology, Inc. (Nasdaq: MU), today announced the shipment of HBM4 36GB 12-high samples to multiple key customers. This milestone extends Micron's leadership in memory performance and power efficiency for AI applications. Built on its well-established 1β (1-beta) DRAM process, proven 12-high advanced packaging technology and highly capable memory built-in self-test (MBIST) feature, Micron HBM4 provides seamless integration for customers and partners developing next-generation AI platforms.

A leap forward

As use of generative AI continues to grow, the ability to effectively manage inference becomes more important. Micron HBM4 features a 2048-bit interface, achieving speeds greater than 2.0 TB/s per memory stack and more than 60% better performance over the previous generation. This expanded interface facilitates rapid communication and a high-throughput design that accelerates the inference performance of large language models and chain-of-thought reasoning systems. Simply put, HBM4 will help AI accelerators respond faster and reason more effectively. Additionally, Micron HBM4 features over 20% better power efficiency compared to Micron's previous-generation HBM3E products, which first established new, unrivaled benchmarks in HBM power efficiency in the industry. This improvement provides maximum throughput with the lowest power consumption to maximize data center efficiency. Generative AI use cases continue to multiply, and this transformative technology is poised to deliver significant benefits to society. HBM4 is a crucial enabler, driving quicker insights and discoveries that will foster innovation in diverse fields such as healthcare, finance and transportation.

"Micron HBM4's performance, higher bandwidth and industry-leading power efficiency are a testament to our memory technology and product leadership," said Raj Narasimhan, senior vice president and general manager of Micron's Cloud Memory Business Unit. "Building on the remarkable milestones achieved with our HBM3E deployment, we continue to drive innovation with HBM4 and our robust portfolio of AI memory and storage solutions. Our HBM4 production milestones are aligned with our customers' next-generation AI platform readiness to ensure seamless integration and volume ramp."

Intelligence Accelerated: Micron's role in the AI revolution

For nearly five decades, Micron has pushed the boundaries of memory and storage innovation. Today, Micron continues to accelerate AI by delivering a broad portfolio of solutions that turn data into intelligence, fueling breakthroughs from the data center to the edge. With HBM4, Micron reinforces its position as a critical catalyst for AI innovation and a reliable partner for our customers' most demanding solutions. Micron plans to ramp HBM4 in calendar year 2026, aligned to the ramp of customers' next-generation AI platforms. For more information on Micron HBM4, visit https://www.micron.com/products/memory/hbm.

About Micron Technology, Inc.

Micron Technology, Inc. is an industry leader in innovative memory and storage solutions, transforming how the world uses information to enrich life for all. With a relentless focus on our customers, technology leadership, and manufacturing and operational excellence, Micron delivers a rich portfolio of high-performance DRAM, NAND, and NOR memory and storage products through our Micron and Crucial brands. Every day, the innovations that our people create fuel the data economy, enabling advances in artificial intelligence (AI) and compute-intensive applications that unleash opportunities -- from the data center to the intelligent edge and across the client and mobile user experience. To learn more about Micron Technology, Inc. (Nasdaq: MU), visit micron.com.

© 2025 Micron Technology, Inc. All rights reserved. Information, products, and/or specifications are subject to change without notice. Micron, the Micron logo, and all other Micron trademarks are the property of Micron Technology, Inc. All other trademarks are the property of their respective owners.

Based on internal Micron HBM4 testing and published HBM3E specifications (2.0 TB/s vs. 1.2 TB/s). Based on internal Micron simulation projections in comparison to Micron HBM3E 36GB 12-high and similar competitive products.
Micron has started shipping samples of its new HBM4 memory to key customers, offering 36GB capacity and over 2TB/s bandwidth. This next-generation memory technology promises significant improvements for AI and high-performance computing applications.
Micron Technology has announced a significant milestone in memory technology with the shipment of its new High Bandwidth Memory 4 (HBM4) samples to key customers 1. This next-generation memory solution promises to revolutionize AI and high-performance computing applications with its impressive specifications and performance improvements.
The HBM4 modules boast a capacity of 36GB per stack, achieved by stacking 12 DRAM dies 1. One of the most significant advancements is the wider memory interface, which has doubled from 1,024 bits per stack in previous generations to 2,048 bits 1. This expansion allows the modules to deliver a sustained bandwidth exceeding 2 terabytes per second, marking a substantial leap in performance 2.
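The headline bandwidth follows directly from the interface width. At the roughly 8 Gb/s per-pin data rate implied by the article's figures (an inference from the numbers above, not a published Micron specification), a 2,048-bit stack moves about 2 TB/s. A minimal back-of-the-envelope sketch:

```python
# Back-of-the-envelope check of the HBM4 per-stack bandwidth figure.
# PIN_RATE_GBPS is an assumed per-pin data rate inferred from the
# article's numbers (2048-bit interface, >2 TB/s), not a Micron spec.

INTERFACE_BITS = 2048   # HBM4 interface width per stack (per the article)
PIN_RATE_GBPS = 8.0     # assumed per-pin data rate, Gbit/s

# Aggregate bandwidth: pins * bits/s per pin, converted to bytes, then TB.
bandwidth_tbps = INTERFACE_BITS * PIN_RATE_GBPS * 1e9 / 8 / 1e12

print(f"{bandwidth_tbps:.2f} TB/s per stack")  # 2.05 TB/s per stack
```

The same arithmetic explains why doubling the interface from 1,024 to 2,048 bits doubles bandwidth at an unchanged per-pin rate.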
Micron reports that HBM4 offers more than 60% better performance than the previous generation 3. Additionally, it boasts over 20% better power efficiency compared to Micron's previous-generation HBM3E products 5. These improvements are crucial for maximizing data center efficiency and supporting the growing demands of AI workloads.
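Micron's own footnote pegs the comparison at 2.0 TB/s for HBM4 versus 1.2 TB/s for HBM3E, which is where the "more than 60%" figure comes from. A quick check of that arithmetic:

```python
# Sanity check of the ">60% better performance" claim using the
# bandwidth figures cited in Micron's footnote:
# 2.0 TB/s (HBM4) vs. 1.2 TB/s (HBM3E).

HBM4_TBPS = 2.0
HBM3E_TBPS = 1.2

# Relative bandwidth uplift, as a percentage.
uplift_pct = (HBM4_TBPS - HBM3E_TBPS) / HBM3E_TBPS * 100

print(f"bandwidth uplift: {uplift_pct:.0f}%")  # bandwidth uplift: 67%
```

Note that the separate "over 20%" figure refers to power efficiency, not bandwidth, and comes from Micron's internal simulation projections.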
The primary applications for HBM4 are AI acceleration and high-performance computing. The increased memory capacity and bandwidth address the escalating performance needs of generative AI, large language models, and other data-intensive applications 1. Micron says the expanded interface will help AI accelerators respond faster and reason more effectively, accelerating inference for large language models and chain-of-thought reasoning systems 5.
NVIDIA and AMD are expected to be among the first companies to adopt HBM4 memory in their products 1. NVIDIA plans to use the modules in its Rubin-Vera AI accelerators, set to launch in the second half of 2026 1. AMD is likely to integrate HBM4 into its next-generation Instinct MI400 series 1.
Micron plans to ramp up HBM4 production in 2026, aligning with the launch of customers' next-generation AI platforms 5. As the company moves towards large-scale production, it will face challenges in thermal management and demonstrating real-world performance, particularly in supporting demanding AI workloads and HPC tasks 1.