Curated by THEOUTPOST
On Fri, 6 Sept, 4:02 PM UTC
3 Sources
[1]
Micron Unveils 12-High HBM3E Memory with 36 GB Capacity and 1.2 TB/s Bandwidth
As AI workloads become more demanding, the need for higher memory bandwidth and capacity grows. Micron is addressing this challenge by shipping its HBM3E 12-high 36 GB memory for qualification across the AI ecosystem. The 12-high stack provides 50% more capacity than the 8-high offerings, reaching 36 GB. This added capacity allows larger AI models, such as Llama 2 with 70 billion parameters, to be processed on a single GPU, reducing CPU offload and inter-GPU communication (a rough sizing sketch follows this summary).

Despite the higher capacity, Micron's 12-high HBM3E consumes less power than competing 24 GB 8-high solutions. Micron's HBM3E 12-high memory delivers over 1.2 terabytes per second of bandwidth, supported by pin speeds exceeding 9.2 gigabits per second. These features offer maximum throughput while minimizing power consumption, making it well-suited for energy-intensive data centers. Additionally, the memory includes fully programmable memory built-in self-test (MBIST) capabilities to simulate system traffic at full specification speeds, improving validation coverage and accelerating time to market.

Micron is actively partnering with key industry players to qualify its HBM3E 12-high memory across various AI applications. This milestone aligns with the company's strategic efforts to meet the growing demands of data-intensive AI infrastructure. Micron is also collaborating with TSMC as part of the 3DFabric Alliance. This partnership focuses on advancing semiconductor and system innovations, supporting complex AI system manufacturing, and integrating HBM3E into various technologies. TSMC and Micron have worked together to enable HBM3E-based systems and chip-on-wafer-on-substrate (CoWoS) packaging designs that support AI advancements.

Highlights:
- Qualification in progress: Micron is shipping production-ready HBM3E 12-high units for qualification across the AI ecosystem.
- Scalability: The 36 GB capacity (50% more than current HBM3E offerings) enables data centers to manage growing AI workloads effectively.
- Energy efficiency: The 12-high memory stack consumes less power than competing 8-high solutions with 24 GB.
- Performance: With a pin speed greater than 9.2 Gb/s, Micron's HBM3E 12-high delivers over 1.2 TB/s of memory bandwidth, ensuring rapid data access for AI accelerators and data centers.
- Accelerated validation: Fully programmable MBIST capabilities improve test coverage and system reliability, reducing time to market.
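To make the capacity claim concrete, here is a rough back-of-the-envelope sketch. The 70-billion-parameter figure comes from the article; FP16 weights and a hypothetical six-stack accelerator are illustrative assumptions, not Micron's configuration.

    # Rough sizing sketch (illustrative assumptions, not Micron's figures):
    # FP16 weights and a hypothetical accelerator carrying six HBM3E stacks.

    params = 70e9                      # Llama 2 70B, per the article
    weights_gb = params * 2 / 1e9      # 2 bytes per FP16 parameter -> ~140 GB
    print(f"FP16 weights: {weights_gb:.0f} GB")

    for stack_gb, label in [(24, "8-high"), (36, "12-high")]:
        total_gb = 6 * stack_gb        # assumed six stacks per accelerator
        headroom = total_gb - weights_gb
        print(f"6 x {stack_gb} GB ({label}): {total_gb} GB total, "
              f"{headroom:.0f} GB left for KV cache and activations")

Under these assumptions, six 8-high stacks (144 GB) barely hold the FP16 weights, while six 12-high stacks (216 GB) leave roughly 76 GB of headroom, which is why the larger stacks can keep a 70B-class model on a single processor.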
[2]
Micron's HBM3E Memory Boosts Capacities Up To 36 GB With 12-Hi Design, 9.2 Gbps Speeds, 1.2 TB/s Bandwidth
Micron has started sampling its "production-ready" HBM3E memory solution, which features capacities of up to 36 GB in a 12-Hi design.

Press Release: Micron is at the forefront of memory innovation to meet these needs and is now shipping production-capable HBM3E 12-high to key industry partners for qualification across the AI ecosystem. Micron HBM3E 12-high boasts an impressive 36GB capacity, a 50% increase over current HBM3E 8-high offerings, allowing larger AI models like Llama 2 with 70 billion parameters to run on a single processor. This capacity increase allows faster time to insight by avoiding CPU offload and GPU-GPU communication delays.

Micron HBM3E 12-high 36GB delivers significantly lower power consumption than competitors' HBM3E 8-high 24GB solutions, and offers more than 1.2 terabytes per second (TB/s) of memory bandwidth at a pin speed greater than 9.2 gigabits per second (Gb/s). These combined advantages of HBM3E offer maximum throughput with the lowest power consumption and can ensure optimal outcomes for power-hungry data centers. Additionally, Micron HBM3E 12-high incorporates fully programmable MBIST that can run system-representative traffic at full spec speed, providing improved test coverage for expedited validation, enabling faster time to market and enhancing system reliability.

Micron is now shipping production-capable HBM3E 12-high units to key industry partners for qualification across the AI ecosystem. This HBM3E 12-high milestone demonstrates Micron's innovations to meet the data-intensive demands of the evolving AI infrastructure. Micron is also a proud partner in TSMC's 3DFabric Alliance, which helps shape the future of semiconductor and system innovations. AI system manufacturing is complex, and HBM3E integration requires close collaboration between memory suppliers, customers, and outsourced semiconductor assembly and test (OSAT) players.
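Micron does not publish its MBIST traffic patterns, so as a loose illustration of what a memory built-in self-test does, the following toy Python model walks through the classic March C- algorithm. Real MBIST is an on-die hardware engine running at full interface speed; the list-backed memory and fault check here are purely pedagogical.

    # Toy model of a march-style memory test (March C-). Real MBIST is an
    # on-die hardware engine; Micron's programmable traffic patterns are
    # not public, so this only illustrates the concept.

    def march_c_minus(mem):
        """March C-: six read/write passes that expose stuck-at and
        coupling faults in a 1-bit-per-cell memory (here, a Python list)."""
        n = len(mem)

        def check(addr, expected):
            if mem[addr] != expected:
                raise AssertionError(f"fault detected at address {addr}")

        for i in range(n):                 # ascending: write 0
            mem[i] = 0
        for i in range(n):                 # ascending: read 0, write 1
            check(i, 0)
            mem[i] = 1
        for i in range(n):                 # ascending: read 1, write 0
            check(i, 1)
            mem[i] = 0
        for i in reversed(range(n)):       # descending: read 0, write 1
            check(i, 0)
            mem[i] = 1
        for i in reversed(range(n)):       # descending: read 1, write 0
            check(i, 1)
            mem[i] = 0
        for i in range(n):                 # ascending: read 0
            check(i, 0)

    march_c_minus([0] * 4096)              # completes silently when fault-free

A programmable MBIST engine generalizes this idea: instead of one fixed march sequence, the address order and data patterns can be reprogrammed to mimic system-representative traffic, which is what the press release highlights.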
[3]
Micron samples 12-Hi HBM3E with up to 36GB capacity: 9.2Gbps speeds, 1.2TB/sec memory bandwidth
Micron has just announced it is shipping production-capable HBM3E 12-Hi memory in capacities of up to 36GB, pushing 1.2TB/sec of memory bandwidth ready for AI GPUs. The new Micron HBM3E 12-Hi features an impressive 36GB capacity, a 50% increase over current HBM3E 8-Hi stacks, allowing far larger AI models like Llama 2 with 70 billion parameters to run on a single AI processor. This increased capacity allows faster time to insight by avoiding CPU offload and GPU-GPU communication delays.

Micron's new HBM3E 12-Hi 36GB delivers "significantly" lower power consumption than competitors' HBM3E 8-Hi 24GB memory, while pushing over 1.2TB/sec of memory bandwidth at a pin speed of over 9.2Gbps. These combined benefits offer maximum throughput with the lowest power consumption, ensuring optimal outcomes for the power-hungry data centers of the future. The company also adds that its new HBM3E 12-Hi memory features fully programmable MBIST that can run system-representative traffic at full-spec speed, providing improved test coverage for expedited validation, enabling faster time-to-market (TTM) and enhancing system reliability.

In summary, here are the Micron HBM3E 12-high 36GB highlights:
- Qualification in progress: production-capable HBM3E 12-high units are shipping to key industry partners across the AI ecosystem.
- Scalability: 36GB capacity, 50% more than current HBM3E 8-high offerings.
- Energy efficiency: lower power consumption than competing 8-high 24GB solutions.
- Performance: over 1.2TB/sec of memory bandwidth at pin speeds above 9.2Gbps.
- Accelerated validation: fully programmable MBIST improves test coverage and system reliability.
Micron Technology has introduced its latest High Bandwidth Memory (HBM) solution, a 12-high HBM3E stack featuring unprecedented capacity and bandwidth. This advancement promises significant improvements for AI and high-performance computing applications.
Micron Technology, a leader in innovative memory solutions, has announced its latest breakthrough in High Bandwidth Memory (HBM) technology. The new 12-high HBM3E memory stack represents a significant advancement in capacity, speed, and bandwidth, positioning Micron at the forefront of memory solutions for artificial intelligence (AI) and high-performance computing (HPC) applications [1].
The HBM3E memory stack boasts an impressive 36GB capacity, achieved through a 12-High (12-Hi) configuration. This substantial increase in capacity is complemented by pin speeds greater than 9.2 Gbps, resulting in a remarkable bandwidth of over 1.2 TB/s per package [2]. These specifications represent a significant leap forward in memory technology, offering 50% more capacity than current 24GB 8-high HBM3E offerings.
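The headline 1.2 TB/s figure follows directly from the quoted pin speed and the stack's interface width. As a quick check, here is a minimal sketch; the 1024-bit bus width is the standard per-stack HBM interface width from the JEDEC specification, not a figure stated in the article.

    # Per-stack bandwidth = pin speed x bus width. The 1024-bit interface
    # width is the standard JEDEC HBM figure (not from the article).

    pin_speed_gbps = 9.2                       # Gb/s per pin ("greater than 9.2")
    bus_width_bits = 1024                      # data pins per HBM3E stack

    bandwidth_gb_per_s = pin_speed_gbps * bus_width_bits / 8   # bits -> bytes
    print(f"{bandwidth_gb_per_s:.1f} GB/s per stack")          # 1177.6 ~ 1.2 TB/s

At exactly 9.2 Gb/s per pin this works out to about 1.18 TB/s, which is why pin speeds "greater than 9.2 Gb/s" are needed to clear the 1.2 TB/s mark.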
Micron's HBM3E utilizes advanced packaging technology, including through-silicon vias (TSVs) and microbumps. The 12-Hi stack configuration allows for increased density without compromising performance. The memory is built on Micron's 1β (1-beta) process node, which contributes to its enhanced capabilities and efficiency [3].
The introduction of HBM3E is particularly significant for the AI and HPC sectors. With the increasing complexity of AI models and the growing demand for high-performance computing, the need for faster, higher-capacity memory solutions has never been more critical. Micron's HBM3E addresses these needs by providing:
- 36GB of capacity per 12-high stack, enough for larger AI models such as Llama 2 70B to run on a single processor;
- more than 1.2 TB/s of memory bandwidth per stack at pin speeds above 9.2 Gb/s; and
- lower power consumption than competing 24GB 8-high solutions.
Micron is currently shipping samples of its HBM3E 12-high memory to select partners, with plans for broader availability in the near future. The company is working closely with ecosystem partners to ensure seamless integration of the new memory technology into next-generation computing systems [1].
The introduction of HBM3E sets a new benchmark in the memory industry. As AI and HPC continue to evolve, technologies like Micron's HBM3E will play a crucial role in enabling more sophisticated and powerful computing solutions. The advancements in capacity and bandwidth are expected to drive innovation across various sectors, from scientific research to consumer electronics.
SK Hynix has started mass production of its cutting-edge 12-layer HBM3E memory modules, offering 36GB capacity per module and speeds up to 9.6 Gbps. This breakthrough is set to revolutionize high-performance computing and AI applications.
9 Sources
SK hynix has begun sampling its groundbreaking 12-layer HBM4 memory, offering unprecedented capacity and bandwidth for AI acceleration. This development marks a significant leap in memory technology for AI applications.
5 Sources
SK Hynix and Micron are gearing up for the production of next-generation High Bandwidth Memory (HBM) technologies, with SK Hynix focusing on HBM3E for 2025 and Micron targeting HBM4 for 2026, driven by increasing demand in AI GPU components.
3 Sources
Rambus has announced details of its HBM4 memory controller, promising significant improvements in speed, bandwidth, and capacity. This new technology could revolutionize high-performance computing and AI applications.
2 Sources
SK Hynix, a leading South Korean chipmaker, announces plans to start mass production of advanced HBM3E 12-layer memory chips this month, aiming to meet the growing demand for AI applications.
3 Sources