The Outpost is a comprehensive collection of curated artificial intelligence software tools that cater to the needs of small business owners, bloggers, artists, musicians, entrepreneurs, marketers, writers, and researchers.
© 2025 TheOutpost.AI All rights reserved
Curated by THEOUTPOST
On Sun, 22 Dec, 8:01 AM UTC
3 Sources
[1]
SK hynix preps 16-Hi HBM3E memory mass production in 2025, just before HBM4 makes its debut
TL;DR: SK hynix is preparing for mass production of its 16-Hi HBM3E memory, following its recent unveiling of the world's first 48GB version. The company is integrating new equipment and optimizing facilities, with production tests underway. Supply is expected in the first half of 2025, driven by NVIDIA's demand for AI GPU components.

SK hynix has begun preparations for the mass production of its new 16-Hi HBM3E memory, weeks after unveiling the world's first 48GB 16-Hi HBM3E memory at the SK AI Summit in Seoul. According to a new report from ETNews, SK hynix is now integrating new equipment for 16-Hi HBM3E memory production and optimizing its existing facilities. Production tests have reportedly already begun, with supply of the new 16-Hi HBM3E memory expected to begin flowing in 1H 2025.

An industry insider told ETNews: "It is the initial line preparation process for 16-layer HBM3E mass production. I understand that the results of the major process tests are also coming out well."

SK hynix moving this quickly from announcement to mass-production preparation might seem unusual, but it makes sense given reports that NVIDIA CEO Jensen Huang personally called SK hynix's boss to ask that next-gen HBM4 memory be brought forward by six months. HBM is a key component of AI GPUs; without it there are no AI GPUs, and NVIDIA can't let that happen.

SK hynix is expected to use its 16-Hi stack with its next-gen HBM4 memory, positioning it to continue dominating the AI memory market through 2025, 2026, and beyond. NVIDIA's next-gen Rubin R100 AI GPU, expected to debut in 2025, is slated to use next-gen HBM4 memory.
[2]
Micron reveals 'cutting-edge' HBM4E development has started, HBM4 mass production in 2026
Micron has updated the industry on its next-generation HBM4 and HBM4E memory, with mass production expected to start in 2026, in time for next-gen AI GPUs like NVIDIA's Rubin R100. The AI memory market has only a few players: SK hynix leads the HBM market, while Samsung and Micron have been spooling up production of HBM3, HBM3E, and future-gen HBM4 and HBM4E memory to meet the insatiable requirements of the AI industry. It's mostly NVIDIA vacuuming up all the HBM memory chips made.

Sanjay Mehrotra, President and Chief Executive Officer of Micron, said: "Leveraging the strong foundation and continued investments in proven 1β process technology, we expect Micron's HBM4 will maintain time to market and power efficiency leadership while boosting performance by over 50% over HBM3E. We expect HBM4 to ramp in high volume for the industry in calendar 2026."
[3]
Micron Reveals Development On "Cutting-Edge" HBM4E Process, HBM4 Slated For Mass-Production By 2026
Micron has provided an update on its next-gen HBM4 and HBM4E processes, revealing that mass production is expected to begin by 2026. HBM4 is the future "holy grail" of the HBM market, mainly because the technology promises cutting-edge performance and efficiency figures, which are the gateway to scaling up AI computational power. Micron, alongside the likes of SK Hynix and Samsung, is in the race for HBM4 dominance, and at its latest investor conference the firm revealed that its HBM4 development is right on track and that "work is already underway" on HBM4E as well.

"Leveraging the strong foundation and continued investments in proven 1β process technology, we expect Micron's HBM4 will maintain time to market and power efficiency leadership while boosting performance by over 50% over HBM3E. We expect HBM4 to ramp in high volume for the industry in calendar 2026. Development work is well underway with multiple customers on HBM4E, which will follow HBM4. HBM4E will introduce a paradigm shift in the memory business by incorporating an option to customize the logic base die for certain customers using an advanced logic foundry manufacturing process from TSMC. We expect this customization capability to drive improved financial performance for Micron." - Micron

For those unaware, HBM4 is revolutionary in many ways, but one interesting point is that the industry plans to integrate memory and logic semiconductors into a single package. This removes the need for a separate packaging step and, since the individual dies sit much closer together in this implementation, it should prove far more performance-efficient. This is why Micron says it will use TSMC as its "logic semiconductor" supplier, similar to what SK Hynix employs.
Micron has also confirmed work on the HBM4E process, becoming one of the first, alongside SK Hynix, to reveal development of the technology. While Micron's HBM4 lineup specifications remain uncertain, the firm did reveal that HBM4 is expected to stack up to 16 DRAM dies, each with a capacity of 32 Gb, along with a 2048-bit-wide interface, making the technology far superior to its previous-gen counterpart. In terms of adoption, HBM4 is expected to feature in NVIDIA's Rubin AI architecture alongside AMD's Instinct MI400 lineup, so the process is set for widespread market adoption. HBM demand is at its peak right now, and Micron has revealed that its production lines are booked through 2025, so the future looks even brighter.
SK Hynix and Micron are gearing up for the production of next-generation High Bandwidth Memory (HBM) technologies, with SK Hynix focusing on HBM3E for 2025 and Micron targeting HBM4 for 2026, driven by increasing demand in AI GPU components.
SK Hynix, a leading player in the High Bandwidth Memory (HBM) market, is ramping up preparations for the mass production of its innovative 16-Hi HBM3E memory. This move comes shortly after the company unveiled the world's first 48GB version of this technology at the SK AI Summit in Seoul [1].
According to industry sources, SK Hynix is currently integrating new equipment and optimizing existing facilities for 16-Hi HBM3E memory production. Production tests are already underway, with supply expected to begin in the first half of 2025. An industry insider reported, "It is the initial line preparation process for 16-layer HBM3E mass production. I understand that the results of the major process tests are also coming out well" [1].
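The 48GB 16-Hi figure can be sanity-checked with a little stack arithmetic. This is a sketch; the per-die density is inferred from the stated totals, not given in the article:

```python
# Sanity check on the 48 GB, 16-Hi HBM3E stack: with 16 stacked DRAM
# dies, each die must hold 48 GB / 16 = 3 GB, i.e. a 24 Gb DRAM die.
STACK_CAPACITY_GB = 48
DIES_PER_STACK = 16  # "16-Hi" = 16 stacked DRAM dies

per_die_gb = STACK_CAPACITY_GB / DIES_PER_STACK
per_die_gbit = per_die_gb * 8  # convert gigabytes to gigabits

print(f"per-die capacity: {per_die_gb} GB ({per_die_gbit:.0f} Gb)")
# -> per-die capacity: 3.0 GB (24 Gb)
```

The implied 24 Gb die is consistent with the DRAM die densities used in current-generation HBM3E products.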
While SK Hynix focuses on HBM3E, Micron is making strides in the development of next-generation HBM4 and HBM4E technologies. Sanjay Mehrotra, President and CEO of Micron, stated, "Leveraging the strong foundation and continued investments in proven 1β process technology, we expect Micron's HBM4 will maintain time to market and power efficiency leadership while boosting performance by over 50% over HBM3E" [2].
Micron anticipates HBM4 to ramp up in high volume for the industry by 2026. The company is also working on HBM4E, which promises to introduce a paradigm shift in the memory business. HBM4E will offer customization capabilities, allowing for improved financial performance for Micron [3].
HBM4 represents a significant leap forward in memory technology. It is expected to stack up to 16 DRAM dies, each with a capacity of 32 Gb, along with a 2048-bit-wide interface. This configuration makes HBM4 substantially superior to its predecessors [3].
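Those headline numbers can be turned into per-stack figures. A minimal sketch, assuming the quoted per-die capacity refers to 32 Gb (gigabit) dies; the pin speeds used for the bandwidth illustration are hypothetical and not from the article:

```python
# Rough HBM4 stack arithmetic: 16 x 32 Gb dies give a 64 GB stack.
# Peak per-stack bandwidth follows from the 2048-bit interface:
#   bandwidth (GB/s) = interface_width_bits * pin_speed_gbps / 8
DIES = 16
DIE_GBIT = 32            # assumed per-die density in gigabits
BUS_WIDTH_BITS = 2048    # HBM4 per-stack interface width

stack_gb = DIES * DIE_GBIT / 8  # gigabits -> gigabytes
print(f"stack capacity: {stack_gb:.0f} GB")

# Illustrative per-pin data rates (Gb/s) -- hypothetical values:
for pin_gbps in (6.4, 8.0):
    bw_gb_per_s = BUS_WIDTH_BITS * pin_gbps / 8
    print(f"{pin_gbps} Gb/s/pin -> {bw_gb_per_s:.0f} GB/s per stack")
```

The doubling of the interface width to 2048 bits (from 1024 bits in earlier HBM generations) is what lets HBM4 raise bandwidth without a proportional increase in per-pin speed.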
The development of HBM4 also involves integrating memory and logic semiconductors into a single package, eliminating the need for separate packaging technology. This approach is expected to yield significant performance improvements. Micron plans to use TSMC as its logic semiconductor supplier for this purpose [3].
The demand for HBM is at an all-time high, primarily driven by the AI industry's insatiable appetite for high-performance memory. NVIDIA, a major player in AI GPUs, is reportedly the primary consumer of HBM memory chips. The company's CEO, Jensen Huang, has even asked SK Hynix to accelerate the development of HBM4 memory by six months [1].
Looking ahead, HBM4 is expected to be featured in NVIDIA's Rubin AI architecture and AMD's Instinct MI400 lineup, ensuring widespread market adoption. With production lines booked through 2025 and beyond, the future of HBM technology appears promising for manufacturers and consumers alike [3].
Reference
[1] SK hynix preps 16-Hi HBM3E memory mass production in 2025, just before HBM4 makes its debut
[2] Micron reveals 'cutting-edge' HBM4E development has started, HBM4 mass production in 2026
[3] Micron Reveals Development On "Cutting-Edge" HBM4E Process, HBM4 Slated For Mass-Production By 2026