© 2025 TheOutpost.AI All rights reserved
Curated by THEOUTPOST
On Thu, 26 Sept, 8:04 AM UTC
9 Sources
[1]
SK hynix begins mass production of 36 GB 12-layer HBM3E
Korea's SK hynix revealed on Thursday that it had become the first chip manufacturer to mass produce the much-anticipated 36 GB 12-layer HBM3E chip. The chips are slated to be in the hands of customers by the end of this year. High-bandwidth memory (HBM) chips utilize a stacked design, where layers of DRAM are interconnected through through-silicon vias (TSVs), allowing for greater memory density in a more compact form factor. The chips are eagerly sought thanks to the recent surge in high-demand areas like AI, GPUs, and supercomputers. SK hynix's previous max capacity was 24 GB HBM3E, achieved by stacking eight 3 GB DRAM chips vertically, as our sister publication Blocks & Files reported at the time. Mass production of that part began just six months ago, in March, to the benefit of key customer Nvidia. Demand for the product was so great that by May, market analyst firm TrendForce had warned that hunger for HBM capacity could cause a shortage of DRAM supply as companies ditched the latter to make the AI enablers on their limited number of production lines. The layers of the new HBM3E are 40 percent thinner than the last generation's. By stacking four extra layers of 3 GB chips, the company claims it has increased capacity by 50 percent while keeping the same thickness. To combat the frailty inherent in the chip's thin layers, SK hynix treats them with Advanced MR-MUF (Mass Reflow Molded Underfill) - a process where liquid protective materials are injected between the stacked chips and then hardened, protecting the circuits and enhancing heat dissipation. The new chip product, according to SK hynix, has "increased the speed of memory operations to 9.6 Gbps, the highest memory speed available today." "If 'Llama 3 70B', a Large Language Model (LLM), is driven by a single GPU equipped with four HBM3E products, it can read 70 billion total parameters 35 times within a second," bragged the company.
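The "35 times within a second" figure is straightforward bandwidth arithmetic. A minimal sketch of the math - assuming FP16 weights (2 bytes per parameter) and the standard 1024-bit HBM interface per stack, neither of which SK hynix states explicitly:

```python
# Back-of-the-envelope check of SK hynix's Llama 3 70B claim.
# Assumptions (ours, not SK hynix's): FP16 weights at 2 bytes per
# parameter, and the standard 1024-bit HBM interface per stack.

PIN_SPEED_GBPS = 9.6      # per-pin data rate, Gbit/s
PINS_PER_STACK = 1024     # HBM interface width, bits
STACKS = 4                # HBM3E stacks on the GPU in the example
PARAMS = 70e9             # Llama 3 70B parameter count
BYTES_PER_PARAM = 2       # FP16

stack_bw_gbs = PIN_SPEED_GBPS * PINS_PER_STACK / 8   # GB/s per stack
total_bw_gbs = stack_bw_gbs * STACKS                 # GB/s across four stacks
model_gb = PARAMS * BYTES_PER_PARAM / 1e9            # model size in GB

reads_per_second = total_bw_gbs / model_gb
print(f"Per-stack bandwidth: {stack_bw_gbs:.1f} GB/s")
print(f"Full model reads per second: {reads_per_second:.1f}")
```

Four stacks deliver roughly 4.9 TB/s of peak bandwidth against a ~140 GB model, which lands almost exactly on the quoted 35 reads per second.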
SK hynix shares rose 9 percent following today's announcement, surpassing the share growth of rivals Samsung Electronics and Micron, according to media reports. The company's new memory may help South Korea become one of the world's top three AI nations, a goal set the same day in an announcement by President Yoon Suk Yeol. The government said it would achieve the goal through public-private collaboration. Yoon also pledged to build a National AI Computing Center and reform regulations to support SK hynix's efforts. The country has put a "Presidential AI Committee" in charge of coordinating the R&D push - a committee that consists of 30 experts, 10 ministerial-level government officials, and two presidential aides.
[2]
SK hynix Initiates Mass Production of Advanced 12-Layer HBM3E Memory Modules
SK hynix Inc. has commenced mass production of its 12-layer High Bandwidth Memory 3E (HBM3E) modules, featuring a total capacity of 36 GB. This marks the largest capacity achieved in HBM technology to date. The company aims to supply these mass-produced units to clients within the current year, reinforcing its technological advancements. This development follows the initial delivery of the 8-layer HBM3E products to customers six months prior in March, underscoring SK hynix's ongoing progress in the HBM3E sector. As the sole manufacturer offering a comprehensive HBM product range from the first generation (HBM1) through to the fifth generation (HBM3E) since introducing the initial HBM in 2013, SK hynix maintains a significant position in the memory market. The company's focus remains on the artificial intelligence (AI) memory segment, where it addresses the increasing demands of AI enterprises. By leading the industry in the mass production of 12-layer HBM3E modules, SK hynix demonstrates its commitment to meeting the high-performance requirements essential for advanced AI applications. The newly produced 12-layer HBM3E modules adhere to stringent global standards critical for AI memory, encompassing speed, capacity, and stability. The memory operation speed has been enhanced to 9.6 Gbps, representing the highest available memory speed currently. For instance, when running the Llama 3 70B large language model on a single GPU equipped with four HBM3E modules, the system can process 70 billion parameters 35 times per second. To achieve a 50% increase in capacity, SK hynix has implemented a vertical stacking of twelve 3 GB DRAM chips, maintaining the same thickness as the previous eight-layer configuration. This was accomplished by reducing each DRAM chip's thickness by 40% and utilizing through-silicon via (TSV) technology for vertical integration. 
Additionally, SK hynix addressed structural challenges associated with stacking thinner chips by employing its Advanced MR-MUF process. This innovation enhances heat dissipation performance by 10% compared to the previous generation and ensures product stability and reliability through improved warpage control.
[3]
SK Hynix Begins Mass-Production of 12-Layer HBM3E Memory: 36 GB Capacity Per Module, 9.6 Gbps Speeds
SK hynix announces volume production of its high-end 12-layer HBM3E memory, driving the transition toward the next era of AI computing. [Press Release]: SK hynix announced today that it has begun mass production of the world's first 12-layer HBM3E product with 36GB, the largest capacity of existing HBM to date. The company plans to supply mass-produced products to customers within the year, proving its overwhelming technology once again six months after delivering the HBM3E 8-layer product to customers for the first time in the industry in March this year. SK hynix is the only company in the world that has developed and supplied the entire HBM lineup from the first generation (HBM1) to the fifth generation (HBM3E), since releasing the world's first HBM in 2013. The company plans to continue its leadership in the AI memory market, addressing the growing needs of AI companies by being the first in the industry to mass-produce the 12-layer HBM3E. According to the company, the 12-layer HBM3E product meets the world's highest standards in all areas that are essential for AI memory including speed, capacity, and stability. SK hynix has increased the speed of memory operations to 9.6 Gbps, the highest memory speed available today. If 'Llama 3 70B', a Large Language Model (LLM), is driven by a single GPU equipped with four HBM3E products, it can read 70 billion total parameters 35 times within a second. "SK hynix has once again broken through technological limits, demonstrating our industry leadership in AI memory. We will continue our position as the No. 1 global AI memory provider as we steadily prepare next-generation memory products to overcome the challenges of the AI era." - Justin Kim, President (Head of AI Infra) at SK hynix. SK hynix has increased the capacity by 50% by stacking 12 layers of 3GB DRAM chips at the same thickness as the previous eight-layer product.
To achieve this, the company made each DRAM chip 40% thinner than before and stacked them vertically using TSV technology. The company also solved structural issues that arise from stacking thinner chips higher by applying its core technology, the Advanced MR-MUF process. This provides 10% higher heat dissipation performance compared to the previous generation and secures the stability and reliability of the product through enhanced warpage control.
[4]
SK hynix starts mass production of 12-layer HBM3E memory: 36GB capacity per module @ 9.6Gbps
SK hynix has announced volume production of its new 12-layer HBM3E memory, with up to 36GB capacities and speeds of 9.6Gbps. The South Korean memory leader announced it has started mass production of the world's first 12-layer HBM3E memory with 36GB, the largest capacity of existing HBM to date. SK hynix plans to supply mass-produced 12-layer HBM3E memory chips to customers (including NVIDIA) within the year, just 6 months after launching 8-layer HBM3E to customers for the first time in the industry in March 2024. SK hynix is the key to the world of AI chips, with NVIDIA using its HBM3 and HBM3E memory inside of its Hopper H100 and H200 AI GPUs, with HBM3E also used in its new Blackwell AI GPUs. SK hynix has been leading the industry with HBM, with its new 12-layer HBM3E memory chips boosted up to 9.6Gbps, the highest memory speed on the market. SK hynix says that with its new 12-layer HBM3E memory at 9.6Gbps, a single AI GPU equipped with 4 HBM3E products can read all 70 billion parameters of a Llama 3 70B LLM 35 times within a single second. Not too shabby at all, SK hynix.
[5]
SK hynix starts mass production of world's first 12-layer HBM3E
This undated picture provided by SK hynix shows the company's newest 36-gigabyte 12-layer HBM3E chip. (Yonhap)
SK hynix, the world's second-largest memory chipmaker, said Thursday it has begun mass production of 12-layer high bandwidth memory (HBM) chips, the first in the world, solidifying its competitive edge over rivals. The new 36-gigabyte 12-layer HBM3E chip will be supplied to its customers, including U.S. AI chip giant Nvidia, within the year, according to SK hynix. This marks the industry's first mass production of the highest capacity and fastest HBM chip to date, outpacing other major HBM manufacturers like Samsung Electronics and Micron Technology. SK hynix, already a leader in the HBM market, first began supplying its 8-layer HBM3E chips to Nvidia in March. The popularity of SK hynix's HBM products came as HBM chips, integral components used for AI computing, have garnered increasing attention with the rise of applications, such as generative AI, exemplified by models like ChatGPT. The newly launched 12-layer HBM3E chip is poised to dominate the future HBM market, with reports indicating that Nvidia's upcoming product lineups will feature this chip, for which SK hynix is the sole producer. "The 12-layer HBM3E product meets the world's highest standards in all areas that are essential for AI memory, including speed, capacity and stability," SK hynix said. "The company plans to continue its leadership in the AI memory market, addressing the growing needs of AI companies by being the first in the industry to mass-produce the 12-layer HBM3E." According to data by market analysis firm TrendForce, SK hynix led the HBM market last year with a market share of 53 percent, followed by Samsung Electronics at 38 percent and Micron at 9 percent. (Yonhap)
[6]
SK hynix Begins Volume Production of the World's First 12-Layer HBM3E By Investing.com
The company plans to supply mass-produced products to customers within the year, proving its overwhelming technology once again six months after delivering the HBM3E 8-layer product to customers for the first time in the industry in March this year. SK hynix is the only company in the world that has developed and supplied the entire HBM lineup from the first generation (HBM1) to the fifth generation (HBM3E), since releasing the world's first HBM in 2013. The company plans to continue its leadership in the AI memory market, addressing the growing needs of AI companies by being the first in the industry to mass-produce the 12-layer HBM3E. According to the company, the 12-layer HBM3E product meets the world's highest standards in all areas that are essential for AI memory including speed, capacity and stability. SK hynix has increased the speed of memory operations to 9.6 Gbps, the highest memory speed available today. If 'Llama 3 70B', a Large Language Model (LLM), is driven by a single GPU equipped with four HBM3E products, it can read 70 billion total parameters 35 times within a second. SK hynix has increased the capacity by 50% by stacking 12 layers of 3GB DRAM chips at the same thickness as the previous eight-layer product. To achieve this, the company made each DRAM chip 40% thinner than before and stacked them vertically using TSV technology. The company also solved structural issues that arise from stacking thinner chips higher by applying its core technology, the Advanced MR-MUF process. This provides 10% higher heat dissipation performance compared to the previous generation, and secures the stability and reliability of the product through enhanced warpage control. "SK hynix has once again broken through technological limits demonstrating our industry leadership in AI memory," said Justin Kim, President (Head of AI Infra) at SK hynix.
"We will continue our position as the No.1 global AI memory provider as we steadily prepare next-generation memory products to overcome the challenges of the AI era." About SK hynix Inc. SK hynix Inc., headquartered in , is the world's top-tier semiconductor supplier offering Dynamic Random Access Memory chips ("DRAM"), flash memory chips ("NAND flash"), and CMOS Image Sensors ("CIS") for a wide range of distinguished customers globally. The Company's shares are traded on the Korea Exchange, and the Global Depository shares are listed on the Luxemburg Stock Exchange. Further information about SK hynix is available at www.skhynix.com, news.skhynix.com.
[7]
Nvidia supplier SK Hynix says begins mass production of 12-layer HBM3E chips
SEOUL (Reuters) - The world's second-largest memory chipmaker SK Hynix said on Thursday it began mass production of a 12-layer version of the latest generation of high-bandwidth memory (HBM) chips, to meet demand from the current AI boom. The Nvidia supplier said in a statement it was the world's first latest-generation HBM product, called HBM3E, with 12 layers and the largest capacity of existing HBM to date at 36 gigabytes. (Reporting by Joyce Lee; Editing by Jacqueline Wong)
[8]
Nvidia supplier SK Hynix says begins mass production of 12-layer HBM3E chips
SEOUL, Sept 26 (Reuters) - The world's second-largest memory chipmaker SK Hynix (000660.KS) said on Thursday it began mass production of a 12-layer version of the latest generation of high-bandwidth memory (HBM) chips, to meet demand from the current AI boom. The Nvidia (NVDA.O) supplier said in a statement it was the world's first latest-generation HBM product, called HBM3E, with 12 layers and the largest capacity of existing HBM to date at 36 gigabytes. Reporting by Joyce Lee; Editing by Jacqueline Wong
[9]
SK hynix preps for Nvidia Blackwell Ultra and AMD Instinct MI325X with 12-Hi HBM3E
SK hynix has started mass production of its 12-Hi HBM3E memory stacks, ahead of its rivals. The new modules feature a 36GB capacity and set the stage for next-generation AI and HPC processors, such as AMD's Instinct MI325X which is due in the fourth quarter, and Nvidia's Blackwell Ultra which is expected to arrive in the second half of next year. SK hynix's 12-Hi 36GB HBM3E stacks pack twelve 3GB DRAM layers and feature a data transfer rate of 9.6 GT/s, thus providing a peak bandwidth of 1.22 TB/s per module. A memory subsystem featuring eight of the company's 12-Hi 36GB HBM3E stacks will thus offer a peak bandwidth of 9.83 TB/s. Real-world products are unlikely to use these HBM3E memory devices at their full speed as developers tend to ensure ultimate reliability. We don't doubt that HBM3E memory subsystems will offer higher performance than their predecessors, though. Despite packing 50% more memory devices, the new 12-Hi HBM3E memory stacks feature the same z-height as their 8-Hi predecessors. To achieve this, SK hynix made the DRAM devices 40% thinner. Also, to avoid structural issues that arise from using ultra-thin vertically stacked DRAMs interconnected using through-silicon vias (TSVs), the manufacturer used its mass reflow molded underfill (MR-MUF) process that bonds the dies together all at once and fills the space between them with an improved underfill called liquid Epoxy Molding Compound. As a bonus, EMC also has better thermal conductivity. SK hynix is the first company to start mass production of 12-Hi HBM3E memory. While Samsung formally introduced its 12-Hi 36GB HBM3E stacks early this year, it has yet to start mass production of these products. Micron is sampling production-ready 12-Hi HBM3E devices, but it has yet to start high-volume production of these memory stacks.
SK hynix plans to ship its 12-Hi 36GB HBM3E memory stacks by the end of the year, in time for AMD's Instinct MI325X accelerator for AI and HPC that will carry 244GB of HBM3E memory, and several quarters before Nvidia intends to start shipments of its Blackwell Ultra GPU for AI and HPC applications. "SK hynix has once again broken through technological limits demonstrating our industry leadership in AI memory," said Justin Kim, President (Head of AI Infra) at SK hynix. "We will continue our position as the No.1 global AI memory provider as we steadily prepare next-generation memory products to overcome the challenges of the AI era."
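The per-stack and eight-stack bandwidth figures fall straight out of the per-pin data rate. A quick sketch, assuming the standard 1024-bit interface per HBM3E stack (the exact per-stack value, 1.2288 TB/s, appears rounded to 1.22 TB/s in the article above):

```python
# Peak-bandwidth arithmetic for 12-Hi HBM3E stacks.
# Assumption (ours): the standard 1024-bit HBM interface per stack.

data_rate_gts = 9.6      # per-pin data rate, GT/s
bus_width_bits = 1024    # interface width per stack, bits
stacks = 8               # stacks in the example memory subsystem

stack_tb_s = data_rate_gts * bus_width_bits / 8 / 1000   # TB/s per stack
subsystem_tb_s = stack_tb_s * stacks                     # TB/s for eight stacks

print(f"Per-stack peak bandwidth: {stack_tb_s:.4f} TB/s")
print(f"Eight-stack subsystem:    {subsystem_tb_s:.2f} TB/s")
```

Eight stacks at 1.2288 TB/s each gives the 9.83 TB/s subsystem figure quoted for a hypothetical eight-stack accelerator.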
SK Hynix has started mass production of its cutting-edge 12-layer HBM3E memory modules, offering 36GB capacity per module and speeds up to 9.6 Gbps. This breakthrough is set to revolutionize high-performance computing and AI applications.
SK Hynix, a leading semiconductor manufacturer, has announced the commencement of mass production for its groundbreaking 12-layer High Bandwidth Memory 3E (HBM3E) modules [1]. This development marks a significant milestone in the memory industry, promising to deliver unprecedented performance and capacity for high-end computing applications.
The new HBM3E modules boast impressive specifications that set them apart from their predecessors. Each module offers a substantial 36GB capacity, achieved through the innovative 12-layer design [2]. This represents a 50% increase in capacity compared to the previous generation of 8-layer HBM3E modules.
In terms of speed, the HBM3E modules operate at an astounding 9.6 Gbps per pin, translating to a remarkable bandwidth of 1.23 TB/s per module [3]. This significant boost in performance is expected to have far-reaching implications for various high-performance computing applications.
SK Hynix has achieved this technological breakthrough by employing advanced manufacturing techniques. The company utilizes its fourth-generation 10nm-class process technology, known as 1a nm, to produce these cutting-edge memory modules [4].
The development of 12-layer HBM3E presented significant challenges, particularly in maintaining stability while stacking an increased number of layers. SK Hynix overcame these obstacles through innovative engineering solutions, including the use of advanced materials and refined manufacturing processes.
The introduction of HBM3E memory is expected to have a profound impact on various sectors, particularly those relying on high-performance computing and artificial intelligence. Industries such as data centers, scientific research, and advanced AI applications stand to benefit significantly from the increased capacity and speed offered by these new memory modules [5].
SK Hynix's achievement in mass-producing 12-layer HBM3E modules positions the company at the forefront of the memory market. This development is likely to spark further innovation and competition in the industry, potentially leading to accelerated advancements in computing capabilities across various sectors.
The semiconductor industry has recognized SK Hynix's achievement as a significant leap forward. The company's success in overcoming the technical challenges associated with 12-layer stacking has set a new benchmark for memory technology.
As demand for high-performance computing continues to grow, particularly in AI and machine learning applications, SK Hynix's HBM3E modules are poised to play a crucial role in enabling the next generation of technological advancements. The company's ability to mass-produce these advanced modules suggests a promising future for memory technology and its applications in cutting-edge computing systems.