The Outpost is a comprehensive collection of curated artificial intelligence software tools that cater to the needs of small business owners, bloggers, artists, musicians, entrepreneurs, marketers, writers, and researchers.
© 2025 TheOutpost.AI All rights reserved
Curated by THEOUTPOST
On Wed, 4 Sept, 4:08 PM UTC
3 Sources
[1]
SK Hynix to start mass producing HBM3E 12-layer chips this month
TAIPEI (Reuters) - SK Hynix, the world's second-largest memory chip maker, will start mass production of HBM3E 12-layer chips by the end of this month, a senior executive said on Wednesday. Justin Kim, president and head of the company's AI Infra division, made the comment at the Semicon Taiwan industry forum in Taipei. In July, the South Korean company disclosed plans to ship the next versions of HBM chips - the 12-layer HBM3E - starting in the fourth quarter and the HBM4 starting in the second half of 2025. High bandwidth memory (HBM) is a type of dynamic random access memory, or DRAM, standard first produced in 2013 in which chips are vertically stacked to save space and reduce power consumption. They are advanced memory chips capable of handling generative artificial intelligence (AI) work. A key component of graphics processing units (GPUs) for AI, it helps process massive amounts of data produced by complex applications. In May, SK Hynix CEO Kwak Noh-Jung said its HBM chips were sold out for this year and almost sold out for 2025. There are only three main manufacturers of HBM - SK Hynix, Micron and Samsung Electronics. SK Hynix has been the main supplier of HBM chips to Nvidia and supplied HBM3E chips in late March to a customer it declined to identify. (Reporting by Heekyong Yang and Ben Blanchard; Editing by Jacqueline Wong and Christian Schmollinger)
SK Hynix, a leading South Korean chipmaker, announces plans to start mass production of advanced HBM3E 12-layer memory chips this month, aiming to meet the growing demand for AI applications.
SK Hynix, South Korea's second-largest chipmaker, has announced a significant advancement in memory chip technology. The company is set to begin mass production of its cutting-edge HBM3E 12-layer memory chips this month [1]. The move marks a crucial step in meeting escalating demand from high-performance computing and artificial intelligence applications.
The HBM3E (High Bandwidth Memory 3rd generation Extended) chips boast impressive specifications. Each chip can process up to 1.18 terabytes of data per second, equivalent to transmitting 118 Full HD movies in a single second [2]. This represents a 53% increase in data processing speed over its predecessor, the HBM3. The new chips also cut power consumption by 10%, addressing the growing need for energy-efficient solutions in data centers and AI workloads.
SK Hynix's announcement comes at a critical time for the semiconductor industry. The company is positioning itself to capitalize on surging demand for AI chips, particularly in the wake of the success of ChatGPT and other AI models [3]. The announcement is expected to intensify competition with the other major players in the memory chip market, Samsung Electronics and Micron Technology.
Mass production of the advanced chips is slated to begin in September 2024. SK Hynix has stated that it plans to raise HBM3E's share of its total HBM output to over 50% by the end of the year [1]. This aggressive production schedule underscores the company's commitment to meeting growing market demand and maintaining its edge in the high-performance memory segment.
The introduction of HBM3E 12-layer chips is expected to have a significant impact on the AI and high-performance computing sectors. These chips are crucial components of the AI accelerators and graphics processing units (GPUs) used to train large language models and other AI applications. Their increased speed and efficiency could accelerate AI development and enable more complex and powerful AI models in the future [2].
As demand for AI-capable hardware continues to grow, SK Hynix's move is likely to spur further innovation in the memory chip industry. Analysts predict that this development could drive increased competition and faster advances in memory technology. The success of these new chips could also shape the direction of future research and development in the semiconductor industry, particularly in areas related to AI and high-performance computing [3].
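As a quick sanity check on the figures quoted above (an illustration, not from the source), the stated numbers imply roughly 10 GB per Full HD movie and an HBM3 throughput of about 0.77 TB/s:

```python
# Back-of-the-envelope check of the quoted HBM3E bandwidth figures.
hbm3e_tb_per_s = 1.18   # claimed per-chip throughput, in TB/s
movies_per_s = 118      # claimed Full HD movies transferred per second

# Implied size of one Full HD movie under these figures.
movie_size_gb = hbm3e_tb_per_s * 1000 / movies_per_s
print(f"Implied movie size: {movie_size_gb:.1f} GB")  # → 10.0 GB

# A 53% speedup over HBM3 implies HBM3 throughput of roughly 0.77 TB/s.
hbm3_tb_per_s = hbm3e_tb_per_s / 1.53
print(f"Implied HBM3 throughput: {hbm3_tb_per_s:.2f} TB/s")  # → 0.77 TB/s
```

The two claims are mutually consistent: 1.18 TB/s divided by 118 movies gives a plausible ~10 GB per Full HD film.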