3 Sources
[1]
SK Hynix to start mass producing HBM3E 12-layer chips this month
TAIPEI (Reuters) - SK Hynix, the world's second-largest memory chip maker, will start mass production of HBM3E 12-layer chips by the end of this month, a senior executive said on Wednesday. Justin Kim, president and head of the company's AI Infra division, made the comment at the Semicon Taiwan industry forum in Taipei. In July, the South Korean company disclosed plans to ship the next versions of HBM chips - the 12-layer HBM3E - starting in the fourth quarter and the HBM4 starting in the second half of 2025. High bandwidth memory (HBM) is a type of dynamic random access memory, or DRAM, standard first produced in 2013 in which chips are vertically stacked to save space and reduce power consumption. They are advanced memory chips capable of handling generative artificial intelligence (AI) work. A key component of graphics processing units (GPUs) for AI, it helps process massive amounts of data produced by complex applications. In May, SK Hynix CEO Kwak Noh-Jung said its HBM chips were sold out for this year and almost sold out for 2025. There are only three main manufacturers of HBM - SK Hynix, Micron and Samsung Electronics. SK Hynix has been the main supplier of HBM chips to Nvidia and supplied HBM3E chips in late March to a customer it declined to identify. (Reporting by Heekyong Yang and Ben Blanchard; Editing by Jacqueline Wong and Christian Schmollinger)
[2]
SK Hynix to Start Mass Producing HBM3E 12-Layer Chips This Month
[3]
SK Hynix to start mass producing HBM3E 12-layer chips this month
SK Hynix, a leading South Korean chipmaker, announces plans to start mass production of advanced HBM3E 12-layer memory chips this month, aiming to meet the growing demand driven by AI applications.
SK Hynix, South Korea's second-largest chipmaker, has announced a significant advancement in memory chip technology. The company is set to begin mass production of its cutting-edge HBM3E 12-layer memory chips this month [1]. This move marks a crucial step in meeting the escalating demand for high-performance computing and artificial intelligence applications.

The HBM3E (High Bandwidth Memory 3rd generation Extended) chips boast impressive specifications. Each chip can process up to 1.18 terabytes of data per second, equivalent to transmitting 118 Full HD movies in just one second [2]. This represents a 53% increase in data processing speed compared to its predecessor, the HBM3. The new chips also offer a 10% reduction in power consumption, addressing the growing need for energy-efficient solutions in data centers and AI applications.
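The headline figures above are internally consistent under a simple assumption about file size. The minimal sketch below works through the arithmetic; the 10 GB-per-movie size and the reading of the 53% figure as a bandwidth ratio are assumptions for illustration, not details from the cited articles.

```python
# Back-of-the-envelope check of the reported HBM3E 12-layer figures.
# Assumptions (not from the cited articles): 1 TB = 1,000 GB and a
# Full HD movie of roughly 10 GB, which is what makes the comparison work.

bandwidth_tb_per_s = 1.18   # reported per-chip bandwidth, TB/s
movie_size_gb = 10          # assumed size of one Full HD movie, GB

movies_per_second = bandwidth_tb_per_s * 1_000 / movie_size_gb
print(f"{movies_per_second:.0f} Full HD movies per second")  # -> 118

# Taking the reported 53% speed-up as a bandwidth ratio, the implied
# HBM3 baseline would be roughly:
hbm3_tb_per_s = bandwidth_tb_per_s / 1.53
print(f"Implied HBM3 bandwidth: ~{hbm3_tb_per_s:.2f} TB/s")  # -> ~0.77
```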
SK Hynix's announcement comes at a critical time in the semiconductor industry. The company is positioning itself to capitalize on the surging demand for AI chips, particularly in the wake of the success of ChatGPT and other AI models [3]. This move is expected to intensify competition with other major players in the memory chip market, including Samsung Electronics and Micron Technology.

Mass production of these advanced chips is slated to begin in September 2024. SK Hynix has stated that it plans to expand HBM3E to over 50% of its total HBM output by the end of the year [1]. This aggressive production schedule underscores the company's commitment to meeting growing market demand and maintaining its competitive edge in the high-performance memory segment.
The introduction of HBM3E 12-layer chips is expected to have a significant impact on the AI and high-performance computing sectors. These chips are crucial components in the AI accelerators and graphics processing units (GPUs) used to train large language models and run other AI applications. The increased speed and efficiency of these chips could accelerate AI development and enable more complex and powerful AI models in the future [2].

As demand for AI-capable hardware continues to grow, SK Hynix's move is likely to spur further innovation in the memory chip industry. Analysts predict that this development could lead to increased competition and potentially faster advancements in memory technology. The success of these new chips could also influence the direction of future research and development in the semiconductor industry, particularly in areas related to AI and high-performance computing [3].