Curated by THEOUTPOST
On Wed, 7 Aug, 8:01 AM UTC
4 Sources
[1]
Samsung's 8-layer HBM3E chips clear Nvidia's tests for use, sources say
A version of Samsung Electronics' fifth-generation high bandwidth memory (HBM) chips, or HBM3E, has passed Nvidia's tests for use in its artificial intelligence (AI) processors, three sources briefed on the results said. The qualification clears a major hurdle for the world's biggest memory chipmaker, which has been struggling to catch up with local rival SK Hynix in the race to supply the advanced memory chips capable of handling generative AI work. Samsung and Nvidia have yet to sign a supply deal for the approved eight-layer HBM3E chips but will do so soon, the sources said, adding that they expect supplies would start by the fourth quarter of 2024. The South Korean technology giant's 12-layer version of HBM3E chips, however, has yet to pass Nvidia's tests, the sources said, declining to be identified as the matter remains confidential. Both Samsung and Nvidia declined to comment. HBM is a type of dynamic random access memory, or DRAM, standard first produced in 2013 in which chips are vertically stacked to save space and reduce power consumption. A key component of graphics processing units (GPUs) for AI, it helps process massive amounts of data produced by complex applications. Samsung has been seeking to pass Nvidia's tests for HBM3E and the preceding fourth-generation HBM3 models since last year but has struggled due to heat and power consumption issues, Reuters reported in May, citing sources. The company has since reworked its HBM3E design to address those issues, according to the sources who were briefed on the matter.
Samsung said after the publication of the Reuters article in May that claims its chips had failed Nvidia's tests due to heat and power consumption problems were untrue. The latest test approval follows Nvidia's recent certification of Samsung's HBM3 chips for use in less sophisticated processors developed for the Chinese market, which Reuters reported last month. Nvidia's approval of Samsung's latest HBM chips comes amid soaring demand for sophisticated GPUs created by the generative AI boom, which Nvidia and other makers of AI chipsets are struggling to meet. HBM3E chips are likely to become the mainstream HBM product in the market this year, with shipments concentrated in the second half, according to research firm TrendForce. SK Hynix, the leading manufacturer, estimates demand for HBM memory chips in general could increase at an annual rate of 82% through 2027. Samsung forecast in July that HBM3E chips would make up 60% of its HBM chip sales by the fourth quarter, a target that many analysts say could be achieved if its latest HBM chips passed Nvidia's final approval by the third quarter. Samsung does not provide revenue breakdowns for specific chip products. Samsung's total DRAM chip revenue was estimated at 22.5 trillion won ($16.4 billion) for the first six months of this year, according to a Reuters survey of 15 analysts, and some said about 10% of that could be from HBM sales. There are only three main manufacturers of HBM: SK Hynix, Micron and Samsung. SK Hynix has been the main supplier of HBM chips to Nvidia and supplied HBM3E chips in late March to a customer it declined to identify; shipments went to Nvidia, sources had said earlier. Micron has also said it will supply Nvidia with HBM3E chips. ($1 = 1,375.6400 won)
[2]
Samsung's 8-layer HBM3E chips clear Nvidia's tests for use, sources say
SINGAPORE/SEOUL Aug 7 - (Reporting by Fanny Potkin in Singapore and Heekyong Yang in Seoul; Editing by Miyoung Kim and Miral Fahmy)
[3]
Samsung's 8-layer HBM3E chips clear Nvidia's tests for use, Reuters reports
[4]
Exclusive-Samsung's 8-layer HBM3E chips clear Nvidia's tests for use, sources say
Samsung Electronics has successfully cleared Nvidia's tests for its 8-layer High Bandwidth Memory 3E (HBM3E) chips. This breakthrough could lead to significant advancements in AI chip technology and strengthen Samsung's position in the memory chip market.
Samsung Electronics has achieved a significant milestone in the development of advanced memory chips for artificial intelligence applications. The South Korean tech giant's eight-layer High Bandwidth Memory 3E (HBM3E) chips have passed Nvidia's stringent tests, according to sources familiar with the matter [1].
This approval is expected to have far-reaching implications for the AI chip industry. HBM3E chips are designed to process vast amounts of data at high speed, which is crucial for AI workloads. With Nvidia's qualification, Samsung is now positioned to supply these advanced chips for use in Nvidia's AI processors, with supplies expected to start by the fourth quarter of 2024 [2].
Samsung's success puts it in more direct competition with SK Hynix, which has been the main supplier of HBM chips to Nvidia. The approval of Samsung's chips is likely to intensify competition in the memory chip market, potentially leading to improved products and pricing [3].
The eight-layer HBM3E chips are Samsung's fifth-generation HBM, succeeding the fourth-generation HBM3; a more advanced 12-layer version of HBM3E has yet to pass Nvidia's tests. Samsung reworked its HBM3E design to address earlier heat and power consumption concerns, though the company disputed reports that its chips had failed Nvidia's tests for those reasons [4].
This development is expected to strengthen Samsung's position in the memory chip market. As demand for AI-capable hardware continues to grow, Samsung's ability to supply these advanced chips to major customers like Nvidia could drive significant revenue growth and market share expansion [1].
The qualification of Samsung's HBM3E chips reflects the ongoing shift toward more powerful and efficient memory solutions for AI and high-performance computing. As companies like Nvidia continue to push the boundaries of AI technology, demand for advanced memory chips is expected to keep growing, potentially reshaping the semiconductor industry landscape [3].
Nvidia has given the green light to use Samsung's HBM3 memory chips in its AI processors designed for the Chinese market. This move comes amidst ongoing US-China tech tensions and could potentially boost Samsung's market position.
10 Sources
NVIDIA is working rapidly to certify Samsung's HBM3E memory chips for its AI GPUs, potentially diversifying its supply chain beyond current major suppliers SK hynix and Micron.
2 Sources
SK Hynix strengthens its position in the AI chip market by advancing HBM4 production and introducing new HBM3E technology, responding to Nvidia's request for faster delivery amid growing competition with Samsung.
12 Sources
SK Hynix, a leading South Korean chipmaker, announces plans to start mass production of advanced HBM3E 12-layer memory chips this month, aiming to meet the growing demand for AI applications.
3 Sources
SK Hynix has started mass production of its cutting-edge 12-layer HBM3E memory modules, offering 36GB capacity per module and speeds up to 9.6 Gbps. This breakthrough is set to revolutionize high-performance computing and AI applications.
9 Sources
The Outpost is a comprehensive collection of curated artificial intelligence software tools that cater to the needs of small business owners, bloggers, artists, musicians, entrepreneurs, marketers, writers, and researchers.
© 2025 TheOutpost.AI All rights reserved