5 Sources
[1]
Samsung Nears Nvidia's Approval for Key HBM4 AI Memory Chips
Samsung trails SK Hynix and Micron Technology Inc. at the forefront of AI memory, but investor hopes are rising that Samsung may be able to join its rivals in supplying components for Nvidia's upcoming flagship Rubin processors.

Samsung Electronics Co. is getting close to securing certification from Nvidia Corp. for the latest version of its AI memory chip, called HBM4, making progress in narrowing the gap with rival SK Hynix Inc. The Suwon, South Korea-based company has entered the final qualification phase with Nvidia after supplying its initial samples to the US chipmaker in September, according to people familiar with the matter. Nvidia uses vast quantities of high-bandwidth memory, or HBM, to enable its AI accelerators.

Samsung is preparing for mass production of HBM4 in February, said the people, who asked not to be named as the information is private. The company will be ready to ship soon, though the exact timing is not yet clear. Its shares gained as much as 3.2% in Seoul on Monday before paring gains, while SK Hynix's stock slid by about the same amount. A Samsung representative declined to comment.

Samsung trails SK Hynix and Micron at the forefront of AI memory, but all three companies have seen their shares surge dramatically in recent weeks as the AI rush has produced a shortage of memory for the broader electronics industry. Between them, the three leading memory chip manufacturers have gained roughly $900 billion in market value since the start of September. Investor hopes are now rising that Samsung may be able to join its rivals in supplying components for Nvidia's upcoming flagship Rubin processors. So far, Nvidia has leaned most heavily on SK Hynix for the most sophisticated memory chips it pairs with its top-of-the-line AI accelerators. The Korea Economic Daily reported earlier that Samsung is slated to begin HBM4 shipments to Nvidia and Advanced Micro Devices Inc. next month.
Both Samsung and SK Hynix are slated to hold their earnings calls on Thursday, and they are expected to discuss progress on their HBM4 chips.
[2]
Samsung should be first with HBM4 powering NVIDIA's new Vera Rubin AI chips, passed all tests
TL;DR: Samsung will unveil its next-generation HBM4 memory at NVIDIA GTC 2026, featuring speeds up to 11Gbps, surpassing JEDEC standards. Integrated into NVIDIA's Vera Rubin AI platform, Samsung's advanced HBM4 offers high bandwidth and energy efficiency, accelerating AI development and strengthening semiconductor innovation globally.

Samsung will debut its next-gen HBM4 memory at NVIDIA GTC 2026 in March, reportedly having passed all of NVIDIA's strict verification stages, and it will arrive on NVIDIA's next-gen Vera Rubin AI platform. Samsung has spent the last couple of years struggling with its HBM memory division, leaving its South Korean rival, SK hynix, to supply NVIDIA with all of its HBM3 and HBM3E needs. Samsung completely overhauled its HBM and semiconductor division over the last few years, and the fruits of that labor are now showing.

NVIDIA will reportedly source its first allotments of HBM4 memory for Vera Rubin from Samsung, as Samsung's new HBM4 memory bests the HBM4 offerings from its rivals SK hynix and US-based Micron. Samsung's new HBM4 memory is rated above 11Gbps, much higher than the JEDEC standard for HBM4, and those higher pin speeds were requested directly by NVIDIA. Samsung and NVIDIA are working together on HBM4, with Samsung explaining in its press release that, with incredibly high bandwidth and energy efficiency, its advanced HBM solutions are expected to help accelerate the development of future AI applications and form a critical foundation for manufacturing infrastructure driven by these technologies. The company is using its 6th-generation 10nm-class DRAM and a 4nm logic base die, with Samsung's upcoming HBM4 reaching processing speeds of up to 11Gbps, far exceeding the JEDEC standard for HBM4 of 8Gbps.
Samsung will also continue to deliver next-generation memory solutions, including HBM, GDDR, and SOCAMM memory, as well as foundry services, driving innovation and scalability across the global AI value chain.
[3]
Nvidia Supply Deal In Crosshairs, Samsung Set To Start HBM4 Chip Production Next Month As Korean Giant Takes On SK Hynix: Report - Advanced Micro Devices (NASDAQ:AMD), NVIDIA (NASDAQ:NVDA)
Samsung Electronics Co. (OTC:SSNLF) is reportedly preparing to begin production of its next-generation high-bandwidth memory chips as it seeks to supply Nvidia Corp (NASDAQ:NVDA) and narrow the gap with SK Hynix.

Samsung Pushes Into Next-Gen AI Memory

Samsung plans to start manufacturing HBM4 as early as next month and is expected to supply it to Nvidia, Reuters reported, citing a person familiar with the matter. The move marks a key step in Samsung's efforts to regain momentum in the high-bandwidth memory market after production delays weighed on its earnings and share price last year. HBM chips are essential components for advanced AI accelerators, an area where demand has surged alongside the rapid expansion of generative AI. While the source declined to disclose shipment volumes or contract specifics, South Korea's Korea Economic Daily reported that Samsung has passed HBM4 qualification tests for both Nvidia and Advanced Micro Devices, Inc. (NASDAQ:AMD). The company is preparing to begin shipments to Nvidia next month, the report said.

SK Hynix Defends Its Lead In HBM

Samsung's main rival, SK Hynix, currently dominates the HBM market and has been the primary supplier of advanced memory chips for Nvidia's AI processors. The company said in October that it had completed supply negotiations with major customers for next year. SK Hynix is also expanding production capacity. An executive told the publication earlier this month that the company will begin deploying silicon wafers into its new M15X fabrication plant in Cheongju, South Korea, next month to produce HBM chips. However, it remains unclear whether HBM4 will be included in the initial output.

Nvidia's Next-Gen AI Chips Raise Stakes

The timing is critical as Nvidia prepares to launch its next-generation AI platform, Vera Rubin, later this year. Nvidia CEO Jensen Huang said earlier this month that the platform is already in full production and will be paired with HBM4 memory.
Samsung, SK Hynix Earnings In Focus Amid Memory Chip Price Hikes

Both Samsung and SK Hynix are scheduled to report fourth-quarter earnings on Thursday. Samsung has raised prices on its major memory chips by as much as 60% since September 2025. The company also acted swiftly this week after rumors circulated of an unprecedented 80% price increase across its entire memory product lineup. Taiwan's United Daily News reported that Samsung and several memory module manufacturing partners have stated that the 80% figure is completely false. Nvidia maintains a stronger price trend over the short, medium and long terms with a poor value ranking, according to Benzinga's Edge Stock Rankings.
[4]
Samsung Set to Be Among the First to Feature HBM4 in NVIDIA's Vera Rubin AI Lineup, Having Reportedly Passed All Verification Stages
Samsung's HBM4 modules are expected to be featured in NVIDIA's Vera Rubin AI lineup as soon as June, as the company sees a massive breakthrough with its memory business.

With Industry's Fastest Pin Speeds, Samsung's HBM4 Modules Have Taken a Lead Over Counterparts

HBM4 is known as a 'revolutionary' offering from memory giants, thanks to the innovations it brings to the module. Samsung has managed to 'turn the tables' with its HBM business: the company was struggling to acquire customers a few quarters ago and even faced rejection from NVIDIA. However, according to recent reports from Korean media, Samsung's HBM4 modules are now first in line for adoption by NVIDIA for its Vera Rubin AI lineup, with supply expected as soon as next month.

There are several reasons why Samsung's HBM4 process stands out from the rest, but one of the biggest differentiators is the Korean giant's offer of the highest pin speeds. Samsung's HBM4 is rated at 11 Gbps+, much higher than the JEDEC standard, mainly because it was a key requirement from NVIDIA. With agentic AI being the next big avenue, Vera Rubin has seen a massive upgrade in memory specifications, driven mainly by the integration of Samsung's HBM4 modules, which feature superior speeds and interface width.

Another interesting point with Samsung's HBM4 is that the firm employs a 4nm logic base die sourced from its internal foundry, which gives it room to guarantee NVIDIA supply with adequate delivery timings, relative to SK hynix and Micron, which plan to source their logic dies from TSMC. Considering how quickly NVIDIA has brought Vera Rubin into "full production", it is important for suppliers to keep up the pace, and Samsung has apparently done just that.

It is disclosed that customer shipments around Vera Rubin start from August, and that Rubin AI chips will be displayed in full at GTC 2026, where Samsung's HBM4 module will also see the spotlight.
[5]
Samsung to start HBM4 deliveries to Nvidia next month, reports show By Investing.com
Investing.com-- Samsung Electronics (KS:005930) is set to begin shipping its next-generation HBM4 high-bandwidth memory chips to Nvidia (NASDAQ:NVDA) in February, ahead of several rivals, South Korean media reported on Monday, citing industry sources. The Korea Economic Daily said Samsung has passed qualification tests for its sixth-generation HBM4 with Nvidia and Advanced Micro Devices and will start shipments next month, marking an early foray into the supply of advanced AI memory. Samsung, long trying to catch up with market leader SK Hynix (KS:000660) in high-performance memory, plans production of HBM4 from next month, reports showed. HBM4 chips are critical components for high-performance artificial intelligence accelerators, offering significant gains in data bandwidth and efficiency. Seoul-listed Samsung shares traded flat at the time of writing, after rising as much as 2.8% to 156,400 won earlier in the day, just below record highs of 157,000 won.
Samsung has entered the final qualification phase with Nvidia for its next-generation HBM4 AI memory chips, preparing for mass production in February. The Korean tech giant's HBM4 offers speeds up to 11Gbps, exceeding industry standards, and is set to power Nvidia's upcoming Vera Rubin AI platform, marking a significant comeback after trailing rivals SK Hynix and Micron.
Samsung has made substantial progress in securing certification from Nvidia for its next-generation HBM4 memory chips, entering the final qualification phase after supplying initial samples in September [1]. The Suwon-based company is preparing for mass production of HBM4 in February, positioning itself to join rivals SK Hynix and Micron in supplying components for advanced AI accelerators [1]. Samsung's shares gained as much as 3.2% in Seoul following the news, while SK Hynix's stock declined by a similar margin [1].
Source: Benzinga
The development represents a critical turning point for Samsung, which has struggled in the high-bandwidth memory market over recent years. According to multiple reports, Samsung has passed all verification stages for both Nvidia and Advanced Micro Devices, with shipments expected to begin as early as next month [3][5]. The Korea Economic Daily reported that Samsung has successfully completed qualification tests, marking an early foray into supplying HBM4 AI memory chips for next-generation platforms [5].

Samsung's HBM4 modules deliver pin speeds up to 11Gbps, significantly surpassing the JEDEC standard of 8Gbps [2][4]. These enhanced speeds were specifically requested by Nvidia to meet the demanding requirements of its Vera Rubin platform, which CEO Jensen Huang confirmed is already in full production [3]. The company utilizes 6th-generation 10nm-class DRAM technology paired with a 4nm logic base die, providing superior bandwidth and energy efficiency for AI applications [2].
Source: TweakTown
A key competitive advantage lies in Samsung's internal foundry sourcing for the logic base die, enabling the company to guarantee adequate delivery timings compared to SK Hynix and Micron, which plan to source their logic dies from TSMC [4]. Samsung plans to unveil its HBM4 memory at GTC 2026 in March, where the Vera Rubin AI lineup will be displayed [2]. Customer shipments around Vera Rubin are expected to start from August, with Samsung's modules potentially featured as soon as June [4].
Samsung trails SK Hynix and Micron at the forefront of AI memory, but investor hopes are rising that the company may join its rivals in supplying components for Nvidia's flagship processors [1]. The three leading memory chip manufacturers have gained roughly $900 billion in market value since early September, driven by an AI-fueled shortage of memory across the broader electronics industry [1]. SK Hynix currently dominates the high-bandwidth memory market and has been Nvidia's primary supplier for HBM3 and HBM3E chips, with the company completing supply negotiations with major customers and expanding production capacity at its M15X fabrication plant in Cheongju [3].
Source: Wccftech
Samsung has raised prices on its major memory chips by as much as 60% since September 2025, though the company quickly refuted rumors of an 80% price increase across its entire memory product lineup [3]. Both Samsung and SK Hynix are scheduled to report fourth-quarter earnings, where they are expected to discuss progress on their HBM4 chips and provide insights into production timelines [1]. The timing proves critical as demand for HBM chips continues to surge alongside the rapid expansion of generative AI and agentic AI applications, with Nvidia using vast quantities of high-bandwidth memory to enable its AI accelerators [1][4].

Summarized by Navi