5 Sources
[1]
Samsung readying mass production of next-gen HBM4 memory in 2026, 24Gb GDDR7 dies, 128GB+ DDR5
TL;DR: Samsung will begin mass production of next-gen 1c DRAM-based HBM4 memory, 24Gb GDDR7 dies, and 128GB+ DDR5 products in 2026, targeting AI and server markets. The company reported record Q3 2025 memory sales driven by strong AI demand and plans to expand 2nm GAA chip production and fab capacity.

Samsung has just announced it will begin mass production of its new 1c DRAM-based next-gen HBM4 memory, as well as new 24Gb GDDR7 memory dies and 128GB+ DDR5 products, in 2026. The company announced its recent Q3 2025 earnings, with a 15.4% increase in revenue over Q2 2025 and a new all-time high in quarterly sales for its Memory business, thanks to strong demand for HBM3E memory and server SSDs amid the continuing wave of AI demand.

Samsung recently showed off its new HBM4 memory, offering up to 11Gbps of bandwidth per pin, and it should be featured in the next generation of AI hardware from NVIDIA and AMD: the upcoming Rubin and Instinct MI400 series AI GPUs. Samsung is aiming for a stable supply of its next-gen 2nm GAA (Gate-All-Around) process, and the new HBM4 base die, in 2026. Samsung should be testing its run of 2nm GAA chips through its semiconductor foundry ahead of fabbing next-gen Exynos SoCs and Qualcomm's next-gen Snapdragon processors, with 2nm GAA ramping production this quarter.

Samsung explains in the press release: "In Q4 2025, the Business will actively respond to demand from AI and conventional servers with HBM3E, high-density eSSDs and other leading-edge memory offerings. Additionally, it will continue to expand sales of industry-leading, high-value-added server memory products, such as 128GB and higher DDR5, as well as 24Gb GDDR7".

"Going forward in 2026, the Memory Business will focus on the mass production of HBM4 products with differentiated performance, while simultaneously aiming to scale out the HBM sales base. In particular, demand for HBM4 is also projected to increase, and the Company plans to proactively respond with capacity expansion in 1c. It will also concentrate on expanding sales of other high-value-added products, such as DDR5, LPDDR5x and high-density QLC SSDs to meet demand for AI applications".

"In Q4 2025, the Business will aim for continued earnings improvement by ramping up mass production of 2nm Gate-All-Around (GAA) products, increasing fab utilization, and optimizing costs. In 2026, the Foundry Business will focus on providing a stable supply of new 2nm GAA products and the HBM4 base-die, and beginning operations at the Company's fab in Taylor, Texas in a timely manner".
[2]
Samsung Electronics says it is in talks with Nvidia to supply next-generation HBM4 chips
Samsung, which plans to market the new chip next year, did not specify when it aims to ship the latest version of its HBM chip, a key building block of artificial intelligence chipsets.

Samsung Electronics said on Friday it is in "close discussion" to supply its next-generation high-bandwidth memory (HBM) chips, or HBM4, to Nvidia, as the South Korean chipmaker scrambles to catch up with rivals in the AI chip race.

Local rival SK Hynix, Nvidia's top HBM chip supplier, on Wednesday said it aims to start shipping its latest HBM4 chips in the fourth quarter and expand sales next year. Nvidia, in a statement announcing cooperation with Samsung and other Korean companies, said it is in "key supply collaboration for HBM3E and HBM4", without elaborating.

Samsung has been slower to capitalise on the AI-driven memory chip boom, leading to weaker earnings performance and a reshuffle of its chip division last year. Its earnings recovered this quarter, driven by conventional memory chip demand. This week it said it sells its current-generation HBM3E chips to "all related customers", indicating it has joined rivals in supplying the latest 12-layer HBM3E chips to Nvidia.

The launch of HBM4 chips will be a major test of Samsung's ability to regain its edge in the market, analysts said. HBM - a type of dynamic random access memory (DRAM) standard first produced in 2013 - involves stacking chips vertically to save space and reduce power consumption, helping to process the large volume of data generated by complex AI applications.
[3]
Samsung Strikes a Crucial Deal with NVIDIA For Next-Gen HBM4 AI Memory, Validating Its Process as the Industry's Fastest With 11 Gbps Speeds
Samsung has secured a crucial HBM4 supply deal with NVIDIA, as both firms announce their collaboration on cutting-edge technologies to drive the AI wave. The Korean giant has achieved a massive feat with its upcoming HBM4 technology, becoming one of the first manufacturers to secure an HBM4 supply agreement with NVIDIA. In the announcement around the recent Samsung-NVIDIA deal, it is revealed that HBM4 will be a crucial part of the partnership; more importantly, Samsung has confirmed that it is working with NVIDIA on HBM4, which means the firm has managed to achieve an early lead with the process, a considerable achievement in its competition with other DRAM manufacturers.

In addition to their ongoing collaborations, Samsung and NVIDIA are also working together on HBM4. With incredibly high bandwidth and energy efficiency, Samsung's advanced HBM solutions are expected to help accelerate the development of future AI applications and form a critical foundation for manufacturing infrastructure driven by these technologies. Built with the company's 6th-generation 10-nanometer (nm)-class DRAM and a 4nm logic base die, Samsung HBM4's processing speeds can reach 11 gigabits-per-second (Gbps), far exceeding the JEDEC standard of 8Gbps.

One of the bigger reasons Samsung's HBM4 memory has taken a spot in NVIDIA's supply chain is that the process is reported to be the fastest out there, reaching speeds of up to 11 Gbps, far higher than what Micron or SK hynix currently offers. The Rubin AI lineup is expected to be a massive release for NVIDIA, and judging by the competition the firm faces from AMD's Instinct MI450 series, securing high-end HBM4 solutions is a primary objective, which is why Samsung has secured a key spot in NVIDIA's HBM supply chain.

The announcement could prove to be a significant turning point for Samsung's HBM business, which has been struggling for several quarters, primarily due to the firm's slow progress with HBM3. However, in recent weeks things have turned decisively in Samsung's favor, and now that the firm has an early lead with HBM4, the Korean giant is right back in business. Competition within the HBM segment is expected to intensify, as Samsung's recent achievement will likely prompt SK hynix and Micron to restructure their HBM4 solutions.
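The 11 Gbps figure is a per-pin transfer rate; the per-stack bandwidth it implies can be sketched with quick arithmetic. A minimal sketch, assuming the 2048-bit per-stack interface defined for HBM4 (an assumption from the JEDEC spec, not a figure stated in the article):

```python
def hbm_stack_bandwidth_gbs(pin_speed_gbps: float, bus_width_bits: int = 2048) -> float:
    """Peak bandwidth of one HBM stack in GB/s: pin rate x bus width, bits -> bytes."""
    return pin_speed_gbps * bus_width_bits / 8

# JEDEC baseline speed vs. Samsung's reported speed, per stack
print(hbm_stack_bandwidth_gbs(8.0))   # 2048.0 GB/s (~2 TB/s)
print(hbm_stack_bandwidth_gbs(11.0))  # 2816.0 GB/s (~2.8 TB/s)
```

On these assumptions, the jump from 8 to 11 Gbps per pin is roughly an extra 0.8 TB/s of peak bandwidth per stack.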
[4]
Samsung Preps For Mass Production On Next-Gen HBM4 Memory in 2026: 24Gb GDDR7, And 128GB+ DDR5 Products In The Plans Too
Samsung is also set to begin production of next-gen HBM4 memory, 24 Gb GDDR7 DRAM, and 128 GB+ DDR5 products in 2026.

Samsung has announced its Q3 2025 earnings report, highlighting a 15.4% increase in revenue versus the previous quarter. The South Korean technology company posted revenue of KRW 86.1 trillion and set an all-time high in quarterly sales for its Memory business, mainly driven by strong demand for its HBM3E memory and server SSDs, thanks to heightened AI momentum.

Just recently, Samsung showcased its next-gen HBM4 memory solution for the first time. Offering speeds of up to 11 Gbps per pin, it is a potential fit for upcoming AI accelerators from NVIDIA and AMD, such as the Rubin and MI400 series. Samsung has likely sent out samples of its HBM solutions to AI chip makers for further evaluation and qualification testing. In addition, Samsung is also looking to provide a stable supply of its 2nm GAA (Gate-All-Around) process and the HBM4 base die in 2026. The 2nm process from Samsung is likely to be used for the production of next-gen Exynos and Qualcomm Snapdragon SoCs, and will enter its production ramp this quarter.

"In Q4 2025, the Business will actively respond to demand from AI and conventional servers with HBM3E, high-density eSSDs and other leading-edge memory offerings. Additionally, it will continue to expand sales of industry-leading, high-value-added server memory products, such as 128GB and higher DDR5, as well as 24Gb GDDR7. Going forward in 2026, the Memory Business will focus on the mass production of HBM4 products with differentiated performance, while simultaneously aiming to scale out the HBM sales base. In particular, demand for HBM4 is also projected to increase, and the Company plans to proactively respond with capacity expansion in 1c. It will also concentrate on expanding sales of other high-value-added products, such as DDR5, LPDDR5x and high-density QLC SSDs to meet demand for AI applications.

In Q4 2025, the Business will aim for continued earnings improvement by ramping up mass production of 2nm Gate-All-Around (GAA) products, increasing fab utilization, and optimizing costs. In 2026, the Foundry Business will focus on providing a stable supply of new 2nm GAA products and the HBM4 base-die, and beginning operations at the Company's fab in Taylor, Texas in a timely manner."

via Samsung Newsroom

As for other products, Samsung also highlighted how 128GB+ DDR5 memory and 24Gb GDDR7 DRAM will play a crucial role for the company in 2026. Future server platforms from AMD and Intel are set to launch around the second half of 2026, so a lot of action is expected. Meanwhile, GDDR7 will retain its popularity among high-end consumer and AI graphics cards. NVIDIA's recently announced Rubin CPX GPU is a potential candidate for this memory, while other AI and gaming products, such as the NVIDIA RTX 50 "SUPER" series and a possible AMD Radeon "RDNA 5" or "RDNA 4" refresh, are expected to utilize GDDR7. The 24Gb DRAM dies will enable more VRAM capacity and also fill in the gaps in the entry-to-mainstream segments.

The core issue for the DRAM and SSD market right now is that supply is being focused on the AI market, causing prices for consumer-aimed products to skyrocket. DDR5 memory and SSD prices have seen major price hikes and shortages recently, so that is something to watch. All major DRAM makers have already announced increased prices for DDR5/DDR4 memory, so we will see how the market evolves in the coming months.
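The capacity gain from 24Gb dies comes down to simple arithmetic: each GDDR7 device on the memory bus holds 3GB instead of 2GB. An illustrative sketch, where the 256-bit bus and 32-bit device width are assumptions chosen for the example, not figures from the article:

```python
def vram_capacity_gb(die_density_gbit: int, bus_width_bits: int = 256,
                     device_width_bits: int = 32) -> float:
    """Total VRAM in GB with one GDDR7 die per device position on the bus."""
    num_devices = bus_width_bits // device_width_bits  # e.g. 256 / 32 = 8 devices
    return num_devices * die_density_gbit / 8          # gigabits -> gigabytes

print(vram_capacity_gb(16))  # current 16Gb dies on a 256-bit bus -> 16.0 GB
print(vram_capacity_gb(24))  # 24Gb dies on the same bus          -> 24.0 GB
```

The same bus therefore gains 50% more capacity without clamshell mounting or a wider memory controller, which is why 24Gb dies matter for mid-range cards with narrow buses.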
[5]
Samsung Electronics in talks with Nvidia to supply next-generation HBM4 chips
SEOUL -- Samsung Electronics said on Friday it is in "close discussion" to supply its next-generation high-bandwidth memory (HBM) chips, or HBM4, to Nvidia, as the South Korean chipmaker scrambles to catch up with rivals in the AI chip race. Samsung, which plans to market the new chip next year, did not specify when it aims to ship the latest version of its HBM chip, a key building block of artificial intelligence chipsets.

Local rival SK Hynix, Nvidia's top HBM chip supplier, on Wednesday said it aims to start shipping its latest HBM4 chips in the fourth quarter and expand sales next year. Nvidia, in a statement announcing cooperation with Samsung, said it is in "key supply collaboration for HBM3E and HBM4," without elaborating.

In a separate deal, Samsung said it will purchase 50,000 high-end Nvidia chips to build an AI-enhanced semiconductor factory aimed at improving chip manufacturing speed and yields. Samsung's share price rose as much as 4.32 per cent after the announcements.

Chairman Jay Y. Lee and Nvidia CEO Jensen Huang met over fried chicken and beer on Thursday during Huang's visit to Korea to attend the Asia-Pacific Economic Cooperation CEO Summit. Lee said Nvidia is a key customer and strategic partner and highlighted more than two decades of collaboration.

Jeff Kim, head of research at KB Securities, said HBM4 likely needs further testing but Samsung is widely seen to be in a favorable position given its production capacity. "If Samsung supplies HBM4 chips to Nvidia, it could secure a significant market share that it was unable to achieve with previous HBM series products," Kim said.

Samsung has been slower to capitalize on the AI-driven memory chip boom, leading to weaker earnings performance and a reshuffle of its chip division last year. Its earnings recovered in the latest quarter, driven by conventional memory chip demand. This week it said it sells its current-generation HBM3E chips to "all related customers," indicating it has joined rivals in supplying the latest 12-layer HBM3E chips to Nvidia.

The launch of HBM4 chips will be a major test of Samsung's ability to regain its edge in the market, analysts said. HBM - a type of dynamic random access memory (DRAM) standard first produced in 2013 - involves stacking chips vertically to save space and reduce power consumption, helping to process the large volume of data generated by complex AI applications.

Investors are watching whether Samsung's HBM4 can cut SK Hynix's lead in advanced memory chips. The chipmaker, which is also a leading smartphone maker, said in July it had provided HBM4 samples to customers, with plans to begin supply next year. Samsung's share price has risen nearly 60 per cent since July as investors expect the chipmaker to benefit from the current uptrend in memory prices and advance in the AI race.
Samsung announces partnership with NVIDIA for HBM4 memory supply and plans mass production in 2026. The Korean chipmaker aims to regain market position with 11 Gbps HBM4 technology and expanded AI memory solutions.

Samsung Electronics has secured a pivotal partnership with NVIDIA for the supply of next-generation HBM4 (High Bandwidth Memory) chips, marking a significant milestone in the South Korean chipmaker's efforts to regain its competitive edge in the AI memory market [2]. The announcement comes as Samsung scrambles to catch up with rivals, particularly SK Hynix, which currently dominates NVIDIA's HBM supply chain.

The partnership was formalized during NVIDIA CEO Jensen Huang's visit to Korea for the Asia-Pacific Economic Cooperation CEO Summit, where he met with Samsung Chairman Jay Y. Lee over dinner [5]. NVIDIA confirmed the collaboration in a statement, noting "key supply collaboration for HBM3E and HBM4" without providing additional details [2].

Samsung's HBM4 memory represents a significant technological advancement, featuring processing speeds of up to 11 gigabits-per-second (Gbps), substantially exceeding the JEDEC standard of 8Gbps [3]. Built with the company's 6th-generation 10-nanometer-class DRAM and a 4nm logic base die, this technology positions Samsung as offering the fastest HBM4 solution currently available in the market [3].

The superior performance characteristics make Samsung's HBM4 particularly attractive for NVIDIA's upcoming Rubin AI lineup and AMD's Instinct MI400 series AI GPUs [1]. This technological edge has enabled Samsung to secure an early position in NVIDIA's supply chain, marking a considerable achievement in the competitive DRAM manufacturing landscape [3].

Samsung has outlined an ambitious production schedule, planning to begin mass production of HBM4 memory in 2026 alongside other advanced memory solutions, including 24Gb GDDR7 dies and 128GB+ DDR5 products [1]. The company aims to provide a stable supply of its next-generation 2nm Gate-All-Around (GAA) production technology and HBM4 base die by 2026 [4].

The strategic focus extends beyond HBM4, with Samsung planning to expand sales of high-value-added products including DDR5, LPDDR5x, and high-density QLC SSDs to meet growing AI application demands [1]. Additionally, the company will begin operations at its new fabrication facility in Taylor, Texas, as part of its capacity expansion strategy [4].

Samsung's memory business has shown a remarkable recovery, posting record Q3 2025 quarterly sales with a 15.4% revenue increase compared to the previous quarter [1]. The strong performance was driven by robust demand for HBM3E memory and server SSDs, reflecting the continuing wave of AI-driven market demand [4].

Following the NVIDIA partnership announcement, Samsung's share price surged as much as 4.32% and has risen nearly 60% since July, as investors anticipate the company's enhanced position in the AI memory market [5]. Analysts view the HBM4 launch as a crucial test of Samsung's ability to regain market leadership and potentially secure significant market share that it was unable to achieve with previous HBM series products.

Summarized by Navi