Samsung Develops Custom HBM4 Solutions for Meta and Microsoft, Aiming to Compete with TSMC and SK hynix

Curated by THEOUTPOST

On Fri, 15 Nov, 8:01 AM UTC

2 Sources

Samsung has begun developing customized HBM4 memory solutions for tech giants Meta and Microsoft, a move that could mark a significant breakthrough for the company's HBM business and intensify competition in the AI chip market.

Samsung's Strategic Move in HBM4 Development

Samsung has reportedly begun developing "custom HBM4" memory solutions tailored specifically for tech giants Meta and Microsoft. The move signals a potential breakthrough for Samsung's High Bandwidth Memory (HBM) business, which has struggled in recent times [1].

Technological Advancements and Specifications

The upcoming HBM4 memory from Samsung is expected to deliver significant improvements over its predecessor, HBM3E. Key enhancements include:

  • A 66% increase in memory bandwidth, reaching 2 TB/s
  • A 33% capacity boost, up to 48 GB per stack
  • Stacks of up to 16 DRAM dies [1]
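The percentage gains above imply a specific HBM3E baseline. As a quick sanity check (a sketch; the HBM3E baseline figures are inferred here, not stated in the article):

```python
# Back-compute the HBM3E baseline implied by the stated HBM4 gains.
# Article figures: 2 TB/s bandwidth (+66%) and 48 GB capacity (+33%).

hbm4_bandwidth_tbps = 2.0   # stated HBM4 bandwidth per stack
hbm4_capacity_gb = 48       # stated HBM4 capacity per stack

# If new = old * (1 + gain), then old = new / (1 + gain).
implied_hbm3e_bandwidth = hbm4_bandwidth_tbps / 1.66
implied_hbm3e_capacity = hbm4_capacity_gb / 1.33

print(f"Implied HBM3E bandwidth: {implied_hbm3e_bandwidth:.2f} TB/s")  # ~1.20 TB/s
print(f"Implied HBM3E capacity:  {implied_hbm3e_capacity:.1f} GB")     # ~36.1 GB
```

The implied values line up with commonly cited HBM3E figures (roughly 1.2 TB/s per stack and 36 GB for a 12-high stack), suggesting the article's percentages are measured against a 12-high HBM3E baseline.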

Samsung plans to utilize the 4nm process from its own foundry division and to employ 10nm-class sixth-generation (1c) DRAM, considered one of the highest-end offerings on the market [2].

Potential Applications and Market Impact

The customized HBM4 solutions are rumored to be integrated into:

  • Microsoft's "Maia 100" AI chip
  • Meta's Artemis AI processor [2]

This development could prove crucial for Samsung, as major tech companies increasingly shift toward custom-built AI chips for their data centers and supercomputers, a move intended to reduce these tech giants' reliance on purchased chips and cut costs [1].

Competitive Landscape

Samsung's initiative comes in response to fierce competition in the HBM market:

  • TSMC and SK hynix have partnered on HBM4 memory development, particularly for NVIDIA's next-generation Rubin R100 AI GPU
  • Samsung's recent struggles to secure NVIDIA as a customer have weighed on earnings in its HBM segment [2]

Timeline and Future Prospects

Samsung is expected to complete HBM4 development by the end of 2025, with mass production commencing shortly after. While the company unveiled initial HBM4 specifications at the ISSCC 2024 conference in February, detailed specs have yet to be released [1].

This strategic move could revitalize Samsung's position in the HBM market. However, the company will need to overcome its recent setbacks and fend off strong competition from established players like SK hynix to secure a significant share of the rapidly evolving AI chip industry [2].

Continue Reading
SK hynix Leads the Charge in Next-Gen AI Memory with World's First 12-Layer HBM4 Samples

SK hynix has begun sampling its groundbreaking 12-layer HBM4 memory, offering unprecedented capacity and bandwidth for AI acceleration. This development marks a significant leap in memory technology for AI applications.

5 Sources: TechSpot, TweakTown, Wccftech, The Korea Times

SK Hynix Accelerates HBM4 Development to Meet Nvidia's Demand, Unveils 16-Layer HBM3E

SK Hynix strengthens its position in the AI chip market by advancing HBM4 production and introducing new HBM3E technology, responding to Nvidia's request for faster delivery amid growing competition with Samsung.

12 Sources: DIGITIMES, theregister.com, Fortune, CCN.com

Next-Gen HBM Memory Race Heats Up: SK Hynix and Micron Prepare for HBM3E and HBM4 Production

SK Hynix and Micron are gearing up for the production of next-generation High Bandwidth Memory (HBM) technologies, with SK Hynix focusing on HBM3E for 2025 and Micron targeting HBM4 for 2026, driven by increasing demand in AI GPU components.

3 Sources: TweakTown, Wccftech

Samsung's 8-Layer HBM3E Chips Pass Nvidia's Tests, Paving Way for AI Advancements

Samsung Electronics has successfully cleared Nvidia's tests for its 8-layer High Bandwidth Memory 3E (HBM3E) chips. This breakthrough could lead to significant advancements in AI chip technology and strengthen Samsung's position in the memory chip market.

4 Sources: Economic Times, Market Screener, CNBC, ThePrint

NVIDIA Accelerates Efforts to Certify Samsung's HBM3E Memory for AI GPUs

NVIDIA is working rapidly to certify Samsung's HBM3E memory chips for its AI GPUs, potentially diversifying its supply chain beyond current major suppliers SK hynix and Micron.

2 Sources: TweakTown, Wccftech

© 2025 TheOutpost.AI All rights reserved