Samsung and SK hynix Unveil Next-Gen HBM4 Memory, Intensifying AI Chip Competition


Samsung and SK hynix showcase their latest HBM4 memory chips at the 2025 Semiconductor Exhibition (SEDEX) in Seoul, signaling intensifying competition in the AI chip market. The event highlights the growing importance of high-bandwidth memory in AI acceleration.


South Korean Tech Giants Unveil Next-Generation Memory for AI

Samsung Electronics and SK hynix, two of South Korea's leading semiconductor manufacturers, have showcased their latest high-bandwidth memory (HBM) chips at the 2025 Semiconductor Exhibition (SEDEX) in Seoul [2]. This event marks a significant milestone in the race for dominance in the artificial intelligence (AI) chip market, with both companies displaying their HBM4 modules to the public for the first time [1].

The Importance of HBM in AI Acceleration

High-bandwidth memory is a crucial component of the graphics processing units (GPUs) that power generative AI systems. As demand for more sophisticated AI applications grows, the need for faster and more efficient memory solutions becomes paramount. The current market is dominated by fifth-generation HBM3E chips, but industry experts anticipate that HBM4 will become a major factor in the coming year [2].

Samsung's Comeback Strategy

Samsung, once the dominant player in the memory market, has faced challenges in recent years, particularly in the HBM segment. However, the company is making a strong comeback with its HBM4 technology. At SEDEX, Samsung demonstrated its readiness for mass production, reporting an impressive 90% yield for its HBM4 logic die [1].

To ensure early adoption of its HBM4 modules, Samsung is implementing several strategies:

  1. Maintaining competitive pricing
  2. Offering higher production capacities
  3. Delivering faster pin speeds, reportedly around 11 Gbps
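To put the reported pin speed in context, a rough back-of-the-envelope calculation is possible: peak per-stack bandwidth is pin speed times interface width. The 2048-bit per-stack interface assumed below comes from the JEDEC HBM4 specification, not from this article, so treat the result as an illustrative estimate only.

```python
def hbm_bandwidth_tbps(pin_speed_gbps: float, bus_width_bits: int = 2048) -> float:
    """Peak per-stack bandwidth in TB/s.

    Assumes the JEDEC HBM4 interface width of 2048 bits per stack;
    divide by 8 to convert bits to bytes, by 1000 to convert GB/s to TB/s.
    """
    return pin_speed_gbps * bus_width_bits / 8 / 1000

# Samsung's reported ~11 Gbps pin speed:
print(f"{hbm_bandwidth_tbps(11):.2f} TB/s per stack")  # ~2.82 TB/s
```

At 11 Gbps this works out to roughly 2.8 TB/s per stack, versus about 1.2 TB/s for an HBM3E stack at 9.6 Gbps on a 1024-bit interface.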

SK hynix's Market Leadership

Currently, SK hynix leads the HBM market with a 62% share of shipments, followed by Micron Technology and Samsung [2]. The company has already completed development of its HBM4 chips and is preparing for mass production. SK hynix's strong position is partly due to its involvement in the tripartite supply chain with Nvidia and TSMC for the current HBM3E chips.

The Race for Nvidia's Next-Gen AI Accelerator

Both Samsung and SK hynix are vying for a position in Nvidia's supply chain for its upcoming AI accelerator, codenamed Rubin. This next-generation GPU is expected to use HBM4 technology, making it a crucial battleground for memory manufacturers [2]. While SK hynix is reportedly in talks with Nvidia for large-scale supply, Samsung has yet to receive approval but remains optimistic given its recent technological advancements [1].

Implications for the DRAM Market

The intensifying competition in the HBM4 segment is expected to have significant implications for the broader DRAM market. With Samsung's rapid advancements and the unprecedented demand for high-performance memory, the industry is poised for a more competitive landscape in the coming years [1].
