Nvidia Collaborates with Major Memory Makers on New SOCAMM Format for AI Servers

Curated by THEOUTPOST

On Mon, 24 Mar, 8:01 AM UTC

2 Sources


Nvidia partners with Samsung, SK Hynix, and Micron to develop SOCAMM, a new proprietary memory format for AI servers, offering higher performance and efficiency compared to traditional memory modules.

Nvidia Introduces SOCAMM: A New Memory Standard for AI Servers

In a significant development for the AI hardware industry, Nvidia has partnered with major memory manufacturers Samsung, SK Hynix, and Micron to create a new proprietary memory format called SOCAMM (Small Outline Compression Attached Memory Module). The collaboration, unveiled at Nvidia GTC 2025, aims to enhance the performance and efficiency of AI servers [1][2].

SOCAMM: Technical Specifications and Advantages

SOCAMM is based on LPDDR5X technology and is designed specifically for Nvidia's Grace Blackwell platform. The new memory format offers several advantages over traditional memory modules like RDIMMs and MRDIMMs:

  1. Higher bandwidth: SOCAMM delivers more than 2.5 times the bandwidth of RDIMMs at the same capacity [2].
  2. Lower power consumption: It draws only one-third the power of regular DDR5 DIMMs [2].
  3. Smaller footprint: The compact 14x90mm design occupies just one-third the area of the standard RDIMM form factor [1][2].
  4. Improved efficiency: SOCAMM is optimized for efficient server layouts and thermal management [1].
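The cited ratios can be sanity-checked with a quick sketch. Note that the RDIMM baseline values below are hypothetical placeholders (the article quotes only ratios, not absolute figures), and the RDIMM area is simply taken as three times SOCAMM's per the cited footprint claim:

```python
# Illustrative comparison using only the ratios cited in the article.
# The RDIMM baseline numbers are hypothetical, chosen for the arithmetic.
rdimm = {
    "bandwidth": 1.0,            # normalized baseline
    "power_w": 3.0,              # hypothetical baseline
    "area_mm2": 14 * 90 * 3,     # per the "one-third the area" claim
}
socamm = {
    "bandwidth": rdimm["bandwidth"] * 2.5,  # >2.5x RDIMM bandwidth
    "power_w": rdimm["power_w"] / 3,        # one-third the power of DDR5 DIMMs
    "area_mm2": 14 * 90,                    # 14 x 90 mm form factor
}

for key in rdimm:
    ratio = socamm[key] / rdimm[key]
    print(f"{key}: SOCAMM is {ratio:.2f}x the RDIMM baseline")
```

The loop simply reports each SOCAMM figure as a multiple of the baseline, reproducing the 2.5x, 1/3, and 1/3 ratios from the list above.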

Micron's SOCAMM Implementation

Micron has taken the lead in SOCAMM production, announcing that it will be the first to ship these products in volume. Key features of Micron's SOCAMM modules include:

  1. 128GB capacity: Achieved through 16-die stacks of LPDDR5X memory [2].
  2. Compatibility: Designed specifically for the Nvidia GB300 Grace Blackwell Ultra Superchip [1][2].
  3. AI optimization: Tailored for training large AI models and supporting more concurrent users in inference workloads [2].
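A back-of-the-envelope check of the 128GB figure: assuming a common 16Gbit (2GB) LPDDR5X die density, which is an assumption on our part rather than a number Micron quotes, the module works out to four 16-die stacks:

```python
# Sanity check of the cited 128 GB module capacity.
# DIE_GBIT is an assumed per-die density (common for LPDDR5X),
# not a figure from the article.
MODULE_GB = 128
DIE_GBIT = 16                              # assumed 16 Gbit (2 GB) per die
dies_needed = MODULE_GB * 8 // DIE_GBIT    # 128 GB * 8 bits / 16 Gbit = 64 dies
stacks_needed = dies_needed // 16          # 64 dies / 16-die stacks = 4 stacks
print(dies_needed, stacks_needed)
```

At a denser die (e.g. 32Gbit) the stack count would halve; the article itself specifies only the 16-die stacking.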

Industry Impact and Future Prospects

The introduction of SOCAMM represents a significant shift in the AI hardware landscape:

  1. Exclusive to Nvidia: SOCAMM is specific to Nvidia's AI architecture and cannot be used in AMD or Intel systems [1].
  2. Market positioning: SK Hynix is positioning SOCAMM as a key offering for future AI infrastructure [1].
  3. Production plans: While Micron has already begun volume production, SK Hynix plans to start mass production "in line with the market's emergence" [1].

Expert Opinions

Industry leaders have expressed optimism about the potential of SOCAMM:

Raj Narasimhan, SVP at Micron, stated, "AI is driving a paradigm shift in computing, and memory is at the heart of this evolution. Micron's contributions to the Nvidia Grace Blackwell platform yield performance and power-saving benefits for AI training and inference applications" [1][2].

Juseon Kim, President at SK Hynix, commented, "We are proud to present our line-up of industry-leading products at GTC 2025. With a differentiated competitiveness in the AI memory space, we are on track to bring our future as the Full Stack AI Memory Provider forward" [1].

As the AI industry continues to evolve rapidly, SOCAMM represents a significant advancement in memory technology, potentially reshaping the landscape of AI server architecture and performance.

Continue Reading

Nvidia and Memory Giants Collaborate on SOCAMM: A New Standard for AI-Focused Memory Modules
Nvidia is reportedly working with SK Hynix, Samsung, and Micron to develop SOCAMM, a new memory standard designed for high-performance AI applications. This compact and efficient module could revolutionize personal AI supercomputers. (2 sources: Tom's Hardware, Wccftech)

SK hynix Leads the Charge in Next-Gen AI Memory with World's First 12-Layer HBM4 Samples
SK hynix has begun sampling its 12-layer HBM4 memory, offering unprecedented capacity and bandwidth for AI acceleration. This development marks a significant leap in memory technology for AI applications. (5 sources, including TechSpot, TweakTown, Wccftech, and The Korea Times)

SK Hynix Accelerates HBM4 Development to Meet Nvidia's Demand, Unveils 16-Layer HBM3E
SK Hynix strengthens its position in the AI chip market by advancing HBM4 production and introducing new HBM3E technology, responding to Nvidia's request for faster delivery amid growing competition with Samsung. (12 sources, including DIGITIMES, The Register, Fortune, and CCN.com)

SK hynix Showcases Advanced AI Memory Solutions at CES 2025, Including 16-Layer HBM3E and 122TB SSD
SK hynix is set to display its latest AI-focused memory technologies at CES 2025, featuring 16-layer HBM3E chips, high-capacity SSDs, and innovative solutions for on-device AI and data centers. (6 sources, including Tom's Hardware, TweakTown, The Korea Times, and Investing.com UK)

Micron Unveils Ultra-Fast DDR5 Memory Modules for AI PCs, Doubling Speed of DDR4
Micron Technology introduces new Crucial DDR5 CUDIMM and CSODIMM memory modules, offering speeds up to 6,400 MT/s, designed for AI PCs and high-performance systems. These modules promise faster performance and improved stability for data-intensive AI workloads. (3 sources: TechRadar, TweakTown, and Investing.com UK)
