Nvidia Collaborates with Major Memory Makers on New SOCAMM Format for AI Servers


Nvidia partners with Samsung, SK Hynix, and Micron to develop SOCAMM, a new proprietary memory format for AI servers, offering higher performance and efficiency compared to traditional memory modules.


Nvidia Introduces SOCAMM: A New Memory Standard for AI Servers

In a significant development for the AI hardware industry, Nvidia has partnered with major memory manufacturers Samsung, SK Hynix, and Micron to create a new proprietary memory format called SOCAMM (Small Outline Compression Attached Memory Module). The collaboration, unveiled at Nvidia GTC 2025, aims to enhance the performance and efficiency of AI servers.[1][2]

SOCAMM: Technical Specifications and Advantages

SOCAMM is based on LPDDR5X technology and is designed specifically for Nvidia's Grace Blackwell platform. The new memory format offers several advantages over traditional memory modules like RDIMMs and MRDIMMs:

  1. Higher bandwidth: SOCAMM delivers more than 2.5 times the bandwidth of RDIMMs at the same capacity.[2]
  2. Lower power consumption: It uses only one-third the power of regular DDR5 DIMMs.[2]
  3. Smaller footprint: The compact 14 x 90 mm design occupies just one-third the area of a standard RDIMM form factor.[1][2]
  4. Improved efficiency: SOCAMM is optimized for efficient server layouts and thermal management.[1]
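Taken together, the first two claims compound into a large efficiency gap. As a back-of-the-envelope sketch (assuming, which the article does not state outright, that the bandwidth and power figures hold at the same capacity and operating point and can therefore be multiplied):

```python
# Combine the two headline claims above into a bandwidth-per-watt ratio.
# Assumption (not from the article): both figures apply at the same
# capacity and operating point, so they can be multiplied.
bandwidth_ratio = 2.5    # SOCAMM vs. RDIMM bandwidth (">2.5 times")
power_ratio = 1 / 3      # SOCAMM vs. regular DDR5 DIMM power ("one-third")

bandwidth_per_watt_gain = bandwidth_ratio / power_ratio
print(f"~{bandwidth_per_watt_gain:.1f}x bandwidth per watt")  # ~7.5x
```

Under those assumptions, the claims imply roughly 7.5 times the bandwidth per watt, which is the kind of figure that matters for densely packed AI racks.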

Micron's SOCAMM Implementation

Micron has taken the lead in SOCAMM production, announcing that it will be the first to ship these products in volume. Key features of Micron's SOCAMM modules include:

  1. 128GB capacity: Achieved through 16-die stacks of LPDDR5X memory.[2]
  2. Compatibility: Designed specifically for the Nvidia GB300 Grace Blackwell Ultra Superchip.[1][2]
  3. AI optimization: Tailored for training large AI models and supporting more concurrent users on inference workloads.[2]
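The capacity figure can be sanity-checked with simple arithmetic. The article says 128GB is built from 16-die LPDDR5X stacks but does not say how many stacks sit on the module; assuming four stack placements (my assumption, not from the source), the implied per-die density works out to a plausible LPDDR5X value:

```python
# Hypothetical module layout: total capacity and dies-per-stack come
# from the article; the number of stack placements (4) is an assumption.
module_capacity_gb = 128
stack_placements = 4       # assumed, not stated in the article
dies_per_stack = 16        # from the article

per_die_gb = module_capacity_gb / (stack_placements * dies_per_stack)
per_die_gbit = per_die_gb * 8
print(f"{per_die_gbit:.0f} Gb per die")  # 16 Gb, a common LPDDR5X density
```

A different placement count would imply a different die density, so treat this purely as an illustration of how the 128GB figure could be reached.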

Industry Impact and Future Prospects

The introduction of SOCAMM represents a significant shift in the AI hardware landscape:

  1. Exclusive to Nvidia: SOCAMM is specific to Nvidia's AI architecture and cannot be used in AMD or Intel systems.[1]
  2. Market positioning: SK Hynix is positioning SOCAMM as a key offering for future AI infrastructure.[1]
  3. Production plans: While Micron has already begun volume production, SK Hynix plans to start mass production "in line with the market's emergence."[1]

Expert Opinions

Industry leaders have expressed optimism about the potential of SOCAMM:

Raj Narasimhan, SVP at Micron, stated, "AI is driving a paradigm shift in computing, and memory is at the heart of this evolution. Micron's contributions to the Nvidia Grace Blackwell platform yield performance and power-saving benefits for AI training and inference applications."[1][2]

Juseon Kim, President at SK Hynix, commented, "We are proud to present our line-up of industry-leading products at GTC 2025. With a differentiated competitiveness in the AI memory space, we are on track to bring our future as the Full Stack AI Memory Provider forward."[1]

As the AI industry continues to evolve rapidly, SOCAMM marks a notable advance in memory technology, one that could reshape AI server architecture and performance.

TheOutpost.ai


© 2025 Triveous Technologies Private Limited