Nvidia Collaborates with Major Memory Makers on New SOCAMM Format for AI Servers

2 Sources

Nvidia partners with Samsung, SK Hynix, and Micron to develop SOCAMM, a new proprietary memory format for AI servers, offering higher performance and efficiency compared to traditional memory modules.


Nvidia Introduces SOCAMM: A New Memory Standard for AI Servers

In a significant development for the AI hardware industry, Nvidia has partnered with major memory manufacturers Samsung, SK Hynix, and Micron to create a new proprietary memory format called SOCAMM (Small Outline Compression Attached Memory Module). The collaboration, unveiled at Nvidia GTC 2025, aims to enhance the performance and efficiency of AI servers.

SOCAMM: Technical Specifications and Advantages

SOCAMM is based on LPDDR5X technology and is designed specifically for Nvidia's Grace Blackwell platform. The new memory format offers several advantages over traditional memory modules like RDIMMs and MRDIMMs:

  1. Higher bandwidth: SOCAMM delivers more than 2.5 times the bandwidth of RDIMMs at the same capacity.
  2. Lower power consumption: It uses only one-third of the power of regular DDR5 DIMMs.
  3. Smaller footprint: The compact 14 × 90 mm design occupies just one-third the area of a standard RDIMM form factor.
  4. Improved efficiency: SOCAMM is optimized for efficient server layouts and thermal management.
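Taken together, the first two figures imply a large gain in bandwidth delivered per watt. A back-of-the-envelope calculation illustrates this, using the article's relative figures against a normalized RDIMM baseline (the absolute baseline values are placeholders, not vendor specifications):

```python
# Back-of-the-envelope SOCAMM vs. RDIMM comparison using the relative
# figures cited above. The baseline values are normalized placeholders,
# not vendor specifications.
RDIMM_BANDWIDTH = 1.0   # normalized baseline bandwidth
RDIMM_POWER = 1.0       # normalized baseline power draw

socamm_bandwidth = 2.5 * RDIMM_BANDWIDTH   # "more than 2.5 times the bandwidth"
socamm_power = RDIMM_POWER / 3             # "one-third of the power"

# Bandwidth delivered per unit of power
rdimm_eff = RDIMM_BANDWIDTH / RDIMM_POWER
socamm_eff = socamm_bandwidth / socamm_power

print(f"SOCAMM bandwidth-per-watt gain: {socamm_eff / rdimm_eff:.1f}x")
# prints: SOCAMM bandwidth-per-watt gain: 7.5x
```

In other words, even if the cited figures are rounded, the combination of higher bandwidth and lower power multiplies into roughly a 7.5× improvement in bandwidth per watt.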

Micron's SOCAMM Implementation

Micron has taken the lead in SOCAMM production, announcing that it will be the first to ship these products in volume. Key features of Micron's SOCAMM modules include:

  1. 128 GB capacity: Achieved through 16-die stacks of LPDDR5X memory.
  2. Compatibility: Designed specifically for the Nvidia GB300 Grace Blackwell Ultra Superchip.
  3. AI optimization: Tailored for training large AI models and for supporting more concurrent users on inference workloads.
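The 128 GB figure is consistent with stacked LPDDR5X dies. A quick sanity check, assuming four 16-die stacks of 16 Gbit dies per module (the stack count and per-die density are assumptions for illustration; the article states only the 16-die stacking):

```python
# Sanity-check the 128 GB module capacity from stacked LPDDR5X dies.
# The article states 16-die stacks; the stacks-per-module count (4) and
# per-die density (16 Gbit) are assumptions, not stated specifications.
DIES_PER_STACK = 16        # from the article
STACKS_PER_MODULE = 4      # assumed
GBIT_PER_DIE = 16          # assumed LPDDR5X die density

total_gbit = STACKS_PER_MODULE * DIES_PER_STACK * GBIT_PER_DIE
total_gbyte = total_gbit // 8   # 8 bits per byte
print(f"Module capacity: {total_gbyte} GB")
# prints: Module capacity: 128 GB
```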

Industry Impact and Future Prospects

The introduction of SOCAMM represents a significant shift in the AI hardware landscape:

  1. Exclusive to Nvidia: SOCAMM is specific to Nvidia's AI architecture and cannot be used in AMD or Intel systems.
  2. Market positioning: SK Hynix is positioning SOCAMM as a key offering for future AI infrastructure.
  3. Production plans: While Micron has already begun volume production, SK Hynix plans to start mass production "in line with the market's emergence".

Expert Opinions

Industry leaders have expressed optimism about the potential of SOCAMM:

Raj Narasimhan, SVP at Micron, stated, "AI is driving a paradigm shift in computing, and memory is at the heart of this evolution. Micron's contributions to the Nvidia Grace Blackwell platform yield performance and power-saving benefits for AI training and inference applications."

Juseon Kim, President at SK Hynix, commented, "We are proud to present our line-up of industry-leading products at GTC 2025. With a differentiated competitiveness in the AI memory space, we are on track to bring our future as the Full Stack AI Memory Provider forward."

As the AI industry continues to evolve rapidly, SOCAMM represents a significant advancement in memory technology, potentially reshaping the landscape of AI server architecture and performance.

TheOutpost.ai

© 2025 Triveous Technologies Private Limited