Nvidia Collaborates with Major Memory Makers on New SOCAMM Format for AI Servers

Nvidia partners with Samsung, SK Hynix, and Micron to develop SOCAMM, a new proprietary memory format for AI servers, offering higher performance and efficiency compared to traditional memory modules.

Nvidia Introduces SOCAMM: A New Memory Standard for AI Servers

In a significant development for the AI hardware industry, Nvidia has partnered with major memory manufacturers Samsung, SK Hynix, and Micron to create a new proprietary memory format called SOCAMM (Small Outline Compression Attached Memory Module). This collaboration, unveiled at Nvidia GTC 2025, aims to enhance the performance and efficiency of AI servers.

SOCAMM: Technical Specifications and Advantages

SOCAMM is based on LPDDR5X technology and is designed specifically for Nvidia's Grace Blackwell platform. The new memory format offers several advantages over traditional memory modules like RDIMMs and MRDIMMs:

  1. Higher bandwidth: SOCAMM delivers more than 2.5 times the bandwidth of RDIMMs at the same capacity.
  2. Lower power consumption: It draws only one-third the power of regular DDR5 DIMMs.
  3. Smaller footprint: The compact 14 x 90 mm design occupies just one-third the area of a standard RDIMM form factor.
  4. Improved efficiency: SOCAMM is optimized for efficient server layouts and thermal management.
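The ratios above can be sketched as a quick back-of-the-envelope comparison. Note that the RDIMM baseline figures below (bandwidth, power, and the 133.35 x 30 mm module area) are illustrative assumptions, not numbers from the article; only the 2.5x, one-third, and 14 x 90 mm figures come from the source.

```python
# Back-of-the-envelope SOCAMM vs. RDIMM comparison using the ratios
# quoted in the article. RDIMM baseline values are assumptions chosen
# only for illustration.

RDIMM = {
    "bandwidth_gbps": 100.0,     # assumed baseline
    "power_w": 15.0,             # assumed baseline
    "area_mm2": 133.35 * 30.0,   # assumed standard DIMM outline
}

SOCAMM = {
    "bandwidth_gbps": RDIMM["bandwidth_gbps"] * 2.5,  # >2.5x bandwidth at same capacity
    "power_w": RDIMM["power_w"] / 3,                  # one-third the power
    "area_mm2": 14 * 90,                              # 14 x 90 mm module
}

for key in RDIMM:
    ratio = SOCAMM[key] / RDIMM[key]
    print(f"{key}: RDIMM={RDIMM[key]:.1f}  SOCAMM={SOCAMM[key]:.1f}  ratio={ratio:.2f}")
```

With these assumed baselines, the 1260 mm² SOCAMM outline works out to roughly 0.31 of the DIMM area, matching the article's "one-third the size" claim.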

Micron's SOCAMM Implementation

Micron has taken the lead in SOCAMM production, announcing that it will be the first to ship these products in volume. Key features of Micron's SOCAMM modules include:

  1. 128GB capacity: Achieved through 16-die stacks of LPDDR5X memory.
  2. Compatibility: Designed specifically for the Nvidia GB300 Grace Blackwell Ultra Superchip.
  3. AI optimization: Tailored for training large AI models and for supporting more concurrent users on inference workloads.
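The 128GB figure can be sanity-checked arithmetically. The article states only the 16-die stacking; the stack count (4) and per-die density (16 Gbit, a common LPDDR5X density) below are assumptions chosen to be consistent with the stated total, not figures from the source.

```python
# Sanity-check of the 128 GB SOCAMM module capacity.
DIES_PER_STACK = 16       # stated in the article
STACKS_PER_MODULE = 4     # assumption
DIE_DENSITY_GBIT = 16     # assumption (common LPDDR5X die density)

total_gbit = STACKS_PER_MODULE * DIES_PER_STACK * DIE_DENSITY_GBIT
total_gb = total_gbit // 8  # 8 Gbit per GB

print(f"{total_gbit} Gbit = {total_gb} GB per module")  # 1024 Gbit = 128 GB
```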

Industry Impact and Future Prospects

The introduction of SOCAMM represents a significant shift in the AI hardware landscape:

  1. Exclusive to Nvidia: SOCAMM is specific to Nvidia's AI architecture and cannot be used in AMD or Intel systems.
  2. Market positioning: SK Hynix is positioning SOCAMM as a key offering for future AI infrastructure.
  3. Production plans: While Micron has already begun volume production, SK Hynix plans to start mass production "in line with the market's emergence".

Expert Opinions

Industry leaders have expressed optimism about the potential of SOCAMM:

Raj Narasimhan, SVP at Micron, stated, "AI is driving a paradigm shift in computing, and memory is at the heart of this evolution. Micron's contributions to the Nvidia Grace Blackwell platform yield performance and power-saving benefits for AI training and inference applications."

Juseon Kim, President at SK Hynix, commented, "We are proud to present our line-up of industry-leading products at GTC 2025. With a differentiated competitiveness in the AI memory space, we are on track to bring our future as the Full Stack AI Memory Provider forward."

As the AI industry continues to evolve rapidly, SOCAMM represents a significant advancement in memory technology, potentially reshaping the landscape of AI server architecture and performance.
