2 Sources
[1]
Intel, AMD left out as Nvidia convinces Samsung, SK Hynix and Micron to develop new proprietary memory format for its own AI servers
SK Hynix plans production of SOCAMM as AI infrastructure demand grows

At the recent Nvidia GTC 2025, memory makers Micron and SK Hynix took the wraps off their respective SOCAMM solutions. This new modular memory form factor is designed to unlock the full potential of AI platforms and has been developed exclusively for Nvidia's Grace Blackwell platform.

SOCAMM, or Small Outline Compression Attached Memory Module, is based on LPDDR5X and intended to address growing performance and efficiency demands in AI servers. The form factor reportedly offers higher bandwidth, lower power consumption, and a smaller footprint compared to traditional memory modules such as RDIMMs and MRDIMMs. SOCAMM is specific to Nvidia's AI architecture and so can't be used in AMD or Intel systems.

Micron announced it will be the first to ship SOCAMM products in volume, and its 128GB SOCAMM modules are designed for the Nvidia GB300 Grace Blackwell Ultra Superchip. According to the company, the modules deliver more than 2.5 times the bandwidth of RDIMMs while using one-third the power. The compact 14x90mm design is intended to support efficient server layouts and thermal management.

"AI is driving a paradigm shift in computing, and memory is at the heart of this evolution," said Raj Narasimhan, senior vice president and general manager of Micron's Compute and Networking Business Unit. "Micron's contributions to the Nvidia Grace Blackwell platform yield performance and power-saving benefits for AI training and inference applications."

SK Hynix also presented its own low-power SOCAMM solution at GTC 2025 as part of a broader AI memory portfolio. Unlike Micron, the company didn't go into much detail about it, but said it is positioning SOCAMM as a key offering for future AI infrastructure and plans to begin mass production "in line with the market's emergence".

"We are proud to present our line-up of industry-leading products at GTC 2025," SK Hynix's President & Head of AI Infra Juseon (Justin) Kim said. "With a differentiated competitiveness in the AI memory space, we are on track to bring our future as the Full Stack AI Memory Provider forward."
[2]
Micron and NVIDIA collaborate on new modular LPDDR5X memory solution for GB300 Blackwell Ultra
TL;DR: Micron's SOCAMM, developed with NVIDIA, is a modular LPDDR5X memory solution for the GB300 Grace Blackwell Superchip.

Micron's new SOCAMM, a modular LPDDR5X memory solution, was developed in collaboration with NVIDIA to support the GB300 Grace Blackwell Superchip announced at GTC 2025 this week. Micron's new modular SOCAMM memory modules provide over 2.5x higher bandwidth at the same capacity as RDIMMs, and they're super-small, occupying just one-third the size of the industry-standard RDIMM form factor.

Thanks to LPDDR5X, the new SOCAMM modules use one-third the power of regular DDR5 DIMMs, while SOCAMM placements of 16-die stacks of LPDDR5X memory enable a 128GB memory module: the highest-capacity LPDDR5X memory solution, which is perfect for training large AI models and for serving more concurrent users on inference workloads.

Raj Narasimhan, senior vice president and general manager of Micron's Compute and Networking Business Unit, explained: "AI is driving a paradigm shift in computing, and memory is at the heart of this evolution. Micron's contributions to the NVIDIA Grace Blackwell platform yields significant performance and power-saving benefits for AI training and inference applications. HBM and LP memory solutions help unlock improved computational capabilities for GPUs."

SOCAMM: a new standard for AI memory performance and efficiency

Micron's SOCAMM solution is now in volume production. The modular SOCAMM solution enables accelerated data processing, superior performance, unmatched power efficiency and enhanced serviceability to provide high-capacity memory for increasing AI workload requirements. Micron SOCAMM is the world's fastest, smallest, lowest-power and highest-capacity modular memory solution, designed to meet the demands of AI servers and data-intensive applications. This new SOCAMM solution enables data centers to get the same compute capacity with better bandwidth, improved power consumption and scaling capabilities to provide infrastructure flexibility.
Nvidia partners with Samsung, SK Hynix, and Micron to develop SOCAMM, a new proprietary memory format for AI servers, offering higher performance and efficiency compared to traditional memory modules.
In a significant development for the AI hardware industry, Nvidia has partnered with major memory manufacturers Samsung, SK Hynix, and Micron to create a new proprietary memory format called SOCAMM (Small Outline Compression Attached Memory Module). This collaboration, unveiled at Nvidia GTC 2025, aims to enhance the performance and efficiency of AI servers [1][2].
SOCAMM is based on LPDDR5X technology and is designed specifically for Nvidia's Grace Blackwell platform. The new memory format offers several advantages over traditional memory modules like RDIMMs and MRDIMMs:

- Higher bandwidth at the same capacity
- Lower power consumption, at roughly one-third that of regular DDR5 DIMMs
- A smaller footprint, at about one-third the size of a standard RDIMM
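Taken together, the bandwidth and power claims imply a large jump in efficiency. As a rough back-of-the-envelope check (our arithmetic from the cited figures, not a number either company publishes):

$$
\frac{2.5\,B_{\mathrm{RDIMM}}}{\tfrac{1}{3}\,P_{\mathrm{RDIMM}}} = 7.5 \times \frac{B_{\mathrm{RDIMM}}}{P_{\mathrm{RDIMM}}}
$$

where B_RDIMM and P_RDIMM are the bandwidth and power of an equivalent RDIMM configuration. In other words, roughly 7.5 times the bandwidth per watt, if both claims hold simultaneously.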
Micron has taken the lead in SOCAMM production, announcing that it will be the first to ship these products in volume. Key features of Micron's SOCAMM modules include:

- 128GB capacity, built from 16-die stacks of LPDDR5X and the highest-capacity LPDDR5X solution to date
- Designed for the Nvidia GB300 Grace Blackwell Ultra Superchip
- More than 2.5 times the bandwidth of RDIMMs while using one-third the power
- A compact 14x90mm footprint intended to support efficient server layouts and thermal management
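The 128GB figure is consistent with straightforward stacking arithmetic. As an illustrative sketch, assuming 16Gb (2GB) LPDDR5X dies and four package placements per module (our assumptions; the sources confirm only the 16-die stacks and the 128GB total):

$$
4\ \text{placements} \times 16\ \text{dies} \times 2\,\mathrm{GB} = 128\,\mathrm{GB}
$$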
The introduction of SOCAMM represents a significant shift in the AI hardware landscape:

- The format is proprietary to Nvidia's AI architecture and cannot be used in AMD or Intel systems, leaving both chipmakers out of the new standard
- It brings low-power LPDDR5X memory to servers in a modular form factor with enhanced serviceability
- Memory makers are aligning their roadmaps around Nvidia's platforms: Micron is shipping first, while SK Hynix plans to begin mass production "in line with the market's emergence"
Industry leaders have expressed optimism about the potential of SOCAMM:
Raj Narasimhan, SVP at Micron, stated, "AI is driving a paradigm shift in computing, and memory is at the heart of this evolution. Micron's contributions to the Nvidia Grace Blackwell platform yield performance and power-saving benefits for AI training and inference applications" [1][2].
Juseon Kim, President at SK Hynix, commented, "We are proud to present our line-up of industry-leading products at GTC 2025. With a differentiated competitiveness in the AI memory space, we are on track to bring our future as the Full Stack AI Memory Provider forward" [1].
As the AI industry continues to evolve rapidly, SOCAMM represents a significant advancement in memory technology, potentially reshaping the landscape of AI server architecture and performance.