Curated by THEOUTPOST
On Mon, 24 Mar, 8:01 AM UTC
2 Sources
[1]
Intel, AMD left out as Nvidia convinces Samsung, SK Hynix and Micron to develop new proprietary memory format for its own AI servers
SK Hynix plans production of SOCAMM as AI infrastructure demand grows

At the recent Nvidia GTC 2025, memory makers Micron and SK Hynix took the wraps off their respective SOCAMM solutions. This new modular memory form factor is designed to unlock the full potential of AI platforms and has been developed exclusively for Nvidia's Grace Blackwell platform.

SOCAMM, or Small Outline Compression Attached Memory Module, is based on LPDDR5X and is intended to address growing performance and efficiency demands in AI servers. The form factor reportedly offers higher bandwidth, lower power consumption, and a smaller footprint than traditional memory modules such as RDIMMs and MRDIMMs. Because SOCAMM is specific to Nvidia's AI architecture, it can't be used in AMD or Intel systems.

Micron announced that it will be the first to ship SOCAMM products in volume, with 128GB modules designed for the Nvidia GB300 Grace Blackwell Ultra Superchip. According to the company, the modules deliver more than 2.5 times the bandwidth of RDIMMs while using one-third the power, and the compact 14x90mm design is intended to support efficient server layouts and thermal management.

"AI is driving a paradigm shift in computing, and memory is at the heart of this evolution," said Raj Narasimhan, senior vice president and general manager of Micron's Compute and Networking Business Unit. "Micron's contributions to the Nvidia Grace Blackwell platform yield performance and power-saving benefits for AI training and inference applications."

SK Hynix also presented its own low-power SOCAMM solution at GTC 2025 as part of a broader AI memory portfolio. Unlike Micron, the company didn't go into much detail, but said it is positioning SOCAMM as a key offering for future AI infrastructure and plans to begin mass production "in line with the market's emergence".
"We are proud to present our line-up of industry-leading products at GTC 2025," SK Hynix's President & Head of AI Infra Juseon (Justin) Kim said. "With a differentiated competitiveness in the AI memory space, we are on track to bring our future as the Full Stack AI Memory Provider forward."
[2]
Micron and NVIDIA collaborate on new modular LPDDR5X memory solution, for GB300 Blackwell Ultra
TL;DR: Micron's SOCAMM, developed with NVIDIA, is a modular LPDDR5X memory solution for the GB300 Grace Blackwell Superchip.

Micron's new SOCAMM, a modular LPDDR5X memory solution, was developed in collaboration with NVIDIA to support the GB300 Grace Blackwell Superchip announced at GTC 2025 this week.

The new modular SOCAMM memory modules provide over 2.5x higher bandwidth at the same capacity as RDIMMs, and they occupy just one-third the size of the industry-standard RDIMM form factor. Thanks to LPDDR5X, the new SOCAMM modules use one-third the power of regular DDR5 DIMMs, while SOCAMM placements of 16-die stacks of LPDDR5X memory enable a 128GB memory module: the highest-capacity LPDDR5X memory solution, well suited to training large AI models and serving more concurrent users on inference workloads.

Raj Narasimhan, senior vice president and general manager of Micron's Compute and Networking Business Unit, explained: "AI is driving a paradigm shift in computing, and memory is at the heart of this evolution. Micron's contributions to the NVIDIA Grace Blackwell platform yield significant performance and power-saving benefits for AI training and inference applications. HBM and LP memory solutions help unlock improved computational capabilities for GPUs."

SOCAMM: a new standard for AI memory performance and efficiency

Micron's SOCAMM solution is now in volume production. The modular SOCAMM solution enables accelerated data processing, superior performance, unmatched power efficiency and enhanced serviceability, providing high-capacity memory for growing AI workload requirements. Micron describes SOCAMM as the world's fastest, smallest, lowest-power and highest-capacity modular memory solution, designed to meet the demands of AI servers and data-intensive applications.
This new SOCAMM solution enables data centers to get the same compute capacity with better bandwidth, improved power consumption and scaling capabilities to provide infrastructure flexibility.
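The quoted figures imply a compounding efficiency gain: 2.5x the bandwidth at one-third the power multiplies out to roughly 7.5x the bandwidth per watt. A minimal back-of-envelope sketch, using normalized placeholder values for the RDIMM baseline (not actual vendor specifications):

```python
# Back-of-envelope bandwidth-per-watt comparison using the figures
# quoted in the article: SOCAMM is said to deliver more than 2.5x the
# bandwidth of an RDIMM while drawing one-third the power.
# The baseline values are normalized placeholders, not real specs.
RDIMM_BANDWIDTH = 1.0  # normalized RDIMM bandwidth
RDIMM_POWER = 1.0      # normalized RDIMM power draw

socamm_bandwidth = 2.5 * RDIMM_BANDWIDTH  # ">2.5x the bandwidth"
socamm_power = RDIMM_POWER / 3.0          # "one-third the power"

rdimm_bw_per_watt = RDIMM_BANDWIDTH / RDIMM_POWER
socamm_bw_per_watt = socamm_bandwidth / socamm_power

gain = socamm_bw_per_watt / rdimm_bw_per_watt
print(f"Bandwidth-per-watt gain: {gain:.1f}x")  # → 7.5x
```

Since both figures are ratios against the same RDIMM baseline, the per-watt gain holds regardless of the absolute numbers chosen for the baseline.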
Nvidia partners with Samsung, SK Hynix, and Micron to develop SOCAMM, a new proprietary memory format for AI servers, offering higher performance and efficiency compared to traditional memory modules.
In a significant development for the AI hardware industry, Nvidia has partnered with major memory manufacturers Samsung, SK Hynix, and Micron to create a new proprietary memory format called SOCAMM (Small Outline Compression Attached Memory Module). This collaboration, unveiled at Nvidia GTC 2025, aims to enhance the performance and efficiency of AI servers [1][2].
SOCAMM is based on LPDDR5X technology and is designed specifically for Nvidia's Grace Blackwell platform. The new memory format offers several advantages over traditional memory modules like RDIMMs and MRDIMMs:

- Higher bandwidth
- Lower power consumption
- A smaller physical footprint
Micron has taken the lead in SOCAMM production, announcing that it will be the first to ship these products in volume. Key features of Micron's SOCAMM modules include:

- 128GB capacity, designed for the Nvidia GB300 Grace Blackwell Ultra Superchip
- More than 2.5 times the bandwidth of RDIMMs at one-third the power
- A compact 14x90mm form factor to support efficient server layouts and thermal management
The introduction of SOCAMM represents a significant shift in the AI hardware landscape: because the format is specific to Nvidia's AI architecture, it cannot be used in AMD or Intel systems.
Industry leaders have expressed optimism about the potential of SOCAMM:
Raj Narasimhan, SVP at Micron, stated, "AI is driving a paradigm shift in computing, and memory is at the heart of this evolution. Micron's contributions to the Nvidia Grace Blackwell platform yield performance and power-saving benefits for AI training and inference applications" [1][2].
Juseon Kim, President at SK Hynix, commented, "We are proud to present our line-up of industry-leading products at GTC 2025. With a differentiated competitiveness in the AI memory space, we are on track to bring our future as the Full Stack AI Memory Provider forward" [1].
As the AI industry continues to evolve rapidly, SOCAMM represents a significant advancement in memory technology, potentially reshaping the landscape of AI server architecture and performance.
© 2025 TheOutpost.AI All rights reserved