Rambus Unveils HBM4 Memory Controller: A Leap in High-Bandwidth Computing

Rambus has announced details of its HBM4 memory controller, promising significant improvements in speed, bandwidth, and capacity. This new technology could revolutionize high-performance computing and AI applications.

Rambus Introduces Next-Generation HBM4 Memory Controller

Rambus, a pioneer in high-speed interface technology, has unveiled its HBM4 (High Bandwidth Memory 4) memory controller [1]. The next-generation memory standard promises a substantial jump in performance and capacity over today's HBM3, with significant implications for high-performance computing and artificial intelligence applications.

Speed and Bandwidth Advancements

The HBM4 memory controller supports data rates of up to 10 Gbps per pin, a significant leap over its predecessor, HBM3, which operates at 6.4 Gbps [2]. Across HBM4's 2048-bit stack interface, that per-pin rate translates to a peak bandwidth of 2.56 TB/s per stack, setting a new benchmark for memory data transfer rates.
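
As a quick sanity check, the 2.56 TB/s figure follows directly from the per-pin data rate and the 2048-bit stack interface discussed under Technical Specifications below; the short Python sketch here simply redoes that arithmetic and compares it with HBM3, and is an illustration rather than anything published by Rambus.

```python
# Back-of-the-envelope check of the quoted per-stack bandwidth figures.
# Peak bandwidth = per-pin data rate x interface width, converted to bytes.

def stack_bandwidth_tbps(data_rate_gbps: float, interface_width_bits: int) -> float:
    """Peak per-stack bandwidth in TB/s (decimal units)."""
    return data_rate_gbps * interface_width_bits / 8 / 1000

print(stack_bandwidth_tbps(10.0, 2048))  # HBM4: 2.56 TB/s
print(stack_bandwidth_tbps(6.4, 1024))   # HBM3: 0.8192 TB/s
```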

Capacity Improvements

In addition to higher speeds, HBM4 addresses the growing demand for larger memory capacities: each HBM4 stack can hold up to 64 GB of memory [1]. The larger per-stack capacity lets accelerators keep bigger models and working sets in fast local memory, easing memory allocation and management in complex computing tasks.
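
For context, 64 GB per stack corresponds to, for example, sixteen 32 Gb DRAM dies stacked per device; the die density and stack height in the sketch below are illustrative assumptions rather than figures from the announcement.

```python
# Illustrative stack-capacity arithmetic. Die density (32 Gb) and stack
# height (16-high) are assumed values chosen to reach the quoted 64 GB;
# actual HBM4 configurations may differ.

GBIT_PER_DIE = 32      # assumed DRAM die density, in gigabits
DIES_PER_STACK = 16    # assumed stack height

capacity_gb = GBIT_PER_DIE * DIES_PER_STACK / 8  # gigabits -> gigabytes
print(f"{capacity_gb:.0f} GB per stack")         # 64 GB
```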

Potential Applications

The advances HBM4 brings are expected to have far-reaching implications across sectors. High-performance computing, artificial intelligence, machine learning, and data analytics all stand to benefit significantly [2]. The increased bandwidth and capacity could enable more complex simulations, faster data processing, and faster training and serving of larger AI models, as the rough estimate below illustrates.
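
One way to see why bandwidth matters so much for AI workloads: generating each token with a large language model requires streaming the model's weights from memory, so memory bandwidth often caps inference throughput. The sketch below assumes a 70-billion-parameter model at 16-bit precision on an accelerator with eight HBM stacks; every number in it is illustrative, not taken from the article.

```python
# Rough upper bound on bandwidth-limited LLM token throughput.
# Model size, precision, and stack count are illustrative assumptions;
# compute limits, caching, and batching are ignored.

PARAMS = 70e9          # assumed model parameters
BYTES_PER_PARAM = 2    # assumed 16-bit weights
STACKS = 8             # assumed HBM stacks per accelerator

def max_tokens_per_second(per_stack_tbps: float) -> float:
    total_bandwidth = per_stack_tbps * 1e12 * STACKS   # bytes per second
    return total_bandwidth / (PARAMS * BYTES_PER_PARAM)

print(f"HBM3 at 0.82 TB/s per stack: ~{max_tokens_per_second(0.82):.0f} tokens/s")
print(f"HBM4 at 2.56 TB/s per stack: ~{max_tokens_per_second(2.56):.0f} tokens/s")
```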

Technical Specifications

Rambus has also provided detailed technical specifications for the HBM4 memory controller. HBM4 doubles the interface width over HBM3 to 2048 bits per stack, organized as 32 independent channels [1], which is how the 10 Gbps per-pin rate yields 2.56 TB/s of stack bandwidth. The controller is designed to work with both HBM3 and HBM4 DRAMs, providing backward compatibility and flexibility in implementation.
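
Breaking those figures down per channel makes the relationship explicit; the 64-bit channel width below is simply the 2048-bit interface divided across 32 channels, an inference from the numbers above rather than a quoted specification.

```python
# Per-channel view of an HBM4 stack, derived from the figures above.
# The 64-bit channel width is inferred (2048 / 32), not quoted directly.

INTERFACE_BITS = 2048
CHANNELS = 32
DATA_RATE_GBPS = 10.0   # per pin

channel_width = INTERFACE_BITS // CHANNELS        # 64 bits per channel
channel_bw = DATA_RATE_GBPS * channel_width / 8   # 80 GB/s per channel
stack_bw = channel_bw * CHANNELS / 1000           # 2.56 TB/s per stack

print(f"{channel_width} bits/channel, {channel_bw:.0f} GB/s/channel, {stack_bw:.2f} TB/s/stack")
```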

Industry Impact

The introduction of HBM4 is likely to spark a new wave of innovation in the semiconductor industry. As companies strive to incorporate this technology into their products, we may see a significant boost in the capabilities of next-generation GPUs, AI accelerators, and high-performance computing systems [2].

Timeline and Availability

While Rambus has announced the specifications for its HBM4 controller, the timeline for commercial availability remains unclear. The technology is still at an early stage, and it may be some time before HBM4-equipped devices reach the market [1]. Even so, the announcement sets the stage for the next round of developments in high-bandwidth memory.
