DRAM prices skyrocket 80-90% as AI boom diverts memory supply, relief unlikely until 2028

DRAM prices have surged 80-90% this quarter as demand from AI data centers diverts supply from PCs and consumer devices. Memory makers are projected to earn $551 billion in 2026, more than double what contract chip manufacturers will make. Despite new factories from SK Hynix, Samsung, and Micron, economists warn that relief won't come until 2028 at the earliest, with prices likely staying elevated even then.

DRAM Prices Surge Amid Unprecedented AI-Driven Demand

The tech industry faces a critical memory crunch as DRAM prices have surged 80-90% so far this quarter, driven by insatiable demand from AI data centers [1]. The shortage stems from a collision between the memory market's historic boom-and-bust cycles and an AI infrastructure buildout unprecedented in scale. While major AI hardware companies have secured their chips through 2028, PC makers, consumer electronics manufacturers, and other industries relying on computer memory find themselves scrambling to manage scarce supply and inflated prices [1].

The dramatic price increase reflects how thoroughly the AI boom has reshaped the semiconductor industry. According to TrendForce, memory makers are projected to earn $551.6 billion in 2026, more than twice the $218.7 billion expected for contract chip manufacturers [2]. This disparity highlights how AI data centers create elevated demand for specific memory types, causing shortages across all categories and affecting prices industry-wide.

High-Bandwidth Memory Drives the Supply Crisis

At the heart of the supply and demand imbalance sits High-Bandwidth Memory (HBM), the DRAM industry's answer to Moore's Law limitations through 3D chip packaging technology [1]. Each HBM chip comprises up to 12 thinned-down DRAM dies stacked vertically and connected by microscopic solder balls, creating what is essentially a memory tower about 750 micrometers thick. These stacks are then mounted within a millimeter of AI GPUs or AI accelerators, linked by as many as 2,048 micrometer-scale connections to break down the memory wall: the energy and time barrier to feeding terabytes per second of data into processors running large language models [1].
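The stack geometry above implies heavily thinned dies. A back-of-the-envelope sketch using only the article's figures (the even split across dies is a simplification, since bonding layers and the base die also take some of the height):

```python
# Rough geometry check for the HBM stack described above: up to 12 DRAM
# dies in a tower about 750 micrometers thick (article figures).
stack_height_um = 750.0   # total stack thickness, micrometers
dies_per_stack = 12       # DRAM dies per stack

# Average vertical budget per die plus its bonding layer, assuming an
# even split (simplification: base die and bond layers vary in height).
per_die_um = stack_height_um / dies_per_stack
print(f"~{per_die_um:.1f} micrometers per die")  # ~62.5 micrometers
```

Each die must therefore be thinned to roughly 60 micrometers, a small fraction of a standard 300 mm wafer's roughly 775-micrometer thickness, which is part of what makes these stacks hard to manufacture.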

Source: IEEE

While HBM technology has existed for over a decade, its importance has exploded alongside AI model growth. SemiAnalysis estimates that HBM generally costs three times as much as other memory types and constitutes 50% or more of the cost of packaged AI GPUs [1]. Micron predicts the HBM market will reach $100 billion annually by 2028, roughly equivalent to the entire DRAM market in 2024 [3]. With HBM4 memory devices using four times more silicon than typical DRAM chips, production capacity constraints become even more severe [2].

How the Industry Reached This Breaking Point

The roots of today's memory shortage trace back to the COVID-19 pandemic, when hyperscalers like Amazon, Google, and Microsoft bought massive inventories to support remote work transitions, boosting prices [1]. But when supply normalized and data center expansion slowed in 2022, memory prices plummeted. The downturn continued into 2023, forcing major players like Samsung to cut production by 50% to prevent prices from falling below manufacturing costs, a rare and desperate move in an industry where fabs must typically run at full capacity just to justify their value [1].

Thomas Coughlin, president of Coughlin Associates and a storage and memory expert, explains that the memory market operates in a highly cyclical pattern. With new fabs costing $15 billion or more, firms hesitate to expand and may only have capital during boom periods [1]. Building such facilities takes 18 months or longer, virtually guaranteeing that new production capacity arrives well after demand surges, flooding markets and depressing prices. After the 2022-2023 downturn, memory companies remained wary of increasing production capacity, resulting in little to no investment in new facilities through most of 2024 and 2025 [1].

Source: PC Gamer

This lack of investment now collides with explosive demand from nearly 2,000 new data centers either planned or under construction globally [1]. McKinsey forecasts a staggering $7 trillion total investment in new data centers by 2030, with $5.2 trillion dedicated specifically to AI-focused facilities [3].

Memory Makers Capitalize on Perfect Market Conditions

The memory market's commodity nature means prices react almost immediately to supply tightening, demand increases, or buyer sentiment shifts [2]. At DRAMeXchange, the spot price of a 16 Gb DDR5 chip averaged $38, with daily highs reaching $53 and lows at $25. Just one year ago, that same chip cost $4.75 on average, with session lows at $3.70 and highs at $6.60 [2]. Similar price increases have affected 3D NAND memory in recent quarters.
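The quoted spot prices make the scale of the run-up easy to compute. A quick sketch using only the DRAMeXchange figures cited above:

```python
# Year-over-year comparison of the 16 Gb DDR5 spot prices quoted above
# (DRAMeXchange figures from the article, in USD).
avg_now, low_now, high_now = 38.00, 25.00, 53.00   # this quarter
avg_ago, low_ago, high_ago = 4.75, 3.70, 6.60      # one year earlier

multiple = avg_now / avg_ago          # how many times more expensive on average
pct_increase = (multiple - 1) * 100
print(f"average spot price up {multiple:.0f}x ({pct_increase:.0f}%) year over year")
```

On average the chip is eight times more expensive than a year ago, and even this quarter's daily low of $25 sits well above last year's daily high of $6.60.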

TrendForce projects memory revenue growth accelerating dramatically: an 80% increase in 2024, followed by 46% growth in 2025 and a projected 134% surge in 2026 [2]. By contrast, the foundry industry, which operates under long-term agreements that smooth price fluctuations, saw 19% year-over-year revenue growth in 2024, followed by 25% in 2025 and a projected 25% in 2026.

Cloud service providers and AI companies are less price-sensitive than traditional buyers, enabling memory makers to raise average selling prices more aggressively than in past cycles [2]. The AI boom creates demand not just for powerful training clusters with expensive HBM3E memory, but also for inference systems requiring CPUs, AI accelerators, memory, and storage, and that demand shows no signs of declining.

New Factories Won't Bring Quick Relief

The three major players, Micron, SK Hynix, and Samsung, are all building new facilities, but production capacity won't meaningfully increase until 2028 at the earliest [3]. Micron plans to bring a new Singapore fab online in 2027 and is repurposing a Taiwan facility for production in the second half of this year, but both will produce HBM for Nvidia AI GPUs rather than DRAM for PCs [3]. Micron's new DRAM factory in New York State won't reach full production until 2030 [3].

SK Hynix is constructing a new HBM factory in Cheongju, South Korea, scheduled for completion in late 2028, along with another HBM facility in Indiana slated for production in late 2028 [3]. Samsung has a new fab in Pyeongtaek, South Korea, due online in 2028 [3]. None of these companies has announced new DRAM-specific factories beyond Micron's delayed New York facility.

Why Prices May Stay Elevated Even After Capacity Increases

Even when new facilities come online, economists remain skeptical that DRAM prices will normalize quickly. Mina Kim, an economist with Mkecon Insights, notes that "prices come down much more slowly and reluctantly than they go up. DRAM today is unlikely to be an exception to this general observation, especially given the insatiable demand for compute" [3].

The supply chain dynamics favor sustained high prices. While new HBM facilities might add to overall production capacity, they don't directly address DRAM shortages for PCs and consumer electronics. The hope is that additional HBM capacity will relieve pressure on existing DRAM production lines and prevent their conversion to HBM facilities [3]. However, with major AI players like Meta, OpenAI, Nvidia, Amazon, and Microsoft accelerating their already massive AI investments in 2025, demand shows no signs of slowing [3].

For industries outside AI, the outlook remains challenging. PC manufacturers, consumer electronics makers, and other sectors dependent on affordable computer memory face years of elevated costs and constrained supply. Barring a major collapse in the AI sector, the memory shortage will persist as the semiconductor industry struggles to balance production capacity between traditional DRAM and the lucrative HBM market driving record revenue for memory makers.

TheOutpost.ai

© 2026 Triveous Technologies Private Limited