3 Sources
[1]
How Will the Tech Industry Handle the DRAM Shortage?
SK Hynix's 12-layer HBM4 memory chips on display at the SK AI Summit in Seoul, South Korea, on 3 November 2025.

If it feels these days as if everything in technology is about AI, that's because it is. And nowhere is that more true than in the market for computer memory. Demand, and profitability, for the type of DRAM used to feed GPUs and other accelerators in AI data centers is so huge that it's diverting supply of memory away from other uses and causing prices to skyrocket. According to Counterpoint Research, DRAM prices have risen 80-90 percent so far this quarter. The largest AI hardware companies say they have secured their chips out as far as 2028, but that leaves everybody else -- makers of PCs, consumer gizmos, and everything else that needs to temporarily store a billion bits -- scrambling to deal with scarce supply and inflated prices.

How did the electronics industry get into this mess, and more importantly, how will it get out? IEEE Spectrum asked economists and memory experts to explain. They say today's situation is the result of a collision between the DRAM industry's historic boom-and-bust cycle and an AI hardware infrastructure build-out that's without precedent in its scale. And, barring some major collapse in the AI sector, it will take years for new capacity and new technology to bring supply in line with demand. Prices might stay high even then.

To understand both ends of the tale, you need to know the main culprit in the supply and demand swing: high-bandwidth memory, or HBM. HBM is the DRAM industry's attempt to short-circuit the slowing pace of Moore's Law by using 3D chip packaging technology. Each HBM chip is made up of as many as 12 thinned-down DRAM chips called dies. Each die contains a number of vertical connections called through-silicon vias (TSVs). The dies are piled atop each other and connected by arrays of microscopic solder balls aligned to the TSVs. This DRAM tower -- at about 750 micrometers thick, it's more of a brutalist office block than a tower -- is then stacked atop what's called the base die, which shuttles bits between the memory dies and the processor. This complex piece of technology is then set within a millimeter of a GPU or other AI accelerator, to which it is linked by as many as 2,048 micrometer-scale connections. HBM chips are attached on two sides of the processor, and the GPU and memory are packaged together as a single unit.

The idea behind such a tight, highly connected squeeze with the GPU is to knock down what's called the memory wall: the barrier, in energy and time, to bringing the terabytes per second of data needed to run large language models into the GPU. Memory bandwidth is a key limiter on how fast LLMs can run.

As a technology, HBM has been around for more than 10 years, and DRAM makers have been busy boosting its capability. As the size of AI models has grown, so has HBM's importance to the GPU. But that's come at a cost. SemiAnalysis estimates that HBM generally costs three times as much as other types of memory and constitutes 50 percent or more of the cost of the packaged GPU.

Memory and storage industry watchers agree that DRAM is a highly cyclical industry with huge booms and devastating busts. With new fabs costing US $15 billion or more, firms are extremely reluctant to expand and may only have the cash to do so during boom times, explains Thomas Coughlin, a storage and memory expert and president of Coughlin Associates.
But building such a fab and getting it up and running can take 18 months or more, practically ensuring that new capacity arrives well past the initial surge in demand, flooding the market and depressing prices.

The origins of today's cycle, says Coughlin, go all the way back to the chip supply panic surrounding the COVID-19 pandemic. To avoid supply-chain stumbles and support the rapid shift to remote work, hyperscalers -- data center giants like Amazon, Google, and Microsoft -- bought up huge inventories of memory and storage, boosting prices, he notes. But then supply became more regular and data center expansion fell off in 2022, causing memory and storage prices to plummet. This recession continued into 2023, and even resulted in big memory and storage companies such as Samsung cutting production by 50 percent to try to keep prices from going below the costs of manufacturing, says Coughlin. It was a rare and fairly desperate move, because companies typically have to run plants at full capacity just to earn back their value.

After a recovery began in late 2023, "all the memory and storage companies were very wary of increasing their production capacity again," says Coughlin. "Thus there was little or no investment in new production capacity in 2024 and through most of 2025."

That lack of new investment is colliding headlong with a huge boost in demand from new data centers. Globally, there are nearly 2,000 new data centers either planned or under construction right now, according to Data Center Map. If they're all built, it would represent a 20 percent jump in the global supply, which stands at around 9,000 facilities now. If the current build-out continues at pace, McKinsey predicts companies will spend $7 trillion by 2030, with the bulk of that -- $5.2 trillion -- going to AI-focused data centers. Of that chunk, $3.3 trillion will go toward servers, data storage, and network equipment, the firm predicts.

The biggest beneficiary so far of the AI data center boom is unquestionably GPU-maker Nvidia. Revenue for its data center business went from barely a billion dollars in the final quarter of 2019 to $51 billion in the quarter that ended in October 2025. Over this period, its server GPUs have demanded not just more and more gigabytes of DRAM but an increasing number of DRAM chips. The recently released B300 uses eight HBM chips, each of which is a stack of 12 DRAM dies. Competitors' use of HBM has largely mirrored Nvidia's. AMD's MI350 GPU, for example, also uses eight 12-die chips.

With so much demand, an increasing fraction of the revenue for DRAM makers comes from HBM. Micron -- the number three producer behind SK Hynix and Samsung -- reported that HBM and other cloud-related memory went from being 17 percent of its DRAM revenue in 2023 to nearly 50 percent in 2025. Micron predicts the total market for HBM will grow from $35 billion in 2025 to $100 billion by 2028 -- a figure larger than the entire DRAM market in 2024, CEO Sanjay Mehrotra told analysts in December. It's reaching that figure two years earlier than Micron had previously expected. Across the industry, demand will outstrip supply "substantially... for the foreseeable future," he said.

"There are two ways to address supply issues with DRAM: with innovation or with building more fabs," explains Mina Kim, an economist with Mkecon Insights. "As DRAM scaling has become more difficult, the industry has turned to advanced packaging... which is just using more DRAM."
Micron, Samsung, and SK Hynix together make up the vast majority of the memory and storage markets, and all three have new fabs and facilities in the works. However, these expansions are unlikely to contribute meaningfully to bringing down prices for several years, so other factors will be needed to increase supply.

"Relief will come from a combination of incremental capacity expansions by existing DRAM leaders, yield improvements in advanced packaging, and a broader diversification of supply chains," says Shawn DuBravac, chief economist for the Global Electronics Association (formerly the IPC). "New fabs will help at the margin, but the faster gains will come from process learning, better [DRAM] stacking efficiency, and tighter coordination between memory suppliers and AI chip designers."

So, will prices come down once some of these new plants come on line? Don't bet on it. "In general, economists find that prices come down much more slowly and reluctantly than they go up. DRAM today is unlikely to be an exception to this general observation, especially given the insatiable demand for compute," says Kim.

In the meantime, technologies are in the works that could make HBM an even bigger consumer of silicon. The standard for HBM4 can accommodate 16 stacked DRAM dies, even though today's chips use only 12. Getting to 16 has a lot to do with chip-stacking technology. Conducting heat through the HBM "layer cake" of silicon, solder, and support material is a key limiter both to stacking higher and to repositioning HBM inside the package for even more bandwidth. SK Hynix claims a heat-conduction advantage from a manufacturing process called advanced MR-MUF (mass reflow molded underfill). Further out, an alternative chip-stacking technology called hybrid bonding could help heat conduction by reducing the die-to-die vertical distance essentially to zero. In 2024, researchers at Samsung showed they could produce a 16-high stack with hybrid bonding, and they suggested that 20 dies was not out of reach.
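To get a feel for why taller stacks make HBM such a voracious consumer of DRAM silicon, it helps to run the numbers. The Python sketch below multiplies dies per stack by stacks per package; the 24-gigabit per-die density is an assumption chosen for illustration, not a figure from the article.

# Back-of-envelope HBM capacity math. The per-die density below is an
# assumption for illustration, not a figure from the article.

GBIT_PER_DIE = 24          # assumed DRAM die density, in gigabits
GB_PER_GBIT = 1 / 8        # gigabits -> gigabytes

def stack_capacity_gb(dies_per_stack: int) -> float:
    """Capacity of one HBM stack in gigabytes."""
    return dies_per_stack * GBIT_PER_DIE * GB_PER_GBIT

STACKS_PER_GPU = 8  # e.g., the B300 described above carries 8 stacks

for dies in (12, 16):  # today's 12-high stacks vs. HBM4's 16-high ceiling
    per_stack = stack_capacity_gb(dies)
    print(f"{dies}-high stack: {per_stack:.0f} GB per stack, "
          f"{STACKS_PER_GPU * per_stack:.0f} GB per GPU package, "
          f"{STACKS_PER_GPU * dies} DRAM dies consumed")

Under these assumed numbers, a single GPU package with 12-high stacks consumes 96 DRAM dies; moving to HBM4's 16-high ceiling pushes that to 128 -- a rough illustration of why each generation of accelerator pulls more wafers away from ordinary DRAM.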
[2]
Memory makers are set to earn $551 billion from the AI boom, twice as much as contract chip manufacturers -- forecasts suggest that 2026 revenue will skyrocket thanks to data center demand
The artificial intelligence supercycle is reshaping the semiconductor and electronics industries, as the scale of the AI infrastructure buildout strains the entire supply chain. While developers of AI accelerators like Nvidia are cashing in on the AI boom, it's memory makers that will earn the most cash, according to estimates from TrendForce. Arguably, this is a result of the different business models and expansion strategies memory makers use compared to foundries, in addition to the behavior of the commodity market.

The company projects that while global foundry revenue is expected to total $218.7 billion, 3D NAND and DRAM revenue will reach $551.6 billion, which means the total market for memory is more than twice as large as contract chip production. TrendForce attributes this to structural market changes caused by AI buildouts, which create elevated demand for specific types of memory, producing shortages of all types of memory and therefore affecting prices across the industry. As a consequence, even though the AI industry does not need low-capacity commodity memory devices, those too become substantially more expensive amid tight supply. This creates the perfect conditions for memory makers to capitalize on.

Indeed, the spot price of a 16 Gb DDR5 chip at DRAMeXchange was $38 on average, with a daily high of $53 and a daily low of $25. By contrast, the very same chip cost $4.75 on average just one year ago ($3.70 session low, $6.60 session high). Similar changes have occurred in the prices of 3D NAND memory in recent quarters.

Like some other analyst firms, TrendForce calls the AI megatrend a 'supercycle,' indicating its overwhelming ubiquity, its effect on multiple industries, and its potential length. There were two periods in the last few decades when the revenue of memory makers grew significantly year over year for two years in a row: in 2017-2018, when hyperscalers built their vast data centers (+62% in 2017 and +27% in 2018), and in 2020-2021, when people increased purchases of PCs amid the COVID-19 pandemic. In both cases, memory makers increased capacity to meet demand and maintain their market share, which caused sharp revenue drops in 2019 and 2022. The foundry industry -- which is much more capital-intensive than the 3D NAND or DRAM industries and uses fabs that are harder and slower to build -- suffered a year-over-year revenue decline only in 2023.

The situation today is vastly different. On the one hand, leading developers of frontier AI models need the most powerful clusters to train their models, creating demand for leading-edge hardware with expensive HBM3E memory and plenty of storage. On the other hand, these companies and their clients need ever more powerful inference systems to use those models. Therefore, demand for CPUs, AI accelerators, memory, and storage does not decline over time. Meanwhile, buyers like cloud service providers (CSPs) tend to be less sensitive to price increases, which is why 3D NAND and DRAM suppliers are expected to raise average selling prices more aggressively than in past cycles.

3D NAND and DRAM are commodities, so their prices behave like commodity prices, reacting almost immediately to tightening supply, increasing demand, or sentiment among buyers. While large PC makers purchase their memory at prices agreed upon every six months, a significant portion of memory is sold on the spot market.
This dynamic is reflected in TrendForce's projections, which show memory revenue growth accelerating after the downturn of 2022-2023: an 80% increase in 2024, followed by 46% growth in 2025 and a projected 134% surge in 2026. By contrast, foundries tend to operate under long-term agreements that smooth price fluctuations, preventing the sharp swings that characterize memory markets. Even during periods of strong demand, foundry pricing adjustments occur gradually, which means slower revenue growth compared to memory vendors. TrendForce models that, following a 19% year-over-year revenue increase in 2024, the foundry market grew 25% in 2025 and will grow another 25% this year.

As a result, boosted by the AI supercycle and not constrained by long-term agreements, memory vendors will earn more than twice as much this year as producers of logic, who have to adhere to their long-term contracts. With HBM4 memory devices using four times more silicon than typical DRAM ICs, memory makers clearly cannot meet all the existing demand with their current capacity, which results in price adjustments. The biggest open question, though, is how much of today's commodity 3D NAND and DRAM pricing reflects genuinely insufficient supply, and how much reflects typical commodity-market behavior, in which customers buy more memory as it gets more expensive for fear it will get more expensive still.
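As a quick sanity check on those figures, the short Python sketch below computes the year-over-year spot-price multiple for that 16 Gb DDR5 chip and compounds the TrendForce growth rates into simple revenue indices. The prices and growth rates come from the article above; the arithmetic itself is only illustrative.

# Sanity-checking the figures above: prices and growth rates are from
# the article; the arithmetic is only illustrative.

# 16 Gb DDR5 spot price: $38 average today vs. $4.75 a year ago.
print(f"DDR5 spot-price multiple: {38 / 4.75:.1f}x year over year")

def compound(index: float, growth_rates: list[float]) -> float:
    """Apply successive year-over-year growth rates to a revenue index."""
    for rate in growth_rates:
        index *= 1 + rate
    return index

# TrendForce: memory +80% (2024), +46% (2025), +134% (2026);
# foundry +19% (2024), +25% (2025), +25% (2026).
memory = compound(100, [0.80, 0.46, 1.34])
foundry = compound(100, [0.19, 0.25, 0.25])
print(f"Memory revenue index (2023 = 100):  {memory:.0f}")   # ~615
print(f"Foundry revenue index (2023 = 100): {foundry:.0f}")  # ~186

On these rates, the DDR5 spot price is up roughly 8x in a year, and memory revenue would roughly sextuple off its 2023 base by 2026 while foundry revenue would not quite double -- consistent with the much steeper trajectory behind the $551.6 billion versus $218.7 billion projection.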
[3]
The three big players in computer memory are all building new factories but it probably won't help DRAM prices until 2028, if then
Micron, SK Hynix and Samsung are all tooling up, but don't expect PC memory prices to improve for years to come.

The three biggest players in computer memory are all tooling up major new factories, but the main impact of that won't come until 2028, and even then it may not be enough to normalise spiralling memory prices. This analysis comes from IEEE Spectrum, which has published an overview of the current memory market and insights into how it may all play out.

The publication notes that Micron is likely to be the first to get new facilities online, with a new fab in Singapore that should go into production in 2027. It's also repurposing another facility in Taiwan, with plans for production in the second half of next year. The slight snag is that those facilities will be producing HBM memory as used by Nvidia's AI GPUs, not DRAM for PCs. Micron does actually have plans for a new DRAM factory in New York State, but IEEE Spectrum says that won't be in full production until 2030. Yikes.

SK Hynix, meanwhile, is building a new HBM factory in Cheongju, South Korea, which should be "complete" in late 2028, and another HBM facility in Indiana slated for production in late 2028. Finally, Samsung has a new fab in Pyeongtaek, South Korea, that's due to come online in 2028. There's no specific mention of new DRAM factories from SK Hynix or Samsung.

Now, you could argue that these new facilities add to overall production capacity whatever type of memory they make. So that's a good thing for DRAM, and thus PC memory prices, even if the fabs aren't expressly being erected for that purpose. Perhaps they will take the pressure off existing DRAM production lines, which after all were enough to serve the PC market before the AI boom came along, and prevent them from being converted into HBM facilities.

However, IEEE Spectrum notes that economists aren't convinced these new facilities will correct memory prices immediately. "In general, economists find that prices come down much more slowly and reluctantly than they go up. DRAM today is unlikely to be an exception to this general observation, especially given the insatiable demand for compute," it quotes Mina Kim, an economist with Mkecon Insights, as saying.

Underlying all this, of course, is the aforementioned AI boom. IEEE Spectrum notes a forecast from consultants McKinsey that predicts a staggering $7 trillion total investment in new data centres by 2030, $5.2 trillion of which will be blown on AI-focused data centres. With that kind of money being tossed around, it's no wonder memory prices aren't expected to go back to normal. Indeed, Micron predicts that the market for HBM, which until recently was a fairly niche technology, will grow to $100 billion annually in 2028, or about the same as the entire DRAM market in 2024.

All told, this squares pretty well with our recent observations around the plans of big AI players like Meta, OpenAI, Nvidia, Amazon and Microsoft. Long story short, however bananas their investments in AI have been up to this point, that is only accelerating this year and on an epic scale. It may be a familiar refrain then, but it bears repeating and really absorbing. It really does look like this is all going to get substantially worse before it gets better. And it's likely going to take years to resolve. Sorry.
DRAM prices have surged 80-90% this quarter as demand from AI data centers diverts supply from PCs and consumer devices. Memory makers are projected to earn $551 billion in 2026, more than double what contract chip manufacturers will make. Despite new factories from SK Hynix, Samsung, and Micron, economists warn that relief won't come until 2028 at the earliest, with prices likely staying elevated even then.
The tech industry faces a critical memory crunch as DRAM prices have surged 80-90% so far this quarter, driven by insatiable demand from AI data centers [1]. The shortage stems from a collision between the memory market's historic boom-and-bust cycles and an AI infrastructure buildout unprecedented in scale. While major AI hardware companies have secured their chips through 2028, PC makers, consumer electronics manufacturers, and other industries relying on computer memory find themselves scrambling to manage scarce supply and inflated prices [1].

The dramatic price increase reflects how thoroughly the AI boom has reshaped the semiconductor industry. According to TrendForce, memory makers are projected to earn $551.6 billion in 2026, more than twice the $218.7 billion expected for contract chip manufacturers [2]. This disparity highlights how AI data centers create elevated demand for specific memory types, causing shortages across all categories and affecting prices industry-wide.

At the heart of the supply and demand imbalance sits high-bandwidth memory (HBM), the DRAM industry's answer to Moore's Law limitations through 3D chip packaging technology [1]. Each HBM chip comprises up to 12 thinned-down DRAM dies stacked vertically and connected by microscopic solder balls, creating what's essentially a memory tower about 750 micrometers thick. These complex structures are then mounted within a millimeter of GPUs or other AI accelerators, connected by as many as 2,048 micrometer-scale connections to break down the memory wall -- the barrier in energy and time required to bring terabytes per second of data into processors running large language models [1].
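To make the memory wall concrete: generating a single token with a large language model requires streaming roughly the entire set of model weights through the processor. Here is a minimal Python sketch of that bound; the model size, quantization, and aggregate bandwidth are assumed numbers for illustration, not figures from the sources.

# A minimal sketch of the memory-wall arithmetic for LLM inference.
# All numbers below are assumptions for illustration, not figures
# from the sources.

MODEL_PARAMS = 70e9        # assumed 70-billion-parameter model
BYTES_PER_PARAM = 1        # assumed 8-bit quantized weights
HBM_BANDWIDTH = 8e12       # assumed ~8 TB/s aggregate over 8 stacks

# Each generated token streams all weights through the GPU once.
bytes_per_token = MODEL_PARAMS * BYTES_PER_PARAM
tokens_per_second = HBM_BANDWIDTH / bytes_per_token
print(f"Bandwidth-bound ceiling: ~{tokens_per_second:.0f} tokens/s")

Under these assumptions the ceiling is roughly 114 tokens per second per GPU no matter how much compute is available, which is why HBM bandwidth, not raw FLOPS, is the constraint all this packaging effort works to relieve.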
While HBM technology has existed for over a decade, its importance has exploded alongside AI model growth. SemiAnalysis estimates that HBM generally costs three times as much as other memory types and constitutes 50% or more of the cost of packaged AI GPUs [1]. Micron predicts the HBM market will reach $100 billion annually by 2028, roughly equivalent to the entire DRAM market in 2024 [3]. With HBM4 memory devices using four times more silicon than typical DRAM chips, production capacity constraints become even more severe [2].

The roots of today's memory shortage trace back to the COVID-19 pandemic, when hyperscalers like Amazon, Google, and Microsoft bought massive inventories to support remote-work transitions, boosting prices [1]. But when supply normalized and data center expansion slowed in 2022, memory prices plummeted. The recession continued into 2023, forcing major players like Samsung to cut production by 50% to prevent prices from falling below manufacturing costs -- a rare and desperate move in an industry where fabs must typically run at full capacity just to justify their value [1].

Thomas Coughlin, president of Coughlin Associates and a storage and memory expert, explains that the memory market operates in a highly cyclical pattern. With new fabs costing $15 billion or more, firms hesitate to expand and may only have capital during boom periods [1]. Building such facilities takes 18 months or longer, virtually guaranteeing that new production capacity arrives well after demand surges, flooding markets and depressing prices. After the 2022-2023 downturn, memory companies remained wary of increasing production capacity, resulting in little to no investment in new facilities in 2024 and through most of 2025 [1].
This lack of investment now collides with explosive demand from nearly 2,000 new data centers either planned or under construction globally [1]. McKinsey forecasts a staggering $7 trillion total investment in new data centers by 2030, with $5.2 trillion dedicated specifically to AI-focused facilities [3].

The memory market's commodity nature means prices react almost immediately to supply tightening, demand increases, or buyer sentiment shifts [2]. At DRAMeXchange, the spot price of a 16 Gb DDR5 chip averaged $38, with daily highs reaching $53 and lows at $25. Just one year ago, that same chip cost $4.75 on average, with session lows at $3.70 and highs at $6.60 [2]. Similar price increases have affected 3D NAND memory in recent quarters.

TrendForce projects memory revenue growth accelerating dramatically: an 80% increase in 2024, followed by 46% growth in 2025 and a projected 134% surge in 2026 [2]. By contrast, the foundry industry -- operating under long-term agreements that smooth price fluctuations -- saw 19% year-over-year revenue growth in 2024, followed by 25% in 2025 and a projected 25% in 2026.

Cloud service providers and AI companies prove less price-sensitive than traditional buyers, enabling memory makers to raise average selling prices more aggressively than in past cycles [2]. The AI boom creates demand not just for powerful training clusters with expensive HBM3E memory, but also for inference systems requiring CPUs, AI accelerators, memory, and storage -- demand that shows no signs of declining.
The three major players -- Micron, SK Hynix, and Samsung -- are all building new facilities, but production capacity won't meaningfully increase until 2028 at the earliest [3]. Micron plans to bring a new Singapore fab online in 2027 and is repurposing a Taiwan facility for production in the second half of next year, but both will produce HBM for Nvidia AI GPUs rather than DRAM for PCs [3]. Micron's new DRAM factory in New York State won't reach full production until 2030 [3].

SK Hynix is constructing a new HBM factory in Cheongju, South Korea, scheduled for completion in late 2028, along with another HBM facility in Indiana slated for production in late 2028 [3]. Samsung has a new fab in Pyeongtaek, South Korea, due online in 2028 [3]. None of these companies have announced specific new DRAM factories beyond Micron's far-off New York facility.

Even when new facilities come online, economists remain skeptical that DRAM prices will normalize quickly. Mina Kim, an economist with Mkecon Insights, notes that "prices come down much more slowly and reluctantly than they go up. DRAM today is unlikely to be an exception to this general observation, especially given the insatiable demand for compute" [3].

The supply chain dynamics favor sustained high prices. While new HBM facilities might add to overall production capacity, they don't directly address DRAM shortages for PCs and consumer electronics. The hope is that additional HBM capacity will relieve pressure on existing DRAM production lines and prevent their conversion to HBM facilities [3]. However, with major AI players like Meta, OpenAI, Nvidia, Amazon, and Microsoft accelerating their already massive AI investments, demand shows no signs of slowing [3].

For industries outside AI, the outlook remains challenging. PC manufacturers, consumer electronics makers, and other sectors dependent on affordable computer memory face years of elevated costs and constrained supply. Barring a major collapse in the AI sector, the memory shortage will persist as the semiconductor industry struggles to balance production capacity between traditional DRAM and the lucrative HBM market driving record revenue for memory makers.
Summarized by Navi
14 Dec 2025 • Business and Economy