16 Sources
[1]
How much of the AI data center boom will be powered by renewable energy?
According to a new report from the International Energy Agency, the world will spend $580 billion on data centers this year -- $40 billion more than will be spent finding new oil supplies. Those numbers help to illustrate some big shifts in the global economy, and comparing data centers and oil seems particularly apt given concerns about how generative AI might accelerate climate change. Kirsten Korosec, Rebecca Bellan, and I discussed the report's findings on the latest episode of TechCrunch's Equity podcast. There's no question that these new data centers are going to be hungry for power, and that they could place even more stress on already taxed electrical grids. But Kirsten pointed to a potential upside, with solar poised to power many of these new projects, which could also create new opportunities for startups pursuing innovative approaches to renewable energy. We also discussed how these projects will be funded, with OpenAI saying it has committed $1.4 trillion to building data centers, Meta committing $600 billion, and Anthropic recently announcing a $50 billion data center plan. You can read a preview of our conversation, edited for length and clarity, below.

Kirsten: Here's what I think is the potential upside. So Tim De Chant, who's our climate tech reporter, has done a ton of reporting about not just data centers, but actually how a lot of data centers are turning to renewables because in terms of regulatory [hurdles] and cost, they are the go-to. It's a lot easier to get a permit to throw up a bunch of solar panels adjacent to a data center. So to me, the one upside is that it could really mean a positive for any kind of company that is doing interesting things around renewables or data center design and some of the technology to reduce the global emissions component of it. But of course, the sheer number to me is what really stood out. As a former energy reporter myself, I know how much is spent on trying to find new oil.

Rebecca: I mean, it's a lot.
And a lot of that's coming from the U.S. I think that report found that half of the electricity demand will be coming from the U.S., and the rest is a mix of China and Europe. And another thing that struck me about it was that most of the data centers are coming to cities, or near cities, like populations of a million people, roughly. So that means there's a lot more challenge with the grid connection and with connection pathways. I think that, to your point, renewables will have to [be a focus] -- it's just good business, it's not because of any environmentally friendly policies.

Kirsten: Redwood Materials' new business unit, Redwood Energy, is going to be an interesting company to watch with this. A few months ago, I went to their big reveal, and they're taking the old EV batteries that aren't quite ready to be recycled, and then they're creating these microgrids, and then specifically going after AI data centers. And that, to me, would alleviate the problem or the concern that you just mentioned. The question is: Are other companies going to do this? Are there other Redwood Energies out there that are trying to do the same thing? And how much of an impact could they make? Because I do think that like the pressure on the electrical grid, especially during certain times of the year, like in the middle of the summer, for instance, places like Texas that have rolling brownouts and blackouts, that is going to be a real concern. And it could spur a whole new kind of investment into companies doing what Redwood is doing.

Anthony: It also underlines this question about what is that going to do to the spaces that we live in? Even if they're not in cities themselves, I feel like the landscape is definitely going to be transformed by construction at this scale. And then, of course, there's also this question of how much of [the planned data centers are] actually going to get built because there's definitely very ambitious plans that require huge amounts of spending.
To start with OpenAI, that's a company that a lot of people have been talking about, how much money are they actually making versus the trillions of dollars of capital commitments they have for the next decade. And then there was this whole controversy over their CFO saying, "The government should backstop our loans to build these data centers." And then she's like, "No, no, no, no, no, I didn't mean backstop, that was a poor choice of words," but it does look like they have been asking for an expansion of tax credits from the CHIPS Act. I think that this is going to be an effort that's not just going to fall on the companies, but also on the government -- or at least that's going to be a question that the government is considering over the next few years.
[2]
The State of AI: Energy is king, and the US is falling behind
In the age of AI, the biggest barrier to progress isn't money but energy. That should be particularly worrying here in the US, where massive data centers are waiting to come online, and it doesn't look as if the country will build the steady power supply or infrastructure needed to serve them all. It wasn't always like this. For about a decade before 2020, data centers were able to offset increased demand with efficiency improvements. Now, though, electricity demand is ticking up in the US, with billions of queries to popular AI models each day -- and efficiency gains aren't keeping pace. With too little new power capacity coming online, the strain is starting to show: Electricity bills are ballooning for people who live in places where data centers place a growing load on the grid. If we want AI to have the chance to deliver on big promises without driving electricity prices sky-high for the rest of us, the US needs to learn some lessons from the rest of the world on energy abundance. Just look at China. China installed 429 GW of new power generation capacity in 2024, more than six times the net capacity added in the US during that time. China still generates much of its electricity with coal, but that makes up a declining share of the mix. Rather, the country is focused on installing solar, wind, nuclear, and gas at record rates. The US, meanwhile, is focused on reviving its ailing coal industry. Coal-fired power plants are polluting and, crucially, expensive to run. Aging plants in the US are also less reliable than they used to be, generating electricity just 42% of the time, compared with a 61% capacity factor in 2014. It's not a great situation. And unless the US changes something, we risk becoming consumers as opposed to innovators in both energy and AI tech. Already, China earns more from exporting renewables than the US does from oil and gas exports.
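The capacity-factor decline quoted above translates directly into lost output. A quick back-of-the-envelope sketch (illustrative arithmetic only, using the 42% and 61% figures from the text):

```python
HOURS_PER_YEAR = 8760

def annual_gwh(capacity_gw: float, capacity_factor: float) -> float:
    """Annual energy output: power x hours x utilization."""
    return capacity_gw * HOURS_PER_YEAR * capacity_factor

# One gigawatt of US coal capacity, then and now
output_2014 = annual_gwh(1.0, 0.61)    # ~5,344 GWh/year at a 61% capacity factor
output_today = annual_gwh(1.0, 0.42)   # ~3,679 GWh/year at 42%
shortfall = output_2014 - output_today # ~1,664 GWh/year lost per installed GW
```

In other words, the same coal fleet now delivers roughly a third less energy per installed gigawatt than it did in 2014.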
[3]
Data centers now attract more investment than the search for new oil supplies
If there's any question about whether data centers are driving the global economy, a new report from the International Energy Agency should dispel any doubts. This year, the world will spend $580 billion on data centers, $40 billion more than it will spend on new oil supplies. "This point of comparison provides a telling marker of the changing nature of modern, highly digitalized economies," the agency said in the report. Electricity consumption from AI data centers is expected to grow fivefold by the end of the decade, doubling the total used by all data centers today. Conventional data centers will also consume more energy, though the increase won't be nearly as dramatic. Fully half of that demand growth is expected to occur in the U.S., the IEA said, with the bulk of the remainder occurring in Europe and China. Most new data centers are being developed in or near large cities with populations exceeding 1 million people, the agency said. Half of those in the pipeline are at least 200 megawatts, and most are being built near other data centers. "This rapid build out of data centers -- especially in clusters and around urban areas -- comes with challenges," the IEA wrote. "Grid congestion and connection queues are increasing in many regions, and connection queues for new data centers are often already long." In some markets like northern Virginia, grid connection waits can be as long as a decade. In Europe, Dublin has paused new interconnection requests entirely until 2028. The supply chain for the grid is another pinch point, with shortages of cables, critical minerals, gas turbines, and transformers delaying upgrades, the agency said. Some companies, like Amperesand and Heron Power, are working on solid-state transformers, which promise to be a significant upgrade over the century-old technology that currently manages parts of the grid. They can integrate renewables more deftly, react swiftly to grid instabilities, and handle a range of conversions.
But first deployments are at least a year or two away, and it'll take a while to ramp up production. The IEA expects renewables to supply the majority of new data center power by 2035, regardless of whether countries maintain their current policies or more aggressively pursue lower emissions. Solar, which has come down in cost significantly in recent years, has become a particular favorite of developers. Over the next decade, around 400 terawatt-hours of electricity for data centers will come from renewables, whereas natural gas will supply around 220 terawatt-hours. If small modular nuclear power plants deliver on their promises, the IEA expects they'll contribute 190 terawatt-hours to data centers.
[4]
OpenAI's colossal AI data center targets would consume as much electricity as entire nation of India -- 250GW target would require 30 million GPUs annually to ensure continuous operation, emit twice as much carbon dioxide as ExxonMobil
OpenAI CEO Sam Altman released an internal memo in September 2025 stating that he plans to build up to 250 gigawatts of compute capacity by 2033. According to Truthdig, this is equivalent to the electricity required to power the entire nation of India and its 1.5 billion citizens. It would also emit twice the carbon dioxide produced by ExxonMobil, which the report says is currently the "largest non-state carbon emitter" in the world. Beyond carbon emissions, the report's author, Alistair Alexander, estimates that the 250 GW capacity is enough to support 60 million Nvidia GB300 GPUs. This means that OpenAI would have to order 30 million GPUs annually to ensure continuous operation. While Alexander claims this is because the cards are run 24/7, 365 days a year, and therefore have a short lifespan, the two-year life cycle of a GPU actually refers to its economic value, which drops with the release of newer generations of products. Either way, the number of GPUs required to power OpenAI's ambition will be staggering. This estimate is for just one company: other tech giants are also planning their own massive AI data centers -- for example, Elon Musk's xAI wants to have 50 million H100-equivalent AI GPUs by 2030, which will require around 5GW of power. This massive build-out, which is happening all over the globe, is squeezing power supply, causing a spike in electricity prices, and reducing power quality for households. There's also the question of its impact on the water supply, especially as the massive compute capacity requires astronomical amounts of cooling. Alexander also looked upstream and investigated the impact of chip manufacturing on the environment. The massive demand for AI processors and the billions of dollars companies are willing to spend on them has led to an explosion of new fabs.
In the past two years, construction on 97 new fabs has started all over the globe, with Taiwan Semiconductor Manufacturing Company (TSMC) and Samsung among the builders. Much like the data centers, these chip factories require vast quantities of water and massive amounts of electricity to operate. More than that, the processes for making the most advanced chips often require toxic chemicals. For example, the report says that TSMC's Fab 25 would require at least 1GW of power -- enough for 750,000 households in Taiwan. Environmental Rights Foundation deputy CEO Po-Jen Hsu added that Fab 25 would use 100,000 metric tons of water daily, or about the same amount that 196,000 Taichung residents would use in a day. SHARPS, the semiconductor labor organization in South Korea, has also said that some workers in Samsung factories were suffering from various forms of cancer, all of which were linked with the chemicals used in their workplaces. The race for AI supremacy is putting a lot of demand on the limited resources we have on Earth. And the issues aren't limited to data centers: they extend all the way up the supply chain -- from manufacturing the most advanced chips to the mining operations needed to extract the rare earths and other materials needed to make these GPUs. Tech companies are pouring billions of dollars into hardware and infrastructure to extend their capabilities, while nations are engaging in a trade war to limit the advance of their rivals. The report concludes: "As Silicon Valley CEOs anxiously figure how much computing it will take to propel artificial intelligence forward, the real question we should be asking is how much more artificial intelligence the planet can take."
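The headline numbers in this report can be sanity-checked with simple arithmetic (a rough sketch; the per-GPU wattage here is an implied all-in figure including cooling and facility overhead, not a GB300 specification):

```python
TOTAL_CAPACITY_W = 250e9   # OpenAI's 2033 target: 250 GW
GPU_COUNT = 60e6           # Alexander's estimate of supported GB300 GPUs
LIFESPAN_YEARS = 2         # economic life cycle cited in the article

implied_w_per_gpu = TOTAL_CAPACITY_W / GPU_COUNT  # ~4,167 W all-in per GPU
annual_replacements = GPU_COUNT / LIFESPAN_YEARS  # 30 million GPUs per year

# If the full 250 GW ran around the clock, annual consumption would be:
annual_twh = TOTAL_CAPACITY_W * 8760 / 1e12       # 2,190 TWh
# India's yearly electricity use is on the order of 1,800-2,000 TWh,
# which is why the report draws that comparison.
```

The 30-million-GPUs-per-year figure falls straight out of the two-year refresh cycle, whichever interpretation of GPU "lifespan" one prefers.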
[5]
Silicon Valley data centers totalling nearly 100MW could 'sit empty for years' due to lack of power -- huge installations are idle because Santa Clara can't cope with surging electricity demands
Two major facilities built for AI-era workloads remain unpowered while the city races to expand its electricity supply. In the heart of Silicon Valley, two freshly built data centers designed for the world's most power-hungry computing workloads are standing empty. Digital Realty's four-story SJC37 facility and Stack Infrastructure's SVY02A campus in Santa Clara, California, were both constructed to host tens of megawatts of high-density IT hardware. Instead, they're waiting for electricity. According to a Bloomberg report, both projects are complete but idle, with no firm timeline for full energization. Digital Realty's 430,000-square-foot site was built for 48 megawatts of critical load. Stack's nearby SVY02A campus -- also designed for 48 megawatts -- includes its own substation and eight data halls. Together, they represent nearly 100 megawatts of capacity ready for servers, accelerators, and networking gear that cannot be switched on until the local grid catches up. According to the report, both "may sit empty for years." Santa Clara's publicly owned utility, Silicon Valley Power (SVP), is racing to expand supply to meet surging demand from data center operators. The city has 57 active or in-progress facilities and is investing $450 million in grid upgrades scheduled for completion by 2028. SVP told Bloomberg it is sequencing power delivery among customers as new substations and transmission lines come online. The city's challenges mirror those emerging across the country. Northern Virginia, the largest data center market in the U.S., has faced multi-year connection delays as utilities struggle to reinforce high-voltage infrastructure. Regions in the Pacific Northwest and the Southeast are also reporting wait times of two to five years for new capacity. Recently, Microsoft admitted it has GPUs sitting idle because it has no power for them. Silicon Valley remains prime real estate for operators chasing low-latency proximity to users and AI developers. 
Nvidia's headquarters sits just minutes away from both idle sites, a reminder that even the global leader in GPU compute can't accelerate grid construction. The scale of modern AI clusters, often measured in hundreds of megawatts, is pushing local networks to their limits. Digital Realty and Stack both told Bloomberg they are coordinating with SVP to phase in power delivery as upgrades progress. But with AI infrastructure expanding faster than transmission projects can be approved, the gap between completed buildings and available electricity is likely to widen. Although the servers are ready, the power isn't.
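To put the idle capacity in perspective, here is a rough sketch of the capital sitting unpowered, using the $7.3-13.3 million per-megawatt construction cost range that Savills reports elsewhere in this digest (illustrative only; the actual costs of these two projects were not disclosed):

```python
IDLE_MW = 48 + 48  # SJC37 plus SVY02A: ~96 MW of built-but-unpowered capacity

# Industry-average construction cost range (Savills, source [6])
COST_PER_MW_LOW = 7.3e6
COST_PER_MW_HIGH = 13.3e6

stranded_low = IDLE_MW * COST_PER_MW_LOW    # ~$0.70 billion
stranded_high = IDLE_MW * COST_PER_MW_HIGH  # ~$1.28 billion
```

Even at the low end of the range, roughly three-quarters of a billion dollars of data center shell is waiting on the grid.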
[6]
AI demand is leading to major data center expansions - but do they have the power to fully operate?
Developers are looking at non-traditional locations for power and land

Ongoing and evolving AI use has created a surge in demand for data centers, raising worrying sustainability concerns, but new research from Savills has revealed we might not even have enough energy to power them in the first place. The firm notes only 850MW of power has been delivered across the EMEA region in 2025 to date, 11% less than last year. This is particularly concerning, as IDC research projects AI spending could increase to $144.6 billion by 2030, a four-year compound annual growth rate of over 30%. Savills notes a 12% rise in live capacity over the past year across established hubs like France, Germany, the UK, Ireland and the Netherlands. EMEA Data Center Advisory Director Cameron Bell projected "further upward pressure on pricing" throughout the remainder of 2025: "The persistent imbalance between surging demand and restricted supply continues to underpin rental values." New take-up now stands at 845MW, with around a quarter of take-up now pre-let, compared with less than one-fifth three years ago. Occupancy rates also rose to 91% in Q3 2025, up four percentage points in three years. Looking ahead, two in five data centers could face power constraints by 2027, per Gartner forecasts, with AI-optimized server power needs expected to hit 500TWh -- a 2.5x increase compared with 2023. Companies are also being forced to balance these power concerns with rising construction costs, currently averaging $7.3-13.3 million per megawatt. Labor shortages, land scarcity and supply chain issues were blamed for 17-28% cost rises in Copenhagen, Stockholm, Warsaw and Vienna. This could be why non-traditional locations like Portugal (60%), Saudi Arabia (49%), Spain (25%), the UAE (20%) and Sweden (17%) saw the biggest increases in live capacity, compared with the 12% rise seen across established hubs.
Looking ahead, developers have been forging stronger supplier relationships and exploring markets with more accessible power and land to respond to ongoing demand.
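The Gartner projection implies a baseline worth making explicit (simple arithmetic on the numbers quoted above, reading "2.5x increase" as 2.5 times the 2023 level):

```python
PROJECTED_TWH = 500    # AI-optimized server power needs (Gartner forecast)
GROWTH_MULTIPLE = 2.5  # "a 2.5x increase compared with 2023"

implied_2023_twh = PROJECTED_TWH / GROWTH_MULTIPLE  # 200 TWh baseline in 2023
added_twh = PROJECTED_TWH - implied_2023_twh        # 300 TWh of new demand
```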
[7]
The AI revolution has a power problem
In the race for AI dominance, American tech giants have the money and the chips, but their ambitions have hit a new obstacle: electric power. "The biggest issue we are now having is not a compute glut, but it's the power and...the ability to get the builds done fast enough close to power," Microsoft CEO Satya Nadella acknowledged on a recent podcast with OpenAI chief Sam Altman. "So if you can't do that, you may actually have a bunch of chips sitting in inventory that I can't plug in," Nadella added. Echoing the 1990s dotcom frenzy to build internet infrastructure, today's tech giants are spending unprecedented sums to construct the silicon backbone of the revolution in artificial intelligence. Google, Microsoft, AWS (Amazon), and Meta (Facebook) are drawing on their massive cash reserves to spend roughly $400 billion in 2025 and even more in 2026 -- backed for now by enthusiastic investors. All this cash has helped alleviate one initial bottleneck: acquiring the millions of chips needed for the computing power race, and the tech giants are accelerating their in-house processor production as they chase global leader Nvidia. These will go into the racks that fill the massive data centers -- which also consume enormous amounts of water for cooling. Building the massive information warehouses takes an average of two years in the United States; bringing new high-voltage power lines into service takes five to 10 years.

Energy wall

The "hyperscalers," as major tech companies are called in Silicon Valley, saw the energy wall coming. A year ago, Virginia's main utility provider, Dominion Energy, already had a data-center order book of 40 gigawatts -- equivalent to the output of 40 nuclear reactors. The capacity it must deploy in Virginia, the world's largest cloud computing hub, has since risen to 47 gigawatts, the company announced recently.
Already blamed for inflating household electricity bills, data centers in the United States could account for 7% to 12% of national consumption by 2030, up from 4% today, according to various studies. But some experts say the projections could be overblown. "Both the utilities and the tech companies have an incentive to embrace the rapid growth forecast for electricity use," Jonathan Koomey, a renowned expert from UC Berkeley, warned in September. As with the late 1990s internet bubble, "many data centers that are talked about and proposed and in some cases even announced will never get built."

Emergency coal

If the projected growth does materialize, it could create a 45-gigawatt shortage by 2028 -- equivalent to the consumption of 33 million American households, according to Morgan Stanley. Several US utilities have already delayed the closure of coal plants, despite coal being the most climate-polluting energy source. And natural gas, which powers 40% of data centers worldwide, according to the International Energy Agency, is experiencing renewed favor because it can be deployed quickly. In the US state of Georgia, where data centers are multiplying, one utility has requested authorization to install 10 gigawatts of gas-powered generators. Some providers, as well as Elon Musk's startup xAI, have rushed to purchase used turbines from abroad to build capacity quickly. Even recycling aircraft turbines, an old niche solution, is gaining traction. "The real existential threat right now is not a degree of climate change. It's the fact that we could lose the AI arms race if we don't have enough power," Interior Secretary Doug Burgum argued in October.

Nuclear, solar, and space?

Tech giants are quietly downplaying their climate commitments. Google, for example, promised net-zero carbon emissions by 2030 but removed that pledge from its website in June. Instead, companies are promoting long-term projects.
Amazon is championing a nuclear revival through Small Modular Reactors (SMRs), an as-yet experimental technology that would be easier to build than conventional reactors. Google plans to restart a reactor in Iowa in 2029. And the Trump administration announced in late October an $80 billion investment to begin construction on ten conventional reactors by 2030. Hyperscalers are also investing heavily in solar power and battery storage, particularly in California and Texas. The Texas grid operator plans to add approximately 100 gigawatts of capacity by 2030 from these technologies alone. Finally, both Elon Musk, through his Starlink program, and Google have proposed putting chips in orbit, powered by solar energy. Google plans to conduct tests in 2027.
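The household-equivalence figures quoted across this digest are mutually consistent, which is a useful sanity check (rough arithmetic only):

```python
# Morgan Stanley: a 45 GW shortage equals 33 million US households
shortage_w_per_home = 45e9 / 33e6       # ~1,364 W average draw per home

# Elsewhere in the digest: one gigawatt powers nearly 750,000 US homes
rule_of_thumb_w_per_home = 1e9 / 750e3  # ~1,333 W average draw per home

# Both imply an average US household load of roughly 1.3-1.4 kW,
# i.e. on the order of 12,000 kWh per year.
```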
[9]
Big Tech's climate strategists feeling strain of AI power needs
Weeks after ChatGPT was unleashed on the world in November 2022, sustainability executives at Microsoft Corp. realized they had a big problem. On the tech giant's 500-acre campus in Redmond, Washington, teams began holding regular "triage" meetings to confront serious questions posed by the artificial intelligence boom: Where would the company find the gigawatts -- just one gigawatt can power nearly 750,000 US homes -- needed for data centers? And how could Microsoft possibly secure that extra energy while still making progress toward a long-standing goal of going carbon-negative? The AI discussions were "interesting and terrifying all at the same time," said Brian Janous, who served until August 2023 as Microsoft's vice president of energy. Microsoft and other major tech companies, he said, had to "look at the climate commitments they set and say, 'Can I still do this?'" Nearly three years later, Microsoft and rivals including Amazon.com Inc., Meta Platforms Inc. and Alphabet Inc.'s Google are still struggling to answer that question. On one hand, they're trying to obtain every electron possible to power their trillion-dollar bet on AI. On the other, they're trying to stay true to a goal of achieving net-zero carbon emissions by 2040 or sooner. "There's no question that the current push to develop AI infrastructure is putting a strain on the climate commitments of the big tech companies, all of which were made prior to the advent of AI," Janous said in an interview. After leaving Microsoft, he co-founded Cloverleaf Infrastructure, which partners with utilities to develop clean-powered sites that supply the largest data-center providers. US President Donald Trump isn't making this challenge any easier. Since taking office, he's slashed federal funding for green initiatives, such as wind and solar, and signaled his support for powering AI with generators that use fossil fuels, including aging coal-fired plants. 
Wary of irritating their biggest ally in Washington, tech leaders so far have refrained from publicly challenging the president over his campaign against renewables. For now, the so-called hyperscalers continue to buy clean energy at a record pace, with Meta, Amazon, Google and Microsoft ranking as the biggest corporate signers of power purchase agreements with renewable suppliers. Together they accounted for 9.6 gigawatts of US clean energy purchases in the first half of 2025, amounting to 40% of the global total, according to the latest BloombergNEF data. That number falls woefully short of the 362 gigawatts of additional power the industry is projected to need worldwide by 2035 to sustain its growing data center fleet, according to BNEF. With enormous pressure from Wall Street to deliver on AI investments, companies can't afford to let energy be a limiting factor and are pursuing an all-of-the-above strategy on electricity sources. AI's impact is already showing up in sustainability reports. Meta, Google, Amazon and Microsoft disclosed that their carbon emissions went up 64%, 51%, 33% and 23% respectively in their latest climate filings compared to their benchmarks from before ChatGPT's release. Microsoft explicitly blamed "growth-related factors such as AI and cloud expansion." Inside Microsoft, tension between longstanding climate pledges and the insatiable energy needs of AI left the sustainability teams mired in uncertainty, according to two former managers who left the company earlier this year. Speaking on condition of anonymity, the former managers described Microsoft's climate priorities as constantly shifting in the face of its appetite for electricity. A key source of strain was over concerns that Microsoft risked a public backlash if it appeared to be abandoning its climate goals in any way, creating a "suffocating level of control" that paralyzed staff, one of the former managers said. 
Reports were dissected to ensure they adhered to an evolving company line, with individual words in memos and other documents turning into land mines that could "blow up" conversations, the person said. Adding to the stress, the other former manager said, were industrywide workforce reductions in 2023 that cut nearly 200,000 jobs, which destabilized climate teams as they worked on one of the toughest problems in tech: how to procure enough energy, sustainably. In a statement, Microsoft's chief sustainability officer, Melanie Nakagawa, said the company "remains committed to meeting our climate goals of being carbon negative, water positive and zero waste by 2030 while protecting ecosystems. As we learn and adapt, we're continuing to expand our global clean energy portfolio, building markets by investing in new climate technology solutions, and empowering others with technology to build a more sustainable future."

Biggest Bottleneck

On a recent podcast, Microsoft Chief Executive Officer Satya Nadella said the supply of power, rather than the availability of semiconductors, accounted for the biggest bottleneck in data center capacity. By some estimates, the energy needs of existing and planned AI infrastructure in the US can't be met with current supply. Dave Stangis, a senior executive at Apollo Global Management who has led the firm's sustainability strategy for the past four years, said last month that the amount of energy required to power AI data centers is so vast that meeting that need may be more than a lifetime away. As a result, big tech companies have been buying more nuclear and geothermal energy to satisfy the needs of their expanding AI infrastructure, according to BNEF data. At the same time, haunted by the risk of losing power for even a minute and driven by the desire to win the AI race, some companies are also exploring what's known as behind-the-meter power, where a generation plant sits in the data center's backyard.
"They want a new resource, something that's not already on the grid, that's not already spoken for," said Elaine Walsh, who leads the power group for the law firm Baker Botts. She added that "almost all" of the new development work she does is for gas power. -- -- - Meta's Hyperion Project That's the strategy Meta is using as it attempts to get its massive data center in Louisiana up and running as quickly as possible. The project, dubbed Hyperion, is a 4 million-square-foot complex on 2,250 acres in rural Louisiana that's expected to consume as much as five gigawatts of electricity. This summer, Meta Chief Executive Officer Mark Zuckerberg posted a graphic depicting Hyperion engulfing most of Manhattan -- an image that Trump later displayed at a White House Cabinet meeting, saying the facility would cost $50 billion. To support the Meta site, Entergy Corp. received regulatory approval in August to construct three gas plants capable of producing about 2.3 gigawatts. Last month, in addition to those three plants, the utility applied to tie new natural gas generation to the grid in Louisiana to meet increasing demand from data centers and industrial projects, including from Meta and other hyperscalers. "From the moment that the Richland Parish data center came onto the picture, we've been planning for and executing to secure clean energy to support that site," Urvi Parekh, Meta's head of global energy, said in an interview. "What we measure is what is the carbon intensity of the electricity that is serving our data center and then working through the greenhouse gas protocol rules." Meta also said in a statement that it plans to add enough clean and renewable energy to match the total electricity use at Richland. To do so, the company said it's working with Entergy to bring 1.5 gigawatts of renewable energy to the Louisiana grid and that it's launched other clean and renewable energy projects across the state, including three focused on solar. 
Companies like Meta often buy carbon offsets and renewable energy certificates to balance the emissions tied to their operations, but both of those practices have been decried over the years for varying reasons. Offsets can be notoriously difficult to verify, and renewable energy certificates have been assailed for failing to achieve real emissions reduction or drive renewable energy generation. Meta has maintained that the majority of its renewable energy portfolio actually comes in the form of real, long-term contracts such as power purchase agreements and that "only a small percentage" -- less than 5% of its reported renewable energy purchases in 2023, for example -- were tied to short-term, unbundled renewable energy certificates. -- -- - Nuclear Option Meta's use of gas to power its new data center while still purchasing huge amounts of clean energy reflects a balancing act across the industry. Last month, Google announced a first-of-its kind agreement to buy almost all of the electricity from a gas plant in Illinois while supplying the facility with carbon capture and storage equipment. The technology will capture and store around 90% of the CO2 emissions from the plant, according to a Google release. But some skeptics have cautioned that emissions capture is neither economical nor feasible at scale, and a nationwide carbon storage network would require as many as 96,000 miles of new pipelines, according to Energy Department estimates. Another option on the table is nuclear energy, which can provide huge amounts of round-the-clock power, free of carbon emissions. Google agreed last month to buy nuclear power from a plant that NextEra Energy Inc. plans to restart in Iowa. But nuclear power is also expensive, and supply chain issues mean that new nuclear plants will take years to build and bring online. 
Even though the NextEra generator is already built -- and was only shuttered in 2020 -- the company won't start delivering power to Google's data centers until 2029. To link data centers to power sources faster, US Energy Secretary Chris Wright urged the Federal Energy Regulatory Commission last month to expedite reviews for grid connections, according to documents reviewed by Bloomberg News. As part of a draft rule Wright sent to the agency, those reviews would shrink to 60 days, a seismic shift for a process that currently can take years. Under the proposal, data centers could win a speedy review if they include new power plants or agree to curtail usage in response to grid strains during high-demand periods such as heat waves. However, a data center vying to locate next to an existing power plant would require a study to determine if that generation capacity is needed to maintain grid reliability. -- -- - Trump Headwinds Complicating matters is Trump, who has made the industry's AI ambitions central to his second-term economic agenda. In September, he welcomed leaders including Nadella and Zuckerberg for a White House dinner where he hailed tech companies' far-reaching infrastructure spending plans and promised help with permitting for energy projects. At the same time, Trump has roiled the tech companies' quest for electricity by attacking green energy with a vengeance and assailing global warming as "the greatest con job ever" during a United Nations speech. The administration has stopped or delayed wind and solar initiatives -- some of which were near completion -- and declined to send delegates to the COP30 climate conference in Brazil. One of Trump's biggest moves against renewables was to use the massive tax bill he signed in July to strip away clean energy incentives created through his predecessor Joe Biden's Inflation Reduction Act. 
Total annual deployment of new solar, wind and energy storage facilities in 2035 will be 21% lower -- or 227 gigawatts less -- than it would have been without the Trump tax law, according to BNEF forecasts. Backing out of wind and solar projects as Trump envisions is problematic because doing so would force companies to use other power sources, like natural gas, that are not necessarily cheaper or faster to get into service, said Janous. He disputed the narrative that renewables can't sufficiently power data centers, arguing that the grid should be made more flexible to better use what solar and wind can offer. Big tech companies likewise can't simply discard clean energy commitments that have been a decade in the making for some firms, which have put time and money into reaching them, said BNEF analyst Nayel Brihi. "Dropping out now would first of all hurt their branding a lot, but also would just not make a lot of business sense for them because of all the time and effort that's been put in already," he said. -- With assistance from Riley Griffin, Josh Saul and Mark Chediak.
[10]
IEA: Data center investment will exceed oil supply spending this year
AI data centers are expected to drive a fivefold increase in electricity demand by 2030, doubling today's total data center consumption. Global data center investments will surpass new oil supply expenditures this year, marking a significant economic shift according to a new International Energy Agency (IEA) report. The world is projected to spend $580 billion on data centers this year, exceeding the $540 billion allocated for new oil supplies. The IEA stated this comparison provides "a telling marker of the changing nature of modern, highly digitalized economies." Electricity consumption by AI data centers is anticipated to increase fivefold by 2030, effectively doubling the current total energy usage of all data centers. Conventional data centers will also experience increased energy consumption, though less dramatically. The IEA projects half of this demand growth will occur in the U.S., with Europe and China accounting for most of the remaining increase. Most new data centers are under development in cities with populations exceeding one million. Approximately half of these planned facilities will be at least 200 megawatts, with many situated near existing data center clusters. The IEA notes that this rapid build-out, particularly in urban and clustered areas, presents challenges. "Grid congestion and connection queues are increasing in many regions, and connection queues for new data centers are often already long," the agency reported. Grid connection waits can extend up to a decade in certain markets, such as northern Virginia. Dublin, Ireland, has halted new interconnection requests entirely until 2028. Supply chain constraints represent another critical issue, with components such as cables, critical minerals, gas turbines, and transformers delaying grid upgrades. Companies like Amperesand and Heron Power are developing solid-state transformers. 
These aim to upgrade the current century-old grid technology by integrating renewables more effectively, responding rapidly to instabilities, and handling various conversions. Initial deployments are still one to two years away, with production ramp-up requiring additional time. The IEA expects renewables to fulfill the majority of new data center power requirements by 2035. This projection holds regardless of whether countries maintain current policies or pursue more aggressive emission reduction strategies. Solar power, with its recently reduced costs, has become a preferred option for developers. Over the next decade, approximately 400 terawatt-hours (TWh) of electricity for data centers will come from renewables. Natural gas is expected to provide around 220 TWh. Should small modular nuclear power plants meet their projected capabilities, the IEA anticipates they could contribute 190 TWh to data centers.
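For a rough sense of scale, annual energy figures like these can be converted into average continuous power by dividing by the hours in a year. The sketch below applies that conversion to the projected supply mix; the arithmetic is ours, not the IEA's:

```python
# Convert annual energy use (TWh per year) into average continuous
# power draw (GW): 1 TWh/yr spread over 8,760 hours ~ 0.114 GW.
HOURS_PER_YEAR = 8760  # 365 days * 24 hours

# Projected decade-ahead data center supply mix from the IEA report (TWh).
projected_mix_twh = {"renewables": 400, "natural gas": 220, "nuclear (SMRs)": 190}

for source, twh in projected_mix_twh.items():
    avg_gw = twh * 1000 / HOURS_PER_YEAR  # TWh -> GWh, then divide by hours
    print(f"{source}: ~{avg_gw:.0f} GW of round-the-clock supply")
```

By that arithmetic, 400 TWh per year corresponds to roughly 46 GW of constant load, which puts the IEA's projections in the same units as the multi-gigawatt campuses discussed elsewhere in this digest.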
[11]
Big tech's climate strategists feeling strain of AI power needs
Companies are buying clean energy at record rates. However, they also explore other sources like natural gas and nuclear power. Weeks after ChatGPT was unleashed on the world in November 2022, sustainability executives at Microsoft Corp. realized they had a big problem. On the tech giant's 500-acre campus in Redmond, Washington, teams began holding regular "triage" meetings to confront serious questions posed by the artificial intelligence boom: Where would the company find the gigawatts -- just one gigawatt can power nearly 750,000 US homes -- needed for data centers? And how could Microsoft possibly secure that extra energy while still making progress toward a long-standing goal of going carbon-negative? The AI discussions were "interesting and terrifying all at the same time," said Brian Janous, who served until August 2023 as Microsoft's vice president of energy. Microsoft and other major tech companies, he said, had to "look at the climate commitments they set and say, 'Can I still do this?'" Nearly three years later, Microsoft and rivals including Amazon.com Inc., Meta Platforms Inc. and Alphabet Inc.'s Google are still struggling to answer that question. On one hand, they're trying to obtain every electron possible to power their trillion-dollar bet on AI. On the other, they're trying to stay true to a goal of achieving net-zero carbon emissions by 2040 or sooner. "There's no question that the current push to develop AI infrastructure is putting a strain on the climate commitments of the big tech companies, all of which were made prior to the advent of AI," Janous said in an interview. After leaving Microsoft, he co-founded Cloverleaf Infrastructure, which partners with utilities to develop clean-powered sites that supply the largest data-center providers. US President Donald Trump isn't making this challenge any easier. 
Since taking office, he's slashed federal funding for green initiatives, such as wind and solar, and signaled his support for powering AI with generators that use fossil fuels, including aging coal-fired plants. Wary of irritating their biggest ally in Washington, tech leaders so far have refrained from publicly challenging the president over his campaign against renewables. For now, the so-called hyperscalers continue to buy clean energy at a record pace, with Meta, Amazon, Google and Microsoft ranking as the biggest corporate signers of power purchase agreements with renewable suppliers. Together they accounted for 9.6 gigawatts of US clean energy purchases in the first half of 2025, amounting to 40% of the global total, according to the latest BloombergNEF data. That number falls woefully short of the 362 gigawatts of additional power the industry is projected to need worldwide by 2035 to sustain its growing data center fleet, according to BNEF. With enormous pressure from Wall Street to deliver on AI investments, companies can't afford to let energy be a limiting factor and are pursuing an all-of-the-above strategy on electricity sources. AI's impact is already showing up in sustainability reports. Meta, Google, Amazon and Microsoft disclosed that their carbon emissions went up 64%, 51%, 33% and 23% respectively in their latest climate filings compared to their benchmarks from before ChatGPT's release. Microsoft explicitly blamed "growth-related factors such as AI and cloud expansion." Inside Microsoft, tension between longstanding climate pledges and the insatiable energy needs of AI left the sustainability teams mired in uncertainty, according to two former managers who left the company earlier this year. Speaking on condition of anonymity, the former managers described Microsoft's climate priorities as constantly shifting in the face of its appetite for electricity. 
A key source of strain was the concern that Microsoft risked a public backlash if it appeared to be abandoning its climate goals in any way, creating a "suffocating level of control" that paralyzed staff, one of the former managers said. Reports were dissected to ensure they adhered to an evolving company line, with individual words in memos and other documents turning into landmines that could "blow up" conversations, the person said. Adding to the stress, the other former manager said, were industry-wide workforce reductions in 2023 that cut nearly 200,000 jobs, which destabilized climate teams as they worked on one of the toughest problems in tech: how to procure enough energy, sustainably. In a statement, Microsoft's chief sustainability officer, Melanie Nakagawa, said the company "remains committed to meeting our climate goals of being carbon negative, water positive and zero waste by 2030 while protecting ecosystems. As we learn and adapt, we're continuing to expand our global clean energy portfolio, building markets by investing in new climate technology solutions, and empowering others with technology to build a more sustainable future." On a recent podcast, Microsoft Chief Executive Officer Satya Nadella said the supply of power, rather than the availability of semiconductors, accounted for the biggest bottleneck in data center capacity. By some estimates, the energy needs of existing and planned AI infrastructure in the US can't be met with current supply. Dave Stangis, a senior executive at Apollo Global Management who has led the firm's sustainability strategy for the past four years, said last month that the amount of energy required to power AI data centers is so vast that meeting that need may be more than a lifetime away. As a result, big tech companies have been buying more nuclear and geothermal energy to satisfy the needs of their expanding AI infrastructure, according to BNEF data.
At the same time, haunted by the risk of losing power for even a minute and driven by the desire to win the AI race, some companies are also exploring what's known as behind-the-meter power, where a generation plant sits in the data center's backyard. "They want a new resource, something that's not already on the grid, that's not already spoken for," said Elaine Walsh, who leads the power group for the law firm Baker Botts. She added that "almost all" of the new development work she does is for gas power. That's the strategy Meta is using as it attempts to get its massive data center in Louisiana up and running as quickly as possible. The project, dubbed Hyperion, is a 4 million-square-foot complex on 2,250 acres in rural Louisiana that's expected to consume as much as five gigawatts of electricity. This summer, Meta Chief Executive Officer Mark Zuckerberg posted a graphic depicting Hyperion engulfing most of Manhattan -- an image that Trump later displayed at a White House Cabinet meeting, saying the facility would cost $50 billion. To support the Meta site, Entergy Corp. received regulatory approval in August to construct three gas plants capable of producing about 2.3 gigawatts. Last month, in addition to those three plants, the utility applied to tie new natural gas generation to the grid in Louisiana to meet increasing demand from data centers and industrial projects, including from Meta and other hyperscalers. "From the moment that the Richland Parish data center came onto the picture, we've been planning for and executing to secure clean energy to support that site," Urvi Parekh, Meta's head of global energy, said in an interview. "What we measure is what is the carbon intensity of the electricity that is serving our data center and then working through the greenhouse gas protocol rules." Meta also said in a statement that it plans to add enough clean and renewable energy to match the total electricity use at Richland. 
To do so, the company said it's working with Entergy to bring 1.5 gigawatts of renewable energy to the Louisiana grid and that it's launched other clean and renewable energy projects across the state, including three focused on solar. Companies like Meta often buy carbon offsets and renewable energy certificates to balance the emissions tied to their operations, but both of those practices have been decried over the years for varying reasons. Offsets can be notoriously difficult to verify, and renewable energy certificates have been assailed for failing to achieve real emissions reduction or drive renewable energy generation. Meta has maintained that the majority of its renewable energy portfolio actually comes in the form of real, long-term contracts such as power purchase agreements and that "only a small percentage" -- less than 5% of its reported renewable energy purchases in 2023, for example -- were tied to short-term, unbundled renewable energy certificates. Meta's use of gas to power its new data center while still purchasing huge amounts of clean energy reflects a balancing act across the industry. Last month, Google announced a first-of-its-kind agreement to buy almost all of the electricity from a gas plant in Illinois while supplying the facility with carbon capture and storage equipment. The technology will capture and store around 90% of the CO2 emissions from the plant, according to a Google release. But some skeptics have cautioned that emissions capture is neither economical nor feasible at scale, and a nationwide carbon storage network would require as many as 96,000 miles of new pipelines, according to Energy Department estimates. Another option on the table is nuclear energy, which can provide huge amounts of round-the-clock power, free of carbon emissions. Google agreed last month to buy nuclear power from a plant that NextEra Energy Inc. plans to restart in Iowa.
But nuclear power is also expensive, and supply chain issues mean that new nuclear plants will take years to build and bring online. Even though the NextEra generator is already built -- and was only shuttered in 2020 -- the company won't start delivering power to Google's data centers until 2029. To link data centers to power sources faster, US Energy Secretary Chris Wright urged the Federal Energy Regulatory Commission last month to expedite reviews for grid connections, according to documents reviewed by Bloomberg News. As part of a draft rule Wright sent to the agency, those reviews would shrink to 60 days, a seismic shift for a process that currently can take years. Under the proposal, data centers could win a speedy review if they include new power plants or agree to curtail usage in response to grid strains during high-demand periods such as heatwaves. However, a data center vying to locate next to an existing power plant would require a study to determine if that generation capacity is needed to maintain grid reliability. Complicating matters is Trump, who has made the industry's AI ambitions central to his second-term economic agenda. In September, he welcomed leaders including Nadella and Zuckerberg for a White House dinner where he hailed tech companies' far-reaching infrastructure spending plans and promised help with permitting for energy projects. At the same time, Trump has roiled the tech companies' quest for electricity by attacking green energy with a vengeance and assailing global warming as "the greatest con job ever" during a United Nations speech. The administration has stopped or delayed wind and solar initiatives -- some of which were near completion -- and declined to send delegates to the COP30 climate conference in Brazil. One of Trump's biggest moves against renewables was to use the massive tax bill he signed in July to strip away clean energy incentives created through his predecessor Joe Biden's Inflation Reduction Act. 
Total annual deployment of new solar, wind and energy storage facilities in 2035 will be 21% lower -- or 227 gigawatts less -- than it would have been without the Trump tax law, according to BNEF forecasts. Backing out of wind and solar projects as Trump envisions is problematic because doing so would force companies to use other power sources, like natural gas, that are not necessarily cheaper or faster to get into service, said Janous. He disputed the narrative that renewables can't sufficiently power data centers, arguing that the grid should be made more flexible to better use what solar and wind can offer. Big tech companies likewise can't simply discard clean energy commitments that have been a decade in the making for some firms, which have put time and money into reaching them, said BNEF analyst Nayel Brihi. "Dropping out now would first of all hurt their branding a lot, but also would just not make a lot of business sense for them because of all the time and effort that's been put in already," he said.
[12]
Why the world is spending more on AI data centers than oil - $580 billion invested in 2025
AI data centers investment 2025: Global spending on data centers has surpassed new oil supplies, reaching $580 billion this year. This surge is driven by the massive energy demands of AI, with consumption expected to quintuple by 2030. The US leads this growth, facing challenges like grid congestion and supply chain bottlenecks as the world looks to renewables by 2035. The world is spending more on data centers than on new oil supplies, signaling a major shift in the global economy. According to a new report from the International Energy Agency (IEA), spending on data centers this year will reach $580 billion, which is $40 billion more than what will be spent on new oil production, as reported by TechCrunch. The IEA said, "This point of comparison provides a telling marker of the changing nature of modern, highly digitalized economies," as quoted in the report. The report highlights the growing energy demands of AI and conventional data centers. Electricity consumption from AI data centers alone is expected to grow fivefold by the end of the decade, doubling the total energy used by all data centers today, as per the TechCrunch report. Conventional data centers will also see increased consumption, though not as dramatically. The United States is expected to account for half of this surge in demand, with Europe and China making up most of the remainder, reported TechCrunch. Most new data centers are being built in cities with populations over 1 million, and half of the upcoming facilities will have capacities of at least 200 megawatts, often clustered near existing centers, as per the TechCrunch report.
IEA noted that, "This rapid build out of data centers -- especially in clusters and around urban areas -- comes with challenges," adding, "Grid congestion and connection queues are increasing in many regions, and connection queues for new data centers are often already long," as quoted in the report. Supply chain bottlenecks are also slowing progress. Cables, critical minerals, gas turbines, and transformers are delaying upgrades needed to support the growing infrastructure, as per the TechCrunch report. Some companies, like Amperesand and Heron Power, are developing solid-state transformers, which could modernize the grid, integrate renewable energy more efficiently, and react faster to instabilities, as per the report. However, initial deployments are at least a year or two away, and scaling production will take time, as per TechCrunch. Renewable energy is expected to power most new data centers by 2035, the IEA said. Solar energy has become particularly attractive due to its falling costs, while natural gas and small modular nuclear plants will also play significant roles, as per the TechCrunch report. Over the next decade, roughly 400 terawatt-hours of electricity for data centers will come from renewables, 220 terawatt-hours from natural gas, and 190 terawatt-hours from nuclear, if small modular reactors deliver on their promise, as per the report. Why is spending on data centers higher than oil now? Because AI and digital services are driving massive infrastructure growth, requiring more investment than traditional energy, as per the TechCrunch report. How much energy will AI data centers use by 2030? AI data centers' electricity consumption is expected to grow fivefold by the end of the decade, as per the TechCrunch report.
[13]
AI gold rush comes with a catch: America could run out of power by 2028 as data centers drain the grid, Morgan Stanley says
AI power shortage warning: America may soon run out of electricity as AI data centers drain the grid, Morgan Stanley warns. The bank projects a massive 20% U.S. power shortfall through 2028 -- about 44 gigawatts, enough to power 33 million homes. Surging AI demand from Microsoft, Google, and Amazon is pushing the grid to its limits. New data centers, nuclear plans, and gas turbines are racing against time. The AI boom is real, but without power, the future could go dark fast. America's AI gold rush could soon hit a major energy wall. Morgan Stanley has warned that the United States could face a power shortage of up to 20% by 2028 as artificial intelligence data centers consume massive amounts of electricity. The bank estimates a potential deficit of 13 to 44 gigawatts, equal to the energy use of over 33 million U.S. homes, if new capacity isn't added quickly. Analysts led by Stephen Byrd said AI demand is rising at a "non-linear rate," calling it the biggest technological shift in modern history. But that speed is overwhelming the national grid. The surge is driven by tech giants like Microsoft, Google, Amazon, and Meta, who plan to spend nearly $400 billion in 2025 to expand AI computing capacity. But while chips are plentiful, power isn't. Microsoft CEO Satya Nadella admitted the biggest issue now is "not compute, but power," warning that some AI chips could end up sitting idle without electricity. Data centers take two years to build, but transmission lines take up to ten years. That mismatch is pushing the U.S. grid to its limit. In Virginia, home to the world's largest data hub, Dominion Energy said its order book jumped from 40 to 47 gigawatts in a year -- the same as 47 nuclear reactors. Studies show data centers already use 4% of U.S. electricity, and that could reach 12% by 2030. 
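The homes-equivalent figures quoted here are internally consistent: at the roughly 750,000 homes per gigawatt cited earlier in this digest (about 1.3 kW of average household draw), a 44 GW shortfall maps onto about 33 million homes. A quick sketch of that arithmetic, using only numbers from the articles:

```python
# Check the article's homes-equivalent arithmetic.
# Assumption from the text: 1 GW powers roughly 750,000 US homes.
HOMES_PER_GW = 750_000
shortfall_gw = 44  # Morgan Stanley's upper-bound deficit estimate

homes_affected = shortfall_gw * HOMES_PER_GW
print(f"{homes_affected:,} homes")  # 33,000,000 homes
```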
To meet demand, utilities are delaying coal plant closures and adding more natural gas -- which powers 40% of global data centers -- while some states like Georgia seek approval for 10 GW of new gas turbines. Morgan Stanley says "time-to-power" fixes like Bloom Energy fuel cells, natural gas turbines, and even nuclear-powered AI centers could ease the crunch. Amazon and Google are exploring small modular reactors, while the U.S. government plans ten new nuclear plants by 2030. Texas is racing ahead too, with 100 gigawatts of solar and battery projects expected by the end of the decade. But experts warn that even these fixes may not come fast enough. Some fear the U.S. could lose its edge in the AI arms race if power runs out. As one analyst put it, America's future in artificial intelligence may now depend less on chips -- and more on keeping the lights on. The stakes are high. AI promises massive economic growth and technological breakthroughs. But without enough electricity, the U.S. risks slowing its competitive edge. The AI gold rush isn't just about compute power; it's about ensuring the power to compute. Every day counts. Investors, policymakers, and businesses must act now to prevent a bottleneck that could reshape the AI landscape. The next few years will determine whether the U.S. leads the AI revolution -- or faces a power grid crunch that slows it down. The AI revolution is moving faster than ever. Big tech companies and startups alike are racing to deploy massive AI-powered data centres. Morgan Stanley analysts have warned that America could run short of electricity by as much as 20% through 2028, driven by a surge in power-hungry data centers. In a note this week, analysts led by Stephen Byrd projected a 44-gigawatt shortfall -- roughly the equivalent of power needed for 33 million U.S. homes -- unless new capacity comes online fast. 
The bank described AI computing as "the most important technological shift in modern history," but said it comes with a steep energy cost. Even with alternative "time-to-power" solutions, such as gas turbines or fuel cells, Morgan Stanley expects a 13-20 GW deficit to persist. The report highlights how AI's rapid expansion now risks outpacing America's electric grid capacity. Major tech firms -- Microsoft, Google, Amazon Web Services, and Meta -- are spending nearly $400 billion in 2025 to expand AI infrastructure, with even bigger spending planned for 2026. However, their biggest challenge is no longer chips, but electric power. "The biggest issue we are now having is not a compute glut, but the power and the ability to get the builds done fast enough close to power," Microsoft CEO Satya Nadella said in a recent podcast with OpenAI's Sam Altman. New data centers take about two years to build, while new high-voltage transmission lines can take up to a decade to complete. That delay has created what experts are calling an "energy wall" -- the point where even wealthy tech firms cannot expand without reliable electricity. Data centres are now among the most electricity-intensive facilities in the country. Each facility requires a constant, reliable supply to run AI models, cooling systems, and backup generators. With thousands of servers running 24/7, the demand is skyrocketing. In the U.S., data centers currently consume about 4% of total electricity, but that figure could rise to 7-12% by 2030, according to various studies cited by Morgan Stanley. In Virginia, the world's largest cloud computing hub, Dominion Energy reported that its data center power orders have climbed from 40 GW to 47 GW in just one year -- equivalent to 47 nuclear reactors. Experts warn this expansion may already be straining local grids and increasing consumer bills.
Some utilities have even delayed plans to retire coal plants to meet demand, while natural gas -- which powers about 40% of data centers globally -- is making a comeback for its quick deployment capability. The growth isn't evenly distributed either. Certain regions, particularly in the West and Southeast, already operate near full capacity. Adding massive data centres in these regions could stress transmission lines and local substations. Utilities face the challenge of upgrading infrastructure, which is expensive and time-consuming. Without proper planning, the expansion of AI infrastructure may outpace the grid's ability to supply energy efficiently. To close the gap, Morgan Stanley suggests the U.S. may turn to natural gas turbines (adding 15-20 GW), fuel cells (5-8 GW), and nuclear-powered data centers (5-15 GW). Companies like Amazon and Google are already exploring Small Modular Reactors (SMRs) -- compact nuclear plants that could be deployed faster than traditional reactors. Google also plans to buy power from an Iowa nuclear reactor that NextEra aims to restart, with deliveries expected by 2029. The Trump administration recently announced an $80 billion investment to build ten new reactors by 2030. Meanwhile, utilities in Georgia have requested approval to install 10 GW of new gas generators to feed AI and cloud expansion. Renewable energy is part of the plan too. In Texas, grid operators expect 100 GW of new solar and battery capacity by 2030, offering a potential buffer to the AI-driven power boom. Furthermore, the pressure on the grid isn't just about raw electricity. Cooling requirements, backup power systems, and peak-hour spikes all add to the complexity. Even if total generation is sufficient, localized overloads could lead to delays or higher costs for developers. In the race to launch AI systems, the speed at which a data centre can get connected to reliable electricity -- known as "time to power" -- is now a major competitive factor.
Companies are willing to pay a premium for sites that already have strong grid access. Those that cannot secure power quickly risk delayed launches and lost business opportunities. Building new electricity infrastructure takes years. Permitting, construction, and connection to the grid are long processes. If AI companies expand faster than utilities can respond, projects may stall. Even when new power sources are planned, transmission lines and substations must be upgraded to handle the load, which further slows the process.

The shortage is not hypothetical. The next few years will be crucial. By 2028, the expected shortfall could become critical, affecting not just AI data centres, but also the reliability of the grid for other industries and consumers.

The implications of a strained grid go beyond the tech industry. Businesses could face higher electricity costs and limited site options for new data centres. Utilities might need to prioritize certain regions or projects, leading to uneven access to power. Regulators could be pressured to accelerate permitting and invest in infrastructure upgrades to prevent bottlenecks.

Consumers may notice indirect effects as well. Electricity rates could rise in regions where demand exceeds supply. There could also be delays in the rollout of AI-powered services, affecting everything from cloud computing to AI-driven healthcare and financial tools. Investors may find opportunities in companies providing grid upgrades, transmission systems, cooling technologies, and alternative energy solutions. At the same time, there is risk for firms that rely heavily on consistent, affordable electricity but cannot secure it.

As the AI arms race intensifies, the U.S. faces a paradox: the quest to build smarter systems could soon be limited by something as basic as electricity. Some companies are considering radical options -- from space-based solar power to satellite data centers.
Elon Musk's Starlink and Google have proposed orbiting AI chips powered by solar energy, with Google's first tests set for 2027. But in the short term, the focus remains on securing terrestrial power. Morgan Stanley says firms converting Bitcoin mining facilities into AI compute centers could help bridge gaps. Two emerging business models -- the "new neocloud" (short-term AI leases like IREN's 5-year deal with Microsoft) and the "REIT endgame" (long-term powered-shell leases like APLD's 15-year agreement) -- are expected to shape the future of AI infrastructure. As analysts put it, "AI infrastructure stocks are at the center of this transition," and power will determine who wins the next phase of the technology race.
[14]
AI's Growth Faces a Low-Tech Challenge | Investing.com UK
With lofty valuations in AI-related stocks, it's understandable that investors are on edge about anything that might go wrong. Recently, concerns have centered on the ability to bring new AI data centers online at the volume AI providers require. While funding for construction is flowing, a lack of electricity and permitting -- two very low-tech problems -- might be the Achilles' heel of this otherwise high-tech industry.

Demand for AI is growing unabated. On Tuesday, AMD (NASDAQ:AMD) CEO Lisa Su told investors that the company's AI data center revenue should grow by about 80% per year over the next three to five years. As a result, AMD's total revenue is forecast to increase by roughly 35% annually over the same period, exceeding analysts' estimates. The company's stock price has been soaring (chart).

But AI data center provider CoreWeave (NASDAQ:CRWV) threw cold water on any excitement after warning that its Q4 results would miss expectations because a developer failed to deliver data centers on time. Infrastructure that was expected to come online in Q4 will instead come online in Q1 and Q2 of 2026. The delay didn't cost the company any customers, and demand for space in its data centers is still insatiable, CEO Mike Intrator told CNBC. But the company did reduce its 2025 revenue forecast range to $5.05 billion to $5.15 billion, below analysts' forecast of $5.29 billion.

Microsoft (NASDAQ:MSFT) CEO Satya Nadella also recently raised concerns about construction delays: "The biggest issue we are now having is ... the ability to get the [data center] builds done fast enough, close to power. So if you can't do that, you may actually have a bunch of chips sitting in inventory that I can't plug in. In fact, that's my problem today. It's not a supply issue of chips. It's actually the fact that I don't have warm shells to plug into." A "warm shell" is a new data center ready for occupancy.
We wonder if this is the beginning of many problems the industry will face trying to build and deliver data centers on time. Here's a look at some of the numbers surrounding AI data center demand, supply, and electricity:

Even before AI hit the scene, the number of data centers in the US had been on the rise as people and businesses increasingly relied on cloud computing to process, analyze, and store their ever-increasing data. The advent of AI pushed that trend into hyperdrive. Monthly spending on data center construction starts rose to a record $4.2 billion in August, based on a 12-month moving average. That's up 100% from the August 2024 level and 400% from August 2023, ConstructConnect reports. Year-to-date spending through August, at $40 billion, had already surpassed the full-year 2024 record (chart). Building has also gotten more expensive, with the average cost per square foot rising to $977 in August, up from $665 a year prior.

The US leads the world in data centers, with 4,189 to date -- far exceeding the number in any other country, according to the Data Center Map. The UK has 511 data centers, followed by Germany (487), China (381), France (321), Canada (294), India (276), Australia (275), Japan (247), and Italy (209); other countries have even fewer. Those tracked range from hyperscale data centers to edge data centers.

US data centers consumed 183 terawatt-hours (TWh) of electricity in 2024, according to IEA estimates cited in a Pew Research Center October 24 report. That works out to more than 4% of the country's total electricity consumption last year. By 2030, consumption is projected to grow by 133% to 426 TWh. A third of data centers are in Virginia, Texas, and California, Pew reports. Because data centers are often geographically concentrated, they can tax electric grids. For example, about 26% of the total electric supply in Virginia is consumed by data centers.
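The consumption figures above hang together arithmetically. Here is a quick back-of-envelope check; the implied-total calculation assumes the 4% share is exact, though the source only says "more than 4%":

```python
# Sanity-check the data center electricity figures quoted above.
dc_2024_twh = 183   # US data center consumption, 2024 (IEA via Pew)
share_2024 = 0.04   # "more than 4%" of total US consumption

# Implied total US consumption if the share were exactly 4%.
implied_total_twh = dc_2024_twh / share_2024
print(f"Implied US total: ~{implied_total_twh:,.0f} TWh")  # ~4,575 TWh

# The 2030 projection: 133% growth over 2024.
dc_2030_twh = dc_2024_twh * (1 + 1.33)
print(f"Projected 2030: ~{dc_2030_twh:.0f} TWh")  # ~426 TWh, matching the cited figure
```

Both numbers come out consistent with the figures Pew cites, which suggests the 133% growth rate and the 426 TWh endpoint were derived from the same 2024 baseline.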
There are growing concerns that data center demands are driving up electricity prices today and will continue to do so in the future. In recent years, electricity rates have increased as utilities have replaced aging equipment to protect against extreme weather events and cyberattacks. The typical US household was billed $142 a month for electricity last year, up 25% from $114 a month in 2014 (chart). Consumers worried about their electric bills are unlikely to support the construction of new data centers in their communities.

Access to electricity is also causing headaches for those building data centers. Utility connection delays of up to five years are the most significant obstacle for data center growth, reports Bain & Company. Currently, data centers receive their energy from traditional electricity sources. Natural gas-powered utilities supplied more than 40% of the electricity used by data centers in 2024. Renewables, like wind and solar power, supplied 24% of the electricity, nuclear about 20%, and coal around 15%, Pew reported.

Going forward, data center companies are looking at traditional sources as well as new ones to augment power supplies. Google Research is exploring space via its Project Suncatcher. The moonshot project is described as placing a constellation of solar-powered satellites carrying Google TPUs into low-earth orbit and connecting them via free-space optical links to create space-based AI infrastructure (chart). To turn this idea into a reality, Google (NASDAQ:GOOGL) will need to overcome several challenges, including establishing high-bandwidth communication between the satellites, managing orbital dynamics, and protecting equipment from radiation damage. Placed in the right orbit, a solar panel can be eight times more productive in space than it is on Earth, producing power nearly continuously and reducing the need for batteries.
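The fuel-mix shares and the household-bill change quoted above are easy to cross-check with simple arithmetic on the Pew figures; no outside data is assumed:

```python
# Cross-check the 2024 data center fuel-mix shares quoted above (Pew figures, in %).
mix = {"natural gas": 40, "renewables": 24, "nuclear": 20, "coal": 15}
accounted = sum(mix.values())
print(f"Share accounted for: {accounted}%")  # 99%; the small remainder is other sources

# Household bill change: $114/month (2014) -> $142/month (2024).
pct_increase = (142 - 114) / 114 * 100
print(f"Bill increase: ~{pct_increase:.0f}%")  # ~25%, matching the cited figure
```

The four named sources sum to 99%, so the quoted shares are close to a complete breakdown, and the 25% bill increase follows directly from the two monthly figures.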
Google believes that by the mid-2030s, the cost of launching satellites should fall enough to make the cost of launching and operating a space-based data center roughly comparable to the energy costs of a data center on Earth. It plans to launch two prototype satellites by early 2027 to test how its models and TPU hardware operate in space and validate the use of optical intersatellite links for distributed machine learning tasks.
[15]
The AI revolution has a power problem
US tech giants face a new AI challenge: electricity shortages. Despite vast spending on chips and data centres, limited power and slow grid upgrades threaten progress. Firms are turning to coal, gas, and nuclear energy while softening climate pledges. Without enough power, America risks falling behind in the AI race.

In the race for AI dominance, American tech giants have the money and the chips, but their ambitions have hit a new obstacle: electric power. "The biggest issue we are now having is not a compute glut, but it's the power and...the ability to get the builds done fast enough close to power," Microsoft CEO Satya Nadella acknowledged on a recent podcast with OpenAI chief Sam Altman. "So if you can't do that, you may actually have a bunch of chips sitting in inventory that I can't plug in," Nadella added.

Echoing the 1990s dotcom frenzy to build internet infrastructure, today's tech giants are spending unprecedented sums to construct the silicon backbone of the revolution in artificial intelligence. Google, Microsoft, AWS (Amazon), and Meta (Facebook) are drawing on their massive cash reserves to spend roughly $400 billion in 2025 and even more in 2026 -- backed for now by enthusiastic investors. All this cash has helped alleviate one initial bottleneck: acquiring the millions of chips needed for the computing power race, and the tech giants are accelerating their in-house processor production as they seek to chase global leader Nvidia. These will go into the racks that fill the massive data centers -- which also consume enormous amounts of water for cooling.

Building the massive information warehouses takes an average of two years in the United States; bringing new high-voltage power lines into service takes five to 10 years.

Energy wall

The "hyperscalers," as major tech companies are called in Silicon Valley, saw the energy wall coming.
A year ago, Virginia's main utility provider, Dominion Energy, already had a data-center order book of 40 gigawatts -- equivalent to the output of 40 nuclear reactors. The capacity it must deploy in Virginia, the world's largest cloud computing hub, has since risen to 47 gigawatts, the company announced recently. Already blamed for inflating household electricity bills, data centers in the United States could account for 7 percent to 12 percent of national consumption by 2030, up from 4 percent today, according to various studies.

But some experts say the projections could be overblown. "Both the utilities and the tech companies have an incentive to embrace the rapid growth forecast for electricity use," Jonathan Koomey, a renowned expert from UC Berkeley, warned in September. As with the late 1990s internet bubble, "many data centers that are talked about and proposed and in some cases even announced will never get built."

Emergency coal

If the projected growth does materialize, it could create a 45-gigawatt shortage by 2028 -- equivalent to the consumption of 33 million American households, according to Morgan Stanley. Several US utilities have already delayed the closure of coal plants, despite coal being the most climate-polluting energy source. And natural gas, which powers 40 percent of data centers worldwide, according to the International Energy Agency, is experiencing renewed favor because it can be deployed quickly. In the US state of Georgia, where data centers are multiplying, one utility has requested authorization to install 10 gigawatts of gas-powered generators. Some providers, as well as Elon Musk's startup xAI, have rushed to purchase used turbines from abroad to build capacity quickly. Even recycling aircraft turbines, an old niche solution, is gaining traction. "The real existential threat right now is not a degree of climate change.
It's the fact that we could lose the AI arms race if we don't have enough power," Interior Secretary Doug Burgum argued in October.

Nuclear, solar, and space?

Tech giants are quietly downplaying their climate commitments. Google, for example, promised net-zero carbon emissions by 2030 but removed that pledge from its website in June. Instead, companies are promoting long-term projects. Amazon is championing a nuclear revival through Small Modular Reactors (SMRs), an as-yet experimental technology that would be easier to build than conventional reactors. Google plans to restart a reactor in Iowa in 2029. And the Trump administration announced in late October an $80 billion investment to begin construction on ten conventional reactors by 2030.

Hyperscalers are also investing heavily in solar power and battery storage, particularly in California and Texas. The Texas grid operator plans to add approximately 100 gigawatts of capacity by 2030 from these technologies alone. Finally, both Elon Musk, through his Starlink program, and Google have proposed putting chips in orbit in space, powered by solar energy. Google plans to conduct tests in 2027.
[16]
Morgan Stanley sees up to 20% shortage of US power for data centers through 2028 By Investing.com
Investing.com -- Morgan Stanley analysts warned in a note this week that surging artificial intelligence demand could leave the United States facing a "power shortfall totaling as much as 20%" for data centers through 2028. The bank said the deficit could amount to roughly 13 gigawatts (GW) of capacity. "We project a U.S. power shortfall through 2028 of 44 gigawatts (GW), before considering innovative time-to-power solutions that do not rely on the typical grid interconnection process," analysts led by Stephen Byrd wrote. They added that "time to power solutions could surprise to the upside," potentially reducing the gap.

Morgan Stanley attributed the potential shortfall to the rapid buildout of AI infrastructure, calling AI computing demand "the most important technological shift in modern history." The bank added that "AI infrastructure stocks are at the center" of that transition, with the "non-linear rate of AI improvement" creating broader asset valuation impacts. To meet demand, Byrd believes a range of "time to power" alternatives could come online, including "natural gas turbine transactions" that might add 15-20 GW of supply, 5-8 GW from Bloom Energy fuel cells, and 5-15 GW from nuclear-powered data center deals.

The bank also highlighted a trend of Bitcoin miners converting existing facilities into high-performance computing centers. It identified two models: the "new neocloud," typified by IREN's five-year lease with Microsoft, and the "REIT endgame," involving long-term powered-shell leases such as APLD's 15-year deal with a hyperscaler. Morgan Stanley concluded that both models "offer compelling value creation," especially as power constraints become a defining challenge for AI expansion.
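Bracketing the ranges quoted in the note gives a feel for how much of the 44 GW gap the "time to power" alternatives could leave uncovered. A minimal arithmetic sketch; the 13-20 GW persisting deficit cited elsewhere in the report presumably reflects further assumptions about which line items actually materialize:

```python
# Bracket the residual US power gap implied by the Morgan Stanley figures above.
shortfall_gw = 44  # projected 2028 shortfall before "time to power" solutions

# (low, high) GW each alternative might add, as quoted in the note.
solutions = {
    "natural gas turbines": (15, 20),
    "Bloom Energy fuel cells": (5, 8),
    "nuclear-powered data centers": (5, 15),
}

added_low = sum(low for low, _ in solutions.values())     # 25 GW if every range hits its floor
added_high = sum(high for _, high in solutions.values())  # 43 GW if every range hits its ceiling

print(f"Residual gap: {shortfall_gw - added_high} to {shortfall_gw - added_low} GW")
# 1 to 19 GW left uncovered, depending on how much capacity actually comes online
```

In other words, even in the most optimistic case the quoted alternatives do not quite close the projected gap, which is consistent with the bank's expectation that some deficit persists.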
The global AI boom is driving unprecedented investment in data centers, with $580 billion spent this year alone—surpassing oil exploration funding. However, massive power demands are straining electrical grids worldwide, leaving some facilities idle while utilities scramble to upgrade infrastructure.
The artificial intelligence revolution is reshaping global investment patterns in unprecedented ways. According to a new International Energy Agency (IEA) report, the world will spend $580 billion on data centers this year -- $40 billion more than will be allocated to finding new oil supplies [1]. This milestone represents what the IEA calls "a telling marker of the changing nature of modern, highly digitalized economies" [3].
Source: ET
The scale of planned investments is staggering. OpenAI has committed $1.4 trillion to building data centers over the next decade, while Meta has pledged $600 billion and Anthropic recently announced a $50 billion data center plan [1]. OpenAI CEO Sam Altman's internal memo reveals plans to build up to 250 gigawatts of compute capacity by 2033 -- equivalent to powering India's entire population of 1.5 billion [4].

Electricity consumption from AI data centers is projected to grow fivefold by the decade's end, doubling current total data center usage [3]. The United States will account for half of this demand growth, with Europe and China comprising most of the remainder. Most new facilities are being developed near major cities with populations exceeding one million, creating significant grid connection challenges [3].
The infrastructure strain is already visible. In Silicon Valley's Santa Clara, two major data centers -- Digital Realty's 48-megawatt SJC37 facility and Stack Infrastructure's equally powerful SVY02A campus -- sit completely idle despite construction being complete [5]. These facilities "may sit empty for years" while the local utility races to complete $450 million in grid upgrades scheduled for 2028.
Source: Seattle Times
Connection delays plague markets nationwide. Northern Virginia faces decade-long wait times for grid connections, while Dublin has paused new interconnection requests entirely until 2028 [3]. Supply chain bottlenecks for cables, transformers, and critical minerals further complicate infrastructure expansion.

While American companies lead AI innovation, the country lags significantly in energy infrastructure development. China installed 429 gigawatts of new power generation capacity in 2024 -- more than six times the net capacity added in the United States [2]. China's focus on solar, wind, nuclear, and gas installations contrasts sharply with America's emphasis on reviving coal plants, which operate at just 42% capacity compared to 61% in 2014 [2].

This energy gap threatens America's AI leadership position. Already, China earns more from renewable energy exports than the US generates from oil and gas exports [2]. Without addressing power infrastructure challenges, the US risks becoming a consumer rather than an innovator in both energy and AI technologies.
Despite these challenges, renewable energy presents significant opportunities. The IEA expects renewables to supply the majority of new data center power by 2035, with solar particularly favored due to dramatic cost reductions [3]. Over the next decade, renewables will provide around 400 terawatt-hours of data center electricity, compared to 220 terawatt-hours from natural gas and 190 terawatt-hours from potential small modular nuclear plants.
Source: TechCrunch
Innovative companies are capitalizing on this shift. Redwood Materials launched Redwood Energy, creating microgrids from repurposed EV batteries specifically targeting AI data centers [1]. Solar installations adjacent to data centers face fewer regulatory hurdles than traditional projects, creating opportunities for renewable energy startups and for innovative grid technologies like the solid-state transformers being developed by companies such as Amperesand and Heron Power [3].
[2]