10 Sources
[1]
AI experts warn that China is miles ahead of the US in electricity generation -- lack of supply and infrastructure threatens the US's long-term AI plans
AI data centers demand gigawatts of power that the U.S. grid is struggling to supply. A U.S. analyst of Chinese technology said that the country has already solved its energy problem -- at least in terms of power for its AI infrastructure. Rui Ma, founder of Tech Buzz China, posted on X that the country's massive investments in advanced hydropower and nuclear technologies meant that its "electricity supply is secure and inexpensive." This is in contrast to the U.S., where many AI data centers are disrupting the electricity grid and supply, resulting in shortfalls and price increases for every user.

Washington and Beijing are locked in an AI race, with the two powers vying for the lead in the technology. Because of this, both rivals are diving into a massive build-out of AI data centers that require enormous amounts of electricity to run. In the U.S., it has come to the point that tech giants are building their own power plants, with Elon Musk importing one to power his data centers, and companies like Microsoft, Google, Amazon, Oracle, and Nvidia investing in the research and development of nuclear reactors.

However, this does not seem to be a problem for China. According to Fortune, the East Asian country has an 80% to 100% power reserve, allowing it to absorb the massive demand brought about by the hundreds of data centers it has built in recent years. More than that, it is also continually expanding its output, with one expert telling the publication that China "adds more electricity demand than the entire annual consumption of Germany, every single year." Some argue that the power is delivered by heavily polluting coal plants, but China is also investing massively in renewable energy projects. Nevertheless, if power demand outstrips supply, it can easily reactivate coal plants to cover the shortfall. In fact, the new data centers are welcomed, as they help stimulate demand in a market that has an excess of power production.
Nevertheless, electricity oversupply doesn't seem to be an immediate concern, as most of China's power plants are state-owned. Beijing also plans its energy production well in advance, allowing it to prepare for prospective demand, like the AI data center boom. This still does not address the elephant in the room, though: the fact that many Chinese data centers sit idle or underutilized. Beijing is developing a network to create a marketplace that will sell surplus capacity, but it is still facing challenges, especially with latency and different ecosystems. On the other hand, the U.S. faces major hurdles with its electricity supply. Meta founder Mark Zuckerberg said that power constraints will limit AI growth, and that new power plants aren't being built fast enough to satisfy AI's insatiable demand. If the U.S. does not address this issue sooner, it risks lagging behind China even if it has more powerful and efficient hardware. That's because the latter can just throw tons of power to gain the upper hand in the AI race through sheer brute force, similar to how Huawei's CloudMatrix cluster beats the performance of Nvidia's GB200.
[2]
Big Tech, power grids take action to rein in surging demand
Industry Insight from Reuters Events, a part of Thomson Reuters. August 18 - As data centers become an increasing driver of U.S. power demand, operators and power companies are seeking ways to better integrate them into the power network. Power plant developers and network operators are scrambling to keep up with the demands of new data centers that need electricity day and night. The U.S. Department of Energy forecasts 20 GW of new data center load by 2030 and predicts data centers will consume 6.7%-12% of total U.S. power production by 2028, up from 4.4% in 2023.

Grid operators were already overloaded with renewable energy applications, and surging requests from new data center projects are leading to connection delays and holding back growth. The pace at which utilities, developers, and state regulators can accelerate their processes for siting, permitting, and building new infrastructure will "very likely act as a constraint on data center growth in the near to medium term," consultancy Energy and Environmental Economics (E3) said in a report on grid strains in Virginia.

As grid capacity dwindles, availability of power and grid infrastructure has become a critical driver of site selection for data center developers. Larger data center complexes are being developed, with some surpassing 1 GW, equivalent to one large nuclear reactor. "We are having to make geographic choices based on the fact that we need to move pretty quickly to deploy the infrastructure that we need," Bobby Hollis, VP of energy at Microsoft, told Reuters Events. "If one location is moving significantly slower than another because of interconnection or transmission line requirements, then we might have to go to another location."
Developers typically favor sites with competitive clean energy supply, transmission availability, and low development costs, and development is soaring in Texas and Northern Virginia, where end-users can benefit from some of the lowest power prices in the United States. As AI demand soars, the European Union is implementing measures to increase the energy efficiency of data centers, but there is no such centralized push in the United States. U.S. utilities and regional network operators are implementing a patchwork of incentives and market mechanisms to minimize the impact of data center load. Meanwhile, data center operators are working on innovative technologies and systems to increase energy efficiency and be flexible to grid needs.

Power market controls

Grid operators are having to spend more on power infrastructure to meet the growing demands of AI, with knock-on effects for the wider public. Dominion Energy, Northern Virginia's primary utility, proposed in April a new rate class for high-load users, including data centers, to reduce the cost burden for residential customers. Dominion had 40 GW of data center capacity under contract by the end of 2024, including planned facilities, up from 21 GW six months prior. Dominion also proposed higher electricity rates for other customers to cover its costs and expand clean power capacity.

[Chart: Forecast US data center electricity demand]

PJM, operator of the country's largest grid network, spanning much of the eastern U.S. and parts of the Midwest, is using market tools like capacity auctions and demand response to manage growing load from hyperscale customers like data centers. In California, grid operator CAISO uses incentives like flexible load, demand response payments, and time-of-use rates to ease the impact of large customers. In Texas, soaring demand for clean power and data centers has prompted grid operator ERCOT to introduce stricter regulation for large consumers.
Large users must be able to ride through grid faults and quickly resume consumption, while high price caps reward flexibility by offering revenue to data center operators able to cut load, shift it to other time periods, or use on-site generation during peak demand periods.

With better data and AI, grid infrastructure could handle more load, said Rich Voorberg, President of Siemens Energy North America, a power technology group. Greater access to data allows grid partners to use tools such as digital twins to assess whether existing infrastructure can be optimized to expand overall capacity. "In pockets we are already seeing it," Voorberg said. "We're consulting with different grids and...utilities on how to better optimize the grid." As grid capacity dwindles, some tech groups and data center developers are seeking to co-locate new power generation with data center sites to reduce grid connection delays and cut exposure to regional markets and grid bottlenecks.

Efficiency savings

To ease grid pressure, data center operators are adopting more efficient, low-emission designs, which will have an impact on future power demand. Microsoft is investing in liquid and natural air cooling to reduce energy and water use, plus AI-driven tools to maximize computing efficiency. "We're dealing with constraints like the rest of the marketplace," Hollis said.

[Chart: Impact on data centers of new cooling technologies vs air cooling]

Amazon Web Services (AWS) is optimizing its data center mechanical systems and designing proprietary high-efficiency components. AWS focuses on performance metrics such as Power Usage Effectiveness (PUE) and water usage effectiveness. "To increase efficiency, AWS uses different cooling techniques, including free air cooling depending on the location and time of year, as well as real-time data to adapt to weather conditions," the company said.
"AWS' latest data center design seamlessly integrates optimized air-cooling solutions alongside liquid cooling capabilities for the most powerful AI chipsets." Innovative cooling mechanisms are a major focus of energy reduction. Data center group Digital Realty is shifting from air cooling to liquid cooling through the use of a closed loop in which the fluid is recirculated, supporting higher-density racks. "We're aggressively shifting toward liquid cooling," said Aaron Binkley, Digital Realty's VP of Sustainability. The company has a global portfolio of around 170 data centers and supports liquid cooling in more than half of its facilities, he said.

Variations in load profiles between fluctuating AI and crypto consumption and the flatter load of cloud services also offer opportunities to manage and optimize power use. Data centers are sometimes seen as a resource for the grid, with the possibility to shift loads by pausing activities or moving them to off-peak hours. "The great thing about the AI data centers is they are inherently more flexible in terms of workload (...). It creates some opportunities to lessen the overall impact they have and thereby provide benefits back," said David Porter, EPRI's VP of Electrification & Sustainable Energy Strategy.

PUE is currently the dominant industry metric, but operators are now evaluating performance based on how much useful computing -- such as units of AI output known as tokens, or mathematical calculations known as floating-point operations (FLOPS) -- is delivered per watt. Meanwhile, some data centers are exploring using direct current (DC) rather than alternating current (AC) to cut conversion losses.
Swiss power technology group ABB created a demonstration site in Zurich operating entirely on DC, which increased energy efficiency by 10% while reducing installation costs by 20% and lowering investment costs for electrical components by 15%. "This is a new trend... we are also speaking with our partners," said Massimiliano Cifalitti, president of ABB's Smart Power division.

--Editing by Robin Sayles. Juliana Ennes is an energy and environment journalist based in New York.
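The shift from PUE toward per-watt compute metrics described in this article can be sketched in a few lines. The facility and throughput figures below are hypothetical, chosen only to illustrate the arithmetic:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT power.
    1.0 is the theoretical ideal; cooling and conversion losses push it higher."""
    return total_facility_kw / it_load_kw

def tokens_per_watt(tokens_per_second: float, it_load_kw: float) -> float:
    """Useful-work metric: AI tokens delivered per watt of IT power."""
    return tokens_per_second / (it_load_kw * 1_000)

# Hypothetical facility: 14 MW total draw, of which 10 MW reaches IT equipment
print(pue(14_000, 10_000))                  # 1.4
# Hypothetical AI cluster serving 5 million tokens/s on that 10 MW IT load
print(tokens_per_watt(5_000_000, 10_000))   # 0.5
```

Two data centers with identical PUE can deliver very different tokens per watt, which is why operators are moving toward the second kind of metric.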
[3]
Data centers built for AI are reshaping America's energy future, while households risk paying for empty infrastructure and ballooning power demands
Electricity demand projections show data centers' share of U.S. electricity consumption tripling by 2028

The accelerating demand for computing power has pushed artificial intelligence into the center of the US energy debate. Data centers used to support cloud services, streaming platforms, and online storage already consume large amounts of electricity, but the rise of AI tools has magnified those needs. According to federal projections, the share of national electricity use from data centers could rise from 4% in 2023 to 12% by 2028. Since running an AI writer or hosting an LLM is more energy-intensive than typical web activity, the growth curve is steep. This expansion is not only changing the relationship between technology firms and utilities, but also reshaping how electricity costs are distributed across society.

Electricity prices in the US have already climbed more than 30% since 2020, and a Carnegie Mellon-North Carolina State study warns of another 8% nationwide rise by 2030. In states such as Virginia, the increase could reach 25%. Utilities argue that grid upgrades are essential, but the concern is who will pay for them.

This is only the beginning, because when a French person asks ChatGPT when the next strike is planned, Americans pay more for electricity. How? When anyone anywhere in the world asks ChatGPT an everyday question, the extra energy consumed by that query is absorbed into U.S. grid demand, because the ChatGPT system runs on US-based servers, hosted in American data centers and powered by the US electricity grid. If technology firms secure large capacity allocations and then delay projects, households and small businesses may be left paying for unused infrastructure. The case of Unicorn Interests in Virginia, where a delayed facility left nearby customers covering millions in upgrade expenses, underscores this risk.
To counter such problems, American Electric Power in Ohio proposed a rate plan requiring data centers to pay for 85% of the requested capacity regardless of actual use. The state's regulators approved the measure despite opposition from cloud service providers, who offered a 75% minimum instead. Some companies have sought to bypass traditional utilities by generating their own power. Amazon, Microsoft, Google, and Meta already operate renewable installations, gas turbines, and diesel backup generators, and some are planning nuclear facilities. These companies not only produce electricity for their own operations but also sell surplus energy into wholesale markets, creating competition with traditional suppliers. In recent years, such sales have generated billions, giving major cloud providers influence over both supply and price in certain regions. The volatile consumption patterns of AI training, which can swing sharply between peaks and lows, pose another challenge. Even a 10% shift in demand can destabilize networks, forcing utilities to intervene with dummy workloads. With households already paying more each month in some states, the concern is that consumers will end up covering the cost of keeping LLM hosting and AI writer systems online.
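The billing logic of a minimum-take rate like the one approved in Ohio can be sketched as follows. The contract size and actual draw are hypothetical; only the 85% and 75% floors come from the article:

```python
def billable_capacity_mw(contracted_mw: float, actual_peak_mw: float,
                         minimum_take: float = 0.85) -> float:
    """Under a minimum-take tariff, a customer pays for at least
    `minimum_take` of its contracted capacity, or its actual peak if higher."""
    return max(minimum_take * contracted_mw, actual_peak_mw)

# Hypothetical data center: 500 MW contracted, but a delayed build draws only 120 MW
print(billable_capacity_mw(500, 120))         # 425.0 -- the approved 85% floor
print(billable_capacity_mw(500, 120, 0.75))   # 375.0 -- the providers' 75% counter-offer
```

The floor shifts stranded-infrastructure risk away from residential ratepayers and back onto the data center that requested the capacity.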
[4]
AI experts return from China stunned: The U.S. grid is so weak, the race may already be over
"Everywhere we went, people treated energy availability as a given," Rui Ma wrote on X after returning from a recent tour of China's AI hubs. For American AI researchers, that's almost unimaginable. In the U.S., surging AI demand is colliding with a fragile power grid, the kind of extreme bottleneck that Goldman Sachs warns could severely choke the industry's growth. In China, Ma continued, it's considered a "solved problem."

Ma, a renowned expert in Chinese technology and founder of the media company Tech Buzz China, took her team on the road to get a firsthand look at the country's AI advancements. She told Fortune that while she isn't an energy expert, she attended enough meetings and talked to enough insiders to come away with a conclusion that should send chills down the spine of Silicon Valley: in China, building enough power for data centers is no longer up for debate. "This is a stark contrast to the U.S., where AI growth is increasingly tied to debates over data center power consumption and grid limitations," she wrote on X.

The stakes are difficult to overstate. Data center building is the foundation of AI advancement, and spending on new centers now outweighs consumer spending in its contribution to U.S. GDP growth -- concerning, since consumer spending is generally two-thirds of the pie. McKinsey projects that between 2025 and 2030, companies worldwide will need to invest $6.7 trillion in new data center capacity to keep up with AI's strain. In a recent research note, Stifel Nicolaus warned of a looming correction to the S&P 500, since it forecasts this data-center capex boom to be a one-off build-out of infrastructure while consumer spending is clearly on the wane.

However, the clear limiting factor on U.S. data center infrastructure development, according to a Deloitte industry survey, is stress on the power grid. Cities' power grids are so weak that some companies are simply building their own power plants rather than relying on existing grids.
The public is growing increasingly frustrated over rising energy bills -- in Ohio, the electricity bill for a typical household has increased at least $15 this summer because of data centers -- while energy companies prepare for a sea change in demand. Goldman Sachs frames the crisis simply: "AI's insatiable power demand is outpacing the grid's decade-long development cycles, creating a critical bottleneck."

Meanwhile, David Fishman, a Chinese electricity expert who has spent years tracking the country's energy development, told Fortune that in China, electricity isn't even a question. On average, China adds more electricity demand than the entire annual consumption of Germany, every single year. Whole rural provinces are blanketed in rooftop solar, with one province matching the entirety of India's electricity supply. "U.S. policymakers should be hoping China stays a competitor and not an aggressor," Fishman said. "Because right now they can't compete effectively on the energy infrastructure front."

China's quiet electricity dominance, Fishman explained, is the result of decades of deliberate overbuilding and investment in every layer of the power sector, from generation to transmission to next-generation nuclear. The country's reserve margin has held at 80%-100% nationwide, meaning it has consistently maintained roughly twice the capacity it needs, Fishman said. There is so much headroom that instead of seeing AI data centers as a threat to grid stability, China treats them as a convenient way to "soak up oversupply," he added.

That level of cushion is unthinkable in the United States, where regional grids typically operate with a 15% reserve margin and sometimes less, particularly during extreme weather, Fishman said. In places like California or Texas, officials often issue warnings about red-flag conditions when demand is projected to strain the system.
This leaves little room to absorb the rapid load increases AI infrastructure requires, Fishman noted. The gap in readiness is stark: while the U.S. is already experiencing political and economic fights over whether the grid can keep up, China is operating from a position of abundance. Even if AI demand in China grows so quickly that renewable projects can't keep pace, Fishman said, the country can tap idle coal plants to bridge the gap while building more sustainable sources. "It's not preferable," he admitted, "but it's doable." By contrast, the U.S. would have to scramble to bring on new generation capacity, often facing years-long permitting delays, local opposition, and fragmented market rules, he said.

Underpinning the hardware advantage is a difference in governance. In China, energy planning is coordinated through long-term, technocratic policy that defines the market's rules before investments are made, Fishman said. This model ensures infrastructure buildout happens in anticipation of demand, not in reaction to it. "They're set up to hit grand slams," Fishman noted. "The U.S., at best, can get on base."

In the U.S., large-scale infrastructure projects depend heavily on private investment, but most investors expect a return within three to five years: far too short for power projects that can take a decade to build and pay off. "Capital is really biased toward shorter-term returns," he said, noting Silicon Valley has funneled billions into "the nth iteration of software-as-a-service" while energy projects fight for funding. In China, by contrast, the state directs money toward strategic sectors in advance of demand, accepting that not every project will succeed but ensuring the capacity is in place when it's needed. Without public financing to de-risk long-term bets, he argued, the U.S. political and economic system is simply not set up to build the grid of the future. Cultural attitudes reinforce this approach.
In China, renewables are framed as a cornerstone of the economy because they make sense economically and strategically, not because they carry moral weight. Coal use isn't cast as a sign of villainy, as it would be among some circles in the U.S. - it's simply seen as outdated. This pragmatic framing, Fishman argued, allows policymakers to focus on efficiency and results rather than political battles. For Fishman, the takeaway is blunt. Without a dramatic shift in how the U.S. builds and funds its energy infrastructure, China's lead will only widen. "The gap in capability is only going to continue to become more obvious -- and grow in the coming years," he said.
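The reserve-margin figures quoted in this article reduce to simple arithmetic. The capacity and peak numbers below are hypothetical round values, used only to show how a 15% cushion compares with an 80%-100% one:

```python
def reserve_margin(capacity_gw: float, peak_demand_gw: float) -> float:
    """Reserve margin: spare generating capacity as a fraction of peak demand."""
    return (capacity_gw - peak_demand_gw) / peak_demand_gw

# Hypothetical U.S.-style regional grid: 115 GW of capacity against a 100 GW peak
print(f"{reserve_margin(115, 100):.0%}")   # 15%
# Overbuilt grid holding double the capacity it needs, as described for China
print(f"{reserve_margin(200, 100):.0%}")   # 100%
```

At a 15% margin, one large outage or heat wave can erase the cushion; at 100%, a multi-gigawatt data center campus is absorbed without strain.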
[5]
AI data centers face massive US power grid shortage
Data centers are growing faster and bigger than ever as artificial intelligence drives demand for computing facilities that can consume as much power as entire cities, but America's electrical grid is struggling to keep pace with the breakneck expansion. Power constraints have emerged as the single biggest bottleneck to building our AI future, according to a new report by JLL that found data center vacancy rates have plummeted to a record low 2.3%. The surge in demand has created an unprecedented mismatch between what companies need and what's actually available. The scale of demand has exploded beyond what the industry has ever seen. Just a few years ago, most data centers needed 200 to 300 megawatts on roughly 300 acres. Now, Andy Cvengros, the executive managing director and co-lead of JLL's US data center markets team, said hyperscalers are requesting sites with 1,000 acres and multiple gigawatts of power. "The amount of power being requested is like all of New York's power in one single site," Cvengros said. The surge is creating chaos in utility planning, with speculative developers flooding the system with requests they hope to flip for profit. In Chicago alone, utilities are seeing 40 gigawatts of power requests, roughly 40 times the city's entire existing data center capacity. Cvengros estimates that 90% of those requests aren't real, but the flood of applications is overwhelming utility systems and creating years-long backlogs for legitimate projects. Where real demand does exist, it's heavily concentrated. The report shows that 50% of new demand in the first half of 2025 was concentrated in just two markets: Northern Virginia and Dallas. With so little space available, companies are now forced to reserve capacity in data centers that haven't even been built yet, sometimes waiting over a year for construction to finish before they can actually use the space. The financial stakes are enormous. 
JLL estimates that North America could see $1 trillion in data center development between 2025 and 2030, with more than 100 gigawatts of capacity potentially breaking ground. The construction pipeline of 8 gigawatts is already 73% preleased, a rate that JLL says signals that vacancy will remain restrictive for years. The constraints come as the industry grapples with questions about whether the AI buildout is sustainable. OpenAI CEO Sam Altman recently admitted that we're in the midst of an AI bubble, though he still plans to spend "trillions" on data center construction. Meanwhile, Morgan Stanley estimates the industry needs $3 trillion in global investment by 2028, raising concerns about whether the massive spending can generate returns before the infrastructure becomes obsolete. But the power crunch is driving up costs and forcing companies to get creative. U.S. commercial electricity rates have increased nearly 30% over the last five years as utilities address aging infrastructure and record demand. In response, 75% of development activity is now concentrated in low-cost electricity markets. The report also shows companies are increasingly turning to alternative energy solutions, including natural gas turbines on-site and partnerships with fuel cell companies. Critical equipment like transformers now have four-to-five-year lead times, creating another bottleneck in an already constrained system. Meanwhile, the demand side keeps growing. Traditional enterprise customers like banks, trading firms, and healthcare companies are also launching their own AI programs, adding pressure to the grid and data center market in every direction. "You're squeezing the toothpaste from both ends," Cvengros said.
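The prelease figure above implies very little uncommitted capacity. A back-of-the-envelope sketch, using only the 8 GW pipeline and 73% prelease rate from the article:

```python
pipeline_gw     = 8.0    # capacity under construction, per the JLL figures above
preleased_share = 0.73   # share already committed before completion

# Capacity still available to new tenants once construction finishes
uncommitted_gw = pipeline_gw * (1 - preleased_share)
print(round(uncommitted_gw, 2))   # 2.16
```

Only about 2.2 GW of the entire under-construction pipeline remains open to new tenants, which is why JLL expects vacancy to stay restrictive for years.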
[6]
The US May Have Already Lost the AI Race to China Due to a Key Weakness
You might have heard of an "AI race" heating up between the US and China, a bitter rivalry between two global adversaries that could shape the direction of world history. At least, that's how some in the US feel. While China has repeatedly tried to establish a geopolitical friendship with the richest nation in the world, officials and pundits in the US have doubled down, reframing artificial intelligence as the 21st century's nuclear bomb.

In the meantime, China may have built a massive lead -- by actively investing in its power grid, while the United States' is quickly running out of capacity to power immensely power-hungry AI models. As Fortune reports, Americans who've had a look at China's technological development firsthand found that the two countries aren't even in the same league, given China's next-level power grid. "Energy is considered a solved problem," wrote Rui Ma, editor of the US publication Tech Buzz China. "Everywhere we went, people treated energy availability as a given," she continued. "This is a stark contrast to the US, where AI growth is increasingly tied to debates over data center power consumption and grid limitations."

AI is a notoriously energy-intensive technology. The data centers powering large language models like ChatGPT are immense labyrinths of computer chips, which suck down resources like power and water to keep up with demand. As Fortune notes, this effectively makes electricity the key bottleneck for expanding AI infrastructure. That's caused some critical shortages in the US. Short on energy and hopped up on fantasies of an arms race, American companies are resorting to all kinds of bizarre strategies to get their juice. Elon Musk's xAI, for example, is running 35 portable methane gas generators in the parking lot of one of its main data centers in Memphis, encircling nearby communities in a cloud of noxious smog. China has no such problems.
In 2024, China was responsible for nearly 65 percent of the world's renewable energy construction, installing so many solar panels and wind turbines that it caused the country's CO2 emissions to drop for the first time -- despite record-high demands for energy. Whether or not the US can catch up remains to be seen. President Donald Trump previously made an off-the-cuff remark about attaching coal power plants to data centers directly. It's an unfortunate conundrum in an age when energy demand in the US has never been higher. In the meantime, China keeps chugging along, seemingly unperturbed by any energy bottlenecks -- and the Trump administration's posturing.
[7]
Analysts warn U.S. power limits risk ceding AI race to China
As America's power grid creaks under the strain of breakneck expansion and electricity demand from AI data centers, China is facing a very different situation, experts say -- and it could have big implications for the global tech race. Rui Ma, a U.S. expert on Chinese technology and founder of media outlet Tech Buzz China, posted on X that energy is considered a "solved problem". Ma, who recently took colleagues to Shanghai to talk to businesses about AI developments in the country, said insiders told her China does not face the same capacity issues as the U.S. because of its supply. "The Chinese government's investment in sustainable energy -- from advanced hydropower to next-generation nuclear -- means that, relative to many other markets, electricity supply is secure and inexpensive," she wrote. "Everywhere we went, people treated energy availability as a given." China is by far the world's biggest power producer. The International Energy Agency reports it generated nearly 9,000 TWh of electricity in 2022 -- about double U.S. output and more than 30% of the global total. "This is a stark contrast to the U.S., where AI growth is increasingly tied to debates over data center power consumption and grid limitations," Ma wrote on X. In the U.S., power constraints have become the single biggest bottleneck to the AI buildout, according to JLL, with data center vacancy rates dropping to a record low of 2.3%. The surge in demand has created an unprecedented mismatch between what companies need and what the grid can deliver. So great is the crunch in the U.S. that developers are increasingly bypassing utilities to build their own power plants, while residents face rising bills, a Deloitte survey showed. In Ohio, for example, average household costs climbed by at least $15 this summer due to data center demand. The constraints come as the industry grapples with questions about whether the AI buildout is sustainable. 
OpenAI CEO Sam Altman recently admitted that the industry is in the midst of an AI bubble, though he still plans to spend "trillions" on data center construction. Meanwhile, Morgan Stanley estimates the industry needs $3 trillion in global investment by 2028, raising concerns about whether the massive spending can generate returns before the infrastructure becomes obsolete.

In China, meanwhile, government-led investment has created what experts call an electricity surplus. David Fishman, a China energy analyst, told Fortune that the country maintains reserve margins of 80%-100%, compared to roughly 15% in the U.S. China's advantage stems from centralized, long-term planning, Fishman told the news outlet. State-directed investment ensures infrastructure is built ahead of demand, even if projects lose money in the short term. In the U.S., by contrast, private capital drives energy projects, but investors typically demand returns within five years -- far shorter than the decade it can take for grid projects to come online. "U.S. policymakers should be hoping China stays a competitor and not an aggressor," Fishman said. "Because right now they can't compete effectively on the energy infrastructure front."
[8]
From data boom to energy bust - tech's role in the global power challenge
AI and generative AI have dramatically redefined the boundaries of data growth and ownership, and by doing so have created a seismic shift in the tech industry. Whether the industry is discussing Moore's Law or Jevons Paradox, when it comes to data growth, demand is increasing. FOMO is driving decisions around AI, LLMs are ubiquitous and available to all through search engines and app stores, and the only trend is -- more. Greater data center capacity requires increasing power, yet energy availability is shrinking. As more homes are built, more businesses come online, and more people use AI and gen AI, the power and data storage needs of both will grow. This creates a square peg/round hole situation for both the tech industry, which is trying to meet customer expectations, and governments, which need to secure the energy supply for homes and businesses. How can both get what they need?

Let's take a step back and examine where energy comes from and how it supplies data centers. There are three sources of power: renewables (solar, wind, and hydro), nuclear, and fossil fuels. Power is generated, flows to the grid for distribution, then to substations, and on to homes and businesses. Getting power from the generation source to its end point is a very complex and expensive process. Power grids across Europe are 30-80 years old and were designed to supply 20th-century loads. Grids are a complex coalition of technologies; today they can run up to 1,000-1,500 miles in length, depending on where the source is. They are simply not built for the workloads demanded today, and increasingly these grids are linked across international borders as countries seek to buy and sell power when local demand fluctuates.

Around 2% of global energy generation is consumed by data centers and transmission networks, according to the IEA. The US DOE reports that data centers consumed about 4.4% of total U.S.
electricity in 2023 and are expected to consume approximately 6.7% to 12% of total U.S. electricity by 2028. In terms of specific workloads, McKinsey & Co. forecasts a 39% compound annual growth rate (CAGR) in gen AI workload demand and a 16% CAGR for other workloads in global data center capacity demand through 2030. A recently announced hyperscaler data center is almost the size of Manhattan. This puts pressure on national grids, and governments need to ensure consistent power for certain areas. The UK government has recently designated data centers as Critical National Infrastructure, a protection to ensure they are kept online in the event of an attack or malicious activity. Within the data center, power is used for compute, storage, networking, and cooling. Historically, water cooling was widely used, but many operators have stopped and now rely on traditional electrically driven chillers and cooling strategies, which means less power goes to the compute load and overall power demand rises even further. In terms of AI, the latest generation of GPU consumes roughly the daily energy of a "standard" four-person home, at ~30 kWh, and GPU manufacturers are shipping hundreds of thousands of GPUs every quarter. This demand for AI is taking electricity away from other uses. Put simply, the industry is behind the curve of demand. One might think the answer is to generate more power. However, it's not that simple: adding more electricity doesn't automatically translate into more data center capacity while keeping homes and businesses powered. As mentioned, the grid is old, and there is a last-mile problem: substations were built for older workloads and can't handle more power being pushed through them. Some organizations are investing in carbon-fiber composite-core conductors instead of steel-core cables, but there is a long way to go before these solutions are in widespread use.
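The per-GPU figure bears a quick back-of-envelope check. A minimal sketch, using the article's ~30 kWh/day per-GPU number; the shipment rate of 500,000 GPUs per quarter is an assumed placeholder within the "hundreds of thousands" range, not a reported figure:

```python
# Back-of-envelope: annual energy use of one year's GPU shipments.
# ~30 kWh/day per GPU is from the article; 500,000 GPUs/quarter is
# an assumed placeholder, not a reported number.

KWH_PER_GPU_PER_DAY = 30
GPUS_PER_QUARTER = 500_000                      # assumption
gpus_per_year = GPUS_PER_QUARTER * 4

annual_kwh = gpus_per_year * KWH_PER_GPU_PER_DAY * 365
annual_twh = annual_kwh / 1e9                   # 1 TWh = 1e9 kWh

print(f"{gpus_per_year:,} GPUs a year draw about {annual_twh:.1f} TWh annually")
```

At those assumed numbers, a single year of shipments adds on the order of 20 TWh of annual demand, roughly a small country's electricity use, which is why each GPU generation compounds the grid problem.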
Already, there isn't enough power to go around to meet collective demand, and there are well-documented cases of this in West London and Dublin. For countries that are investing heavily in building data centers, the amount of electricity required isn't sustainable without change. While individuals need to take responsibility for their own energy use, business and government have a vital role to play: controlling usage, legislating for change, and investing in long-term solutions. The whole IT industry has had a pass in recent years. That must change. Tech leaders need to embrace and champion different ways of doing things, and get in front of governments and policymakers to redesign, revise, and remodel for more energy-efficient technology. Consider storage as an example: Pure Storage launched its first product, a 5TB system, more than 10 years ago. Since then, single-system capacity has increased by a factor of 1,200+ to 6PB, and the product is now physically smaller and requires less power despite (currently) holding 1,200x more data. If cars had improved at the same rate since 2013, today we would be able to drive around the earth in around ten minutes on a single tank of fuel. There needs to be massive change across these industries: electricity, data centers, networking, and the grid. They're all connected and all need to be updated. The data boom may have been the catalyst that highlighted the need to upgrade, but it will not stop or slow down. Consumer and business demand has reached such a pitch that it's obvious this industry will only grow. The only way to meet these needs is to invest in innovation, embrace change, and be sensible with the resources available.
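The storage comparison above can be made concrete. A minimal sketch, taking the 5 TB and 6 PB figures from the text and assuming a 12-year span (the article says only "more than 10 years"):

```python
# Growth factor of a single storage system (5 TB -> 6 PB) and the
# implied compound annual growth rate. The 12-year span is an assumption.

first_gen_tb = 5
current_pb = 6
factor = current_pb * 1_000 / first_gen_tb      # 1 PB = 1,000 TB

years = 12                                      # assumption
cagr = factor ** (1 / years) - 1

print(f"{factor:.0f}x capacity, about {cagr:.0%} compound annual growth")
```

A roughly 80% yearly capacity gain at flat or falling power draw is the kind of efficiency curve the article argues the electricity, networking, and grid industries now need.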
[9]
The U.S. AI Datacenter Boom Is So Massive That the Nation's Power Grid Is in a "Critical Bottleneck" Right Now; Big Tech Is Forced to Build Its Own Power Plants
While the AI hype advances at full force, the industry faces a massive issue: America's old power grids aren't capable of handling data center expansion. Big Tech firms like Meta, Amazon, and Google are rapidly expanding their AI capex to fuel their computing needs and achieve superiority in producing the best AI models. However, in this pursuit of 'AI perfection,' there's a lot at stake, and one of the more worrying aspects is how quickly energy demand is growing in the US. When you factor in power grids that were built back in the 70s, it becomes clear that America needs to think quickly about its energy needs, or the AI boom could prove devastating. A Goldman Sachs report claims that data center energy requirements are outpacing US grid upgrade cycles to the point of approaching a 'critical bottleneck,' and if the current administration doesn't come up with a viable action plan, America could face a power problem soon. A single interconnection request can now reach a whopping 5 gigawatts, equivalent to the power usage of over five million homes. Factor in that Big Tech sees gigawatt-scale centers as the new norm, and energy demand is expected to grow tremendously. Gartner predicts that more than 40% of US data centers might not have enough power to fully function, as collective power demand could reach 500 terawatt-hours by 2027. The massive electricity demand has also pushed power prices up noticeably: according to Axios, monthly electric bills in high-data-center regions could rise by $14 to $37 by 2040, not factoring in inflation. Firms like Google and Amazon cannot stop their AI expansion or they will fall behind in the race, and the only way they can ensure further expansion in America is to build their own power sources.
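The gigawatts-to-homes conversion is easy to reproduce. A minimal sketch, assuming an average US household uses about 10,800 kWh per year (an assumed figure; actual averages vary by region):

```python
# Convert a 5 GW interconnection request into an equivalent number of
# average homes. The household consumption figure is an assumption.

REQUEST_GW = 5
HOME_KWH_PER_YEAR = 10_800                      # assumption
home_avg_kw = HOME_KWH_PER_YEAR / (365 * 24)    # continuous average draw

homes_millions = REQUEST_GW * 1e6 / home_avg_kw / 1e6   # GW -> kW first
print(f"5 GW covers roughly {homes_millions:.1f} million homes at average draw")
```

Utility rules of thumb often quote closer to one million homes per gigawatt, which is how figures like "over five million homes" arise; the exact ratio depends entirely on the household draw assumed.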
This is already happening. Prominent examples include Meta building natural-gas-fired plants for its 'multi-GW' AI clusters and Microsoft backing the revival of the Three Mile Island nuclear plant for its supercomputers. Tech companies are moving into the energy business to work around the strained US electricity system. This might ease the pressure on the US grid in the longer term, but for now concerns remain: setting up a power plant is a multi-year project, while data center expansion is happening rapidly. America needs an effective solution, and interestingly, President Trump's 'AI action plan' addresses these concerns and proposes measures to shore up the US grid.
[10]
Why China's Energy Strategy Is Giving It The Edge In The AI Race
Given its voracious appetite for energy, it's safe to say that AI innovation is not bottlenecked by silicon; rather, it's gridlocked by power. Two of the biggest players in the sector appear to be on opposite ends of the spectrum on this issue. For a long time, the United States has remained entangled in fragmented energy regulation and a power grid increasingly incapable of supporting industrial-scale AI. Meanwhile, China is building the one thing AI truly needs to scale: energy infrastructure. The country has effectively tripled its renewable energy installations, transforming its grid at a speed unmatched anywhere else. In 2022 alone, data from Statista shows it added roughly as much solar capacity as the rest of the world combined. According to the International Energy Agency (IEA), China invested more than $625 billion in clean energy in 2024. By the end of July 2024, its installed wind and solar capacity had reached 1.206 billion kilowatts, meeting a target it had set for 2030 six years ahead of schedule. At this pace, the economic powerhouse is on track to reach 1,000 GW from solar power alone by 2026. Additionally, the IEA projects that the country will spend $88 billion on grid and storage investments in 2025, plus an additional $54 billion to improve coal generation. This shows the groundwork China is laying for a long-term advantage in AI deployment. The U.S., by contrast, is already hitting the grid ceiling.
Aging power grid holding back the U.S.
With concern rising and China pulling ahead, U.S. President Donald Trump has jumped into the conversation. In a major policy move, he signed a sweeping executive order to fast-track federal permitting, streamline reviews, and speed up the construction of all major AI infrastructure projects, including factories, data centers, and power plants.
"You're gonna go so fast, you're gonna say, 'wait a minute, this is too fast. I didn't expect it to go this quickly,'" Trump said, adding that the initiative would bring in "tens of billions of dollars" and ensure the U.S. enjoys "total industrial technological supremacy." This is the clearest sign yet that Washington is finally recognizing that the AI race is not just about smarter models but about who can build faster and scale harder. For a long time, the narrative around artificial intelligence was that supremacy could be achieved primarily through model architecture and advanced GPUs. But that narrative has played out. Compute capacity, the ability to train, fine-tune, and deploy large models at scale, will be key in determining who dominates the sector. That capacity, however, relies on access to vast amounts of stable energy, and on this front the U.S. is woefully underprepared. It's no secret that the country's power grid is plagued by bottlenecks and aging transmission lines, and it is facing heavy strain from electrification trends and the proliferation of data centers. Reports indicate that utilities are already overwhelmed by demand from AI-linked facilities, and some locations face multi-year delays for grid access. Things have gotten bad enough that America's largest data center markets, including Virginia and Texas, are now imposing moratoriums or rationing megawatt allocations.
Open-source momentum is shifting east
Since 2017, Beijing has followed a national roadmap to lead the world in AI. Startups like DeepSeek and Eshallgo are examples of how this strategy is being operationalized: they have favored lean, fast-deploying models over huge, resource-intensive training runs. This shift reflects a broader ideological divergence: the U.S. is optimizing for closed-source, centralized models with huge capital expenditures, while China is prioritizing efficiency and deployment over perfection.
Interestingly, even this centralized Chinese model is beginning to fragment, with open-source developers gaining steam. As American tech entrepreneur Balaji Srinivasan put it: "AI is decentralizing to Asia, too. Manus, DeepSeek, Qwen, Kimi. Interestingly, they are also decentralizing out of China. By going open source, and by physically moving out of China." Case in point: just two weeks after the release of Kimi K2, Alibaba's Qwen3-Coder had already surpassed it, despite being half the size and featuring double the context window. This shows that open-source development is rapidly reaching escape velocity, and deployment models are getting leaner and faster. In enterprise environments, speed of integration beats theoretical model superiority. China understands this. Its AI push isn't about building the "best" foundation model; it's about embedding intelligence into economic infrastructure right now. U.S. policymakers, on the other hand, are still fixated on semiconductor choke points. While export bans on Nvidia and AMD chips might temporarily delay Chinese training runs, they do nothing to address the U.S.'s domestic infrastructure shortcomings. If anything, such bans will only push China to speed up investments in self-sufficiency, steering companies toward custom chips optimized for specific use cases and energy-efficient deployment. There's no denying the United States' edge in foundational research, elite talent, and venture capital. Still, that advantage will become increasingly irrelevant if the outputs of that ecosystem can't scale in the physical world. Constrained GPU access is not the main problem; the real pain is the inability to run those GPUs at full capacity without risking grid instability, skyrocketing energy costs, or political backlash.
Infrastructure paralysis is the true bottleneck
U.S. grid limitations are already manifesting in tangible ways.
Energy costs are rising at double the rate of inflation, fueled not by generation expenses but by transmission and distribution constraints. According to a recent Lawrence Berkeley Lab study, retail energy revenues have increased over 20% since 2019 despite flat consumption, a sign that grid strain is inflating costs across the board. China's vertically integrated strategy, linking energy, compute, and enterprise software, gives it an asymmetrical advantage. And because deployment leads to data feedback loops, performance refinement, and long-tail monetization, first-mover status matters: each AI solution embedded today creates a platform for expansion tomorrow. Trump's executive order may mark a turning point, at least in tone. It acknowledges that building models alone isn't enough. Without fast-tracked permitting for data centers, energy infrastructure, and next-gen factories, the U.S. will remain mired in regulatory inertia. To counter China's momentum, the U.S. must broaden its AI policy lens. Chips matter, but energy infrastructure matters just as much, if not more. Federal investment in AI-aligned power infrastructure, especially clean, high-density energy sources like nuclear and utility-scale solar, must become a policy priority.
The rapid growth of AI data centers is straining the US power grid, causing concerns about infrastructure readiness and electricity costs. Meanwhile, China appears well-prepared to meet the energy demands of its AI ambitions.
The rapid expansion of artificial intelligence (AI) technologies is placing extraordinary demands on the US power grid, creating a critical bottleneck that threatens to impede the country's AI ambitions. Data centers, the backbone of AI infrastructure, are consuming unprecedented amounts of electricity, with projections indicating a potential tripling of US consumption by 2028 [1].
Source: Quartz
The US Department of Energy forecasts 20 GW of new data center load by 2030, with data centers expected to consume 6% of total US power production by 2028, up from 4% in 2023 [2]. This surge in demand is overwhelming grid operators, leading to connection delays and constraining data center growth. The situation has become so dire that some tech giants are resorting to building their own power plants to meet their energy needs [1].
The AI-driven power demand is not only reshaping the technology landscape but also impacting the broader economy. Electricity prices in the US have already climbed more than 30% since 2020, with projections warning of another 8% nationwide rise by 2030 [3]. In some states, like Virginia, the increase could reach 25%, raising concerns about who will bear the cost of necessary grid upgrades [3].
Source: Benzinga
In stark contrast to the US situation, China appears well-positioned to meet the energy demands of its AI ambitions. According to experts, China has solved its energy problem through massive investments in advanced hydropower and nuclear technologies [1]. The country maintains an 80% to 100% power reserve, allowing it to easily absorb the demand from hundreds of new data centers [1][4].
To address the growing demand, the US faces the need for substantial infrastructure investment. McKinsey projects that between 2025 and 2030, companies worldwide will need to invest $6 trillion into new data center capacity [4]. However, the US political and economic system may not be well-suited for such long-term investments, with private investors typically expecting returns within three to five years [4].
In response to these challenges, both tech companies and utilities are exploring innovative solutions. Some data center operators are adopting more efficient, low-emission designs and investing in liquid and natural air cooling to reduce energy use [2]. Utilities and regional network operators are implementing incentives and market mechanisms to minimize the impact of data center load [2].
Source: Tom's Hardware
The disparity in energy readiness between the US and China has significant implications for the global AI race. China's ability to "hit grand slams" in infrastructure development, coupled with its long-term, technocratic policy approach, gives it a substantial advantage [4]. This has led some experts to suggest that US policymakers should hope China remains a competitor rather than an aggressor, given the current inability to compete effectively on the energy infrastructure front [4].
As the AI industry continues to evolve rapidly, the ability to provide stable, abundant, and affordable energy will likely play a crucial role in determining which countries lead the next wave of technological innovation. The US faces a critical juncture, needing to address its power infrastructure challenges to maintain its competitive edge in the global AI landscape.
Summarized by Navi