3 Sources
[1]
The State of AI: Energy is king, and the US is falling behind
In the age of AI, the biggest barrier to progress isn't money but energy. That should be particularly worrying here in the US, where massive data centers are waiting to come online, and it doesn't look as if the country will build the steady power supply or infrastructure needed to serve them all.

It wasn't always like this. For about a decade before 2020, data centers were able to offset increased demand with efficiency improvements. Now, though, electricity demand is ticking up in the US, with billions of queries to popular AI models each day -- and efficiency gains aren't keeping pace. With too little new power capacity coming online, the strain is starting to show: Electricity bills are ballooning for people who live in places where data centers place a growing load on the grid.

If we want AI to have the chance to deliver on big promises without driving electricity prices sky-high for the rest of us, the US needs to learn some lessons from the rest of the world on energy abundance. Just look at China. China installed 429 GW of new power generation capacity in 2024, more than six times the net capacity added in the US during that time. China still generates much of its electricity with coal, but that makes up a declining share of the mix. Rather, the country is focused on installing solar, wind, nuclear, and gas at record rates.

The US, meanwhile, is focused on reviving its ailing coal industry. Coal-fired power plants are polluting and, crucially, expensive to run. Aging plants in the US are also less reliable than they used to be, generating electricity just 42% of the time, compared with a 61% capacity factor in 2014.

It's not a great situation. And unless the US changes something, we risk becoming consumers as opposed to innovators in both energy and AI tech. Already, China earns more from exporting renewables than the US does from oil and gas exports.
[2]
The AI revolution has a power problem
San Francisco (United States) (AFP) - In the race for AI dominance, American tech giants have the money and the chips, but their ambitions have hit a new obstacle: electric power.

"The biggest issue we are now having is not a compute glut, but it's the power and...the ability to get the builds done fast enough close to power," Microsoft CEO Satya Nadella acknowledged on a recent podcast with OpenAI chief Sam Altman. "So if you can't do that, you may actually have a bunch of chips sitting in inventory that I can't plug in," Nadella added.

Echoing the 1990s dotcom frenzy to build internet infrastructure, today's tech giants are spending unprecedented sums to construct the silicon backbone of the revolution in artificial intelligence. Google, Microsoft, AWS (Amazon), and Meta (Facebook) are drawing on their massive cash reserves to spend roughly $400 billion in 2025 and even more in 2026 -- backed for now by enthusiastic investors.

All this cash has helped alleviate one initial bottleneck: acquiring the millions of chips needed for the computing power race, and the tech giants are accelerating their in-house processor production as they seek to chase global leader Nvidia. These will go into the racks that fill the massive data centers -- which also consume enormous amounts of water for cooling. Building the massive information warehouses takes an average of two years in the United States; bringing new high-voltage power lines into service takes five to 10 years.

Energy wall

The "hyperscalers," as major tech companies are called in Silicon Valley, saw the energy wall coming. A year ago, Virginia's main utility provider, Dominion Energy, already had a data-center order book of 40 gigawatts -- equivalent to the output of 40 nuclear reactors. The capacity it must deploy in Virginia, the world's largest cloud computing hub, has since risen to 47 gigawatts, the company announced recently.

But some experts say the projections could be overblown. "Both the utilities and the tech companies have an incentive to embrace the rapid growth forecast for electricity use," Jonathan Koomey, a renowned expert from UC Berkeley, warned in September. As with the late 1990s internet bubble, "many data centers that are talked about and proposed and in some cases even announced will never get built."

Emergency coal

If the projected growth does materialize, it could create a 45-gigawatt shortage by 2028 -- equivalent to the consumption of 33 million American households, according to Morgan Stanley. Several US utilities have already delayed the closure of coal plants, despite coal being the most climate-polluting energy source. And natural gas, which powers 40 percent of data centers worldwide, according to the International Energy Agency, is experiencing renewed favor because it can be deployed quickly.

In the US state of Georgia, where data centers are multiplying, one utility has requested authorization to install 10 gigawatts of gas-powered generators. Some providers, as well as Elon Musk's startup xAI, have rushed to purchase used turbines from abroad to build capability quickly. Even recycling aircraft turbines, an old niche solution, is gaining traction.

"The real existential threat right now is not a degree of climate change. It's the fact that we could lose the AI arms race if we don't have enough power," Interior Secretary Doug Burgum argued in October.

Nuclear, solar, and space?

Tech giants are quietly downplaying their climate commitments. Google, for example, promised net-zero carbon emissions by 2030 but removed that pledge from its website in June. Instead, companies are promoting long-term projects.

Amazon is championing a nuclear revival through Small Modular Reactors (SMRs), an as-yet experimental technology that would be easier to build than conventional reactors. Google plans to restart a reactor in Iowa in 2029. And the Trump administration announced in late October an $80 billion investment to begin construction on ten conventional reactors by 2030.

Hyperscalers are also investing heavily in solar power and battery storage, particularly in California and Texas. The Texas grid operator plans to add approximately 100 gigawatts of capacity by 2030 from these technologies alone.

Finally, both Elon Musk, through his Starlink program, and Google have proposed putting chips in orbit in space, powered by solar energy. Google plans to conduct tests in 2027.
[3]
The AI revolution has a power problem
US tech giants face a new AI challenge: electricity shortages. Despite vast spending on chips and data centres, limited power and slow grid upgrades threaten progress. Firms are turning to coal, gas, and nuclear energy while softening climate pledges. Without enough power, America risks falling behind in the AI race.

In the race for AI dominance, American tech giants have the money and the chips, but their ambitions have hit a new obstacle: electric power.

"The biggest issue we are now having is not a compute glut, but it's the power and...the ability to get the builds done fast enough close to power," Microsoft CEO Satya Nadella acknowledged on a recent podcast with OpenAI chief Sam Altman. "So if you can't do that, you may actually have a bunch of chips sitting in inventory that I can't plug in," Nadella added.

Echoing the 1990s dotcom frenzy to build internet infrastructure, today's tech giants are spending unprecedented sums to construct the silicon backbone of the revolution in artificial intelligence. Google, Microsoft, AWS (Amazon), and Meta (Facebook) are drawing on their massive cash reserves to spend roughly $400 billion in 2025 and even more in 2026 -- backed for now by enthusiastic investors.

All this cash has helped alleviate one initial bottleneck: acquiring the millions of chips needed for the computing power race, and the tech giants are accelerating their in-house processor production as they seek to chase global leader Nvidia. These will go into the racks that fill the massive data centers -- which also consume enormous amounts of water for cooling. Building the massive information warehouses takes an average of two years in the United States; bringing new high-voltage power lines into service takes five to 10 years.

Energy wall

The "hyperscalers," as major tech companies are called in Silicon Valley, saw the energy wall coming. A year ago, Virginia's main utility provider, Dominion Energy, already had a data-center order book of 40 gigawatts -- equivalent to the output of 40 nuclear reactors. The capacity it must deploy in Virginia, the world's largest cloud computing hub, has since risen to 47 gigawatts, the company announced recently.

Already blamed for inflating household electricity bills, data centers in the United States could account for 7 percent to 12 percent of national consumption by 2030, up from 4 percent today, according to various studies.

But some experts say the projections could be overblown. "Both the utilities and the tech companies have an incentive to embrace the rapid growth forecast for electricity use," Jonathan Koomey, a renowned expert from UC Berkeley, warned in September. As with the late 1990s internet bubble, "many data centers that are talked about and proposed and in some cases even announced will never get built."

Emergency coal

If the projected growth does materialize, it could create a 45-gigawatt shortage by 2028 -- equivalent to the consumption of 33 million American households, according to Morgan Stanley. Several US utilities have already delayed the closure of coal plants, despite coal being the most climate-polluting energy source. And natural gas, which powers 40 percent of data centers worldwide, according to the International Energy Agency, is experiencing renewed favor because it can be deployed quickly.

In the US state of Georgia, where data centers are multiplying, one utility has requested authorization to install 10 gigawatts of gas-powered generators. Some providers, as well as Elon Musk's startup xAI, have rushed to purchase used turbines from abroad to build capability quickly. Even recycling aircraft turbines, an old niche solution, is gaining traction.

"The real existential threat right now is not a degree of climate change. It's the fact that we could lose the AI arms race if we don't have enough power," Interior Secretary Doug Burgum argued in October.

Nuclear, solar, and space?

Tech giants are quietly downplaying their climate commitments. Google, for example, promised net-zero carbon emissions by 2030 but removed that pledge from its website in June. Instead, companies are promoting long-term projects.

Amazon is championing a nuclear revival through Small Modular Reactors (SMRs), an as-yet experimental technology that would be easier to build than conventional reactors. Google plans to restart a reactor in Iowa in 2029. And the Trump administration announced in late October an $80 billion investment to begin construction on ten conventional reactors by 2030.

Hyperscalers are also investing heavily in solar power and battery storage, particularly in California and Texas. The Texas grid operator plans to add approximately 100 gigawatts of capacity by 2030 from these technologies alone.

Finally, both Elon Musk, through his Starlink program, and Google have proposed putting chips in orbit in space, powered by solar energy. Google plans to conduct tests in 2027.
US tech giants face a critical power shortage that threatens AI development, as massive data centers require more electricity than the grid can supply. While companies are spending roughly $400 billion on AI infrastructure in 2025 alone, energy constraints are forcing a renewed reliance on fossil fuels and a retreat from climate commitments.
The artificial intelligence revolution has encountered an unexpected roadblock: electricity. Despite unprecedented investments in computing infrastructure, American tech giants are discovering that energy constraints, not financial resources, represent the most significant barrier to AI advancement [1][2].
Microsoft CEO Satya Nadella recently acknowledged this challenge, stating that "the biggest issue we are now having is not a compute glut, but it's the power and the ability to get the builds done fast enough close to power." This admission highlights a fundamental shift in the AI infrastructure landscape, where companies may find themselves with "a bunch of chips sitting in inventory that I can't plug in" [2].
The scale of current AI infrastructure investment is staggering. Google, Microsoft, Amazon Web Services, and Meta are collectively spending approximately $400 billion in 2025, with even higher expenditures planned for 2026 [2][3]. This massive capital deployment has successfully addressed initial bottlenecks in chip acquisition, with tech giants accelerating in-house processor production to compete with industry leader Nvidia. However, the timeline mismatch between data center construction and power infrastructure development has created a critical bottleneck. While massive data centers can be built in approximately two years, bringing new high-voltage power lines into service requires five to ten years [2].

The energy demands of AI infrastructure are placing unprecedented strain on the electrical grid. In Virginia, the world's largest cloud computing hub, utility provider Dominion Energy's data center order book has grown from 40 gigawatts to 47 gigawatts in just one year -- equivalent to the output of 47 nuclear reactors [2].
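As a rough back-of-the-envelope check on such equivalences, the short sketch below converts gigawatts into reactor- and household-equivalents. The reference values (roughly 1 GW per large reactor and roughly 10,800 kWh per US household per year) are outside assumptions for illustration, not figures from the cited articles; the household conversion also cross-checks the Morgan Stanley estimate quoted in the sources, which puts a 45-gigawatt shortfall at about 33 million households.

```python
# Back-of-the-envelope conversions; the reference values are assumptions,
# not figures taken from the cited articles.
GW_PER_REACTOR = 1.0             # typical output of a large nuclear reactor, in GW
KWH_PER_HOUSEHOLD_YEAR = 10_800  # approximate average US household use, kWh/year
HOURS_PER_YEAR = 8_760

def reactor_equivalents(gigawatts: float) -> float:
    """How many ~1 GW reactors would be needed to supply this capacity."""
    return gigawatts / GW_PER_REACTOR

def household_equivalents(gigawatts: float) -> float:
    """Households served if the capacity were drawn continuously all year."""
    kwh_per_year = gigawatts * 1e6 * HOURS_PER_YEAR  # 1 GW = 1e6 kW
    return kwh_per_year / KWH_PER_HOUSEHOLD_YEAR

print(reactor_equivalents(47))          # Dominion's 47 GW order book ~ 47 reactors
print(household_equivalents(45) / 1e6)  # a 45 GW shortfall ~ 37 million households,
                                        # close to Morgan Stanley's 33 million figure
```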
Projections indicate that data centers could account for 7 to 12 percent of national electricity consumption by 2030, up from 4 percent today [3]. This growth is already impacting consumers, with electricity bills increasing in areas where data centers place growing loads on the grid [1].
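For a sense of what those shares mean in absolute terms, the sketch below translates them into terawatt-hours and average round-the-clock gigawatts, assuming total US consumption of roughly 4,100 TWh per year (an outside reference value, not a figure from the cited articles).

```python
# Translate consumption shares into TWh and average GW of continuous load.
# The US annual total is an assumed reference value, not from the cited articles.
US_TOTAL_TWH = 4_100
HOURS_PER_YEAR = 8_760

for share in (0.04, 0.07, 0.12):
    twh = US_TOTAL_TWH * share
    avg_gw = twh * 1_000 / HOURS_PER_YEAR  # TWh/year -> average GW
    print(f"{share:.0%}: ~{twh:.0f} TWh/yr, ~{avg_gw:.0f} GW of average load")

# Today's ~4% works out to roughly 19 GW of average load; 7-12% by 2030 would be
# roughly 33-56 GW, the same order as the 45 GW shortfall Morgan Stanley projects
# in the source articles above.
```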
The urgent need for power is forcing tech companies to compromise their environmental commitments. Several US utilities have delayed the closure of coal-fired power plants despite coal being the most climate-polluting energy source [2]. Natural gas, which powers 40 percent of data centers worldwide, is experiencing renewed favor due to its quick deployment capabilities. Google exemplifies this shift, having quietly removed its 2030 net-zero carbon emissions pledge from its website in June [2]. In Georgia, one utility has requested authorization to install 10 gigawatts of gas-powered generators to meet data center demand.
Despite immediate reliance on fossil fuels, tech giants are investing in long-term clean energy solutions. Amazon is championing Small Modular Reactors (SMRs), an experimental nuclear technology designed to be easier to build than conventional reactors. Google plans to restart a reactor in Iowa by 2029, while the Trump administration announced an $80 billion investment to begin construction on ten conventional reactors by 2030 [2].

Renewable energy investments are also accelerating, particularly in solar power and battery storage in California and Texas. The Texas grid operator plans to add approximately 100 gigawatts of capacity by 2030 from these technologies alone [2].

The energy challenge has significant implications for global AI competitiveness. China installed 429 gigawatts of new power generation capacity in 2024, more than six times the net capacity added in the US during the same period [1]. While China still relies heavily on coal, it is rapidly expanding solar, wind, nuclear, and gas capacity. Interior Secretary Doug Burgum framed the challenge in stark terms: "The real existential threat right now is not a degree of climate change. It's the fact that we could lose the AI arms race if we don't have enough power" [2]. This perspective reflects growing concerns that energy constraints could undermine America's technological leadership.

Summarized by Navi