Curated by THEOUTPOST
On Fri, 11 Apr, 12:04 AM UTC
19 Sources
[1]
Data centres will use twice as much energy by 2030 -- driven by AI
The electricity consumption of data centres is projected to more than double by 2030, according to a report from the International Energy Agency published today. The primary culprit? Artificial Intelligence (AI). The report covers the current energy footprint for data centres and forecasts their future needs, which could help governments, companies, and local communities to plan infrastructure and AI deployment. IEA's models project that data centres will use 945 terawatt-hours (TWh) in 2030, roughly equivalent to the current annual electricity consumption of Japan. By comparison, data centres consumed 415 TWh in 2024, roughly 1.5% of the world's total electricity consumption (see 'Global electricity growth'). The IEA report finds that the US, Europe, and China are collectively responsible for 85% of data centres' current energy consumption. Of the predicted growth in consumption, developing economies will account for around 5% by 2030, while advanced economies will account for more than 20% (see 'Data-centre energy growth'). The projections largely focus on data centres generally, which also run computing tasks other than AI. The report "is a bit vague when it comes to AI specifically," says Alex de Vries, a researcher at VU Amsterdam and the founder of Digiconomist, who was not involved with the report. The agency estimated the proportion of servers in data centres devoted to AI. They found that servers for AI accounted for 24% of server electricity demand and 15% of total data centre energy demand in 2024. De Vries thinks this is an underestimate. Even with these uncertainties, "we should be mindful about how much energy is ultimately being consumed by all these data centers," says de Vries. "Regardless of the exact number, we're talking several percentage of our global electricity consumption." Countries are building power plants and upgrading electricity grids to meet the forecasted energy demand for data centres. But the IEA estimates that 20% of planned centres could face delays being connected to the grid. The authors write that the projections come with uncertainty, partly because it is unclear to what extent people will use AI applications in the future. Data centres also currently collect and report their energy use in a limited way. The uptick in data centre electricity use "could be a serious risk for our ability to achieve our climate goals," says de Vries. Although two-thirds of planned electricity capacity is set to come from renewable sources, the IEA reports that new gas-fired plants in the US will drive an expansion in natural gas-fired capacity as well. "We're going to increase our reliance, or at least extend, our reliance on fossil fuels," adds de Vries. Electricity consumption and its projected growth are not uniform globally. Companies tend to build data centres in concentrated clusters, the IEA report describes, which could strain local grid systems.
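For readers who want to sanity-check those headline figures, the implied growth rate follows directly from the two quoted data points. A minimal Python sketch, assuming simple compound growth between 2024 and 2030 (the IEA's own scenario modelling is far more detailed than this):

DEMAND_2024_TWH = 415.0   # data-centre electricity use in 2024, per the IEA
DEMAND_2030_TWH = 945.0   # projected use in 2030, per the IEA
SHARE_2024 = 0.015        # ~1.5% of global electricity in 2024, per the IEA

years = 2030 - 2024
cagr = (DEMAND_2030_TWH / DEMAND_2024_TWH) ** (1 / years) - 1
global_2024_twh = DEMAND_2024_TWH / SHARE_2024  # implied global total in 2024

print(f"Implied annual growth rate: {cagr:.1%}")                            # ~14.7% per year
print(f"Implied global electricity use, 2024: {global_2024_twh:,.0f} TWh")  # ~27,700 TWh

In other words, the projection amounts to data-centre demand growing at roughly 15% a year through the decade.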
[2]
AI Will Drive Doubling of Data Center Energy Demand by 2030
The electricity consumption of data centres is projected to more than double by 2030, according to a report from the International Energy Agency published today. The primary culprit? Artificial Intelligence (AI). The report covers the current energy footprint for data centres and forecasts their future needs, which could help governments, companies, and local communities to plan infrastructure and AI deployment. IEA's models project that data centres will use 945 terawatt-hours (TWh) in 2030, roughly equivalent to the current annual electricity consumption of Japan. By comparison, data centres consumed 415 TWh in 2024, roughly 1.5% of the world's total electricity consumption (see 'Global electricity growth'). The projections largely focus on data centres, which also run computing tasks other than AI. The agency estimated the proportion of servers in data centres devoted to AI, finding that AI servers accounted for 24% of server electricity demand and 15% of total data centre energy demand in 2024. Alex de Vries, a researcher at VU Amsterdam and the founder of Digiconomist, who was not involved with the report, thinks this is an underestimate. The report "is a bit vague when it comes to AI specifically," he says. Even with these uncertainties, "we should be mindful about how much energy is ultimately being consumed by all these data centers," says de Vries. "Regardless of the exact number, we're talking several percentage of our global electricity consumption." The IEA report finds that the US, Europe, and China are collectively responsible for 85% of data centres' current energy consumption. Of the predicted growth in consumption, developing economies will account for around 5% by 2030, while advanced economies will account for more than 20% (see 'Data-centre energy growth'). Countries are building power plants and upgrading electricity grids to meet the forecasted energy demand for data centres. But the IEA estimates that 20% of planned centres could face delays being connected to the grid. The authors write that the projections come with uncertainty, partly because it is unclear to what extent people will use AI applications in the future. Data centres also collect and report their energy use in a limited way. The uptick in data centre electricity use "could be a serious risk for our ability to achieve our climate goals," says de Vries. Although two-thirds of planned electricity capacity is set to come from renewable sources, the IEA reports that new gas-fired plants in the US will drive an expansion in natural gas-fired capacity as well. "We're going to increase our reliance, or at least extend, our reliance on fossil fuels," adds de Vries. Electricity consumption and its projected growth are not uniform globally. Companies tend to build data centres in concentrated clusters, the IEA report describes, which could strain local grid systems.
[3]
AI and Data Centers Could Use as Much Energy as Japan by 2030
Data center growth spurred largely by generative AI is expected to cause global energy demand to surge in the coming years, and a report this week from the International Energy Agency provides new estimates on just how much. The amount of electricity needed for data centers worldwide is projected to more than double to roughly 945 terawatt-hours in 2030, which is more than what the entire country of Japan consumes today, according to the report. The popularity of gen AI tools like OpenAI's ChatGPT and Google's Gemini has soared in the past few years. These large language models and their kin require a huge amount of computing power, running on high-end graphics processing units like those manufactured by Nvidia. These not only need a lot of electricity to operate but they also generate heat -- meaning even more energy is required to keep them cool. All of that adds up quickly. About half of the increased demand for electricity in the US by 2030 is expected to be for data centers. In the US, processing data is expected to need more electricity than manufacturing all "energy-intensive goods" together -- aluminum, steel, cement and chemicals, the IEA said. "AI is one of the biggest stories in the energy world today -- but until now, policymakers and markets lacked the tools to fully understand the wide-ranging impacts," IEA Executive Director Fatih Birol said in a press release. The anticipated energy demand for AI has been well-documented and on the minds of policymakers and experts for years. Just this week, the House Energy and Commerce Committee held a hearing on how to provide enough energy for data centers as members of Congress try to understand the issue. Experts said the US electricity system will need improvements and changes to meet rising demand after decades of relatively flat consumption. One area of focus for the committee was on providing "baseload power" -- a consistent electricity supply throughout the day. That desire for consistency prompted questions from members about whether renewable energy sources like solar and wind would work or whether the rising demand would have to be met by new power plants burning fossil fuels like natural gas. The burning of those fossil fuels is a major driver of climate change. There's also the question of whether the existing US power grid can handle the demand and additional supply necessary. "Even without the anticipation of rapidly increasing electricity demand, the US power grid is in need of modernization investments," testified Melissa Lott, a professor at the Climate School at Columbia University. "The recent forecast for rapidly increasing power demand make these investments even more urgent and necessary." The timing of electricity availability, given surges in demand at times like heat waves and the daily up-and-down of solar generation, is a problem that can be dealt with in various ways. Lott pointed to energy efficiency efforts, like Energy Star appliances, and demand-reduction efforts also known as virtual power plants. Those can decrease energy demand and level out the peaks and valleys throughout the day. The IEA report suggests generative AI could, over time, help fix the problem caused by its energy demand. If AI accelerates science and technology improvements, it could lead to better solar panels and batteries, for example. "Energy innovation challenges are characterized by the kinds of problems AI is good at solving," the IEA report said.
But another solution might lie in how AI data centers use power. A report earlier this year from researchers at Duke University suggested that AI data centers can more easily be turned off and on to adjust to the needs of the electrical system, allowing grid operators to accommodate growth more easily. "This analysis should not be interpreted to suggest the United States can fully meet its near and medium-term electricity demands without building new peaking capacity or expanding the grid," the researchers wrote. "Rather, it highlights that flexible load strategies can help tap existing headroom to more quickly integrate new loads, reduce the cost of capacity expansion, and enable greater focus on the highest-value investments in the electric power system."
[4]
Datacenter energy usage to more than double in next 5 years
No worries, just use neural networks to optimize systems powering neural networks. Global datacenter electricity use is set to more than double by 2030 - slightly surpassing Japan's total consumption - with AI named as the biggest driver. That's the conclusion of the International Energy Agency, which recommends energy sector policies for multiple governments. The Paris-based agency is also banking on the same artificial intelligence to help ease the energy crunch the tech itself is fueling. The IEA this week published a sweeping study on the crucial intersection of AI and the energy it gobbles up, and believes datacenters will drive more than 20 percent of electricity demand growth in advanced economies over the next five years, with AI driving most of that energy consumption. The US, which accounted for 45 percent of global datacenter electricity consumption in 2024, according to the agency, is expected to see that share grow significantly by 2030. By then, American datacenters are set to consume more electricity than the country's entire energy-intensive manufacturing sector, including aluminium, steel, cement, and chemicals. "In the United States, datacenters are on course to account for almost half of the growth in electricity demand" over the next five years, said IEA executive director Fatih Birol. "With the rise of AI, the energy sector is at the forefront of one of the most important technological revolutions of our time." The report points to the need for more energy generation (renewable, ideally) to meet the demand - something that we've covered before and has been well established of late - but with a bit of a different take: Optimism that we can use AI to optimize systems to offset a lot of the emissions, if we use it right. "AI is a tool, potentially an incredibly powerful one, but it is up to us - our societies, governments and companies - how we use it," Birol said. The report cites several ways AI is already being used by the energy industry - improving generation, transmission, and consumption; and helping the oil and gas industry optimize exploration and production. It also calls for wider adoption of AI to detect faults in power grids, fine-tune heating and cooling systems, and boost industrial efficiency. "Energy is amongst the most complex and critical sectors in the world today, yet it can and should do more to seize the potential benefits of harnessing AI," the report said, adding that policy and regulatory shifts will be needed to unlock those gains. As for concerns that AI is making it harder to tackle climate change - by ramping up energy use and driving more fossil fuel burn to keep the machines humming - the IEA says those fears are overstated. While datacenters are among the fastest-growing sources of emissions, their share of total energy sector emissions through 2035 remains below 1.5 percent, even in the agency's worst-case scenario. On the flip side, the IEA doesn't expect AI to solve the climate change conundrum either. "The widespread adoption of existing AI applications could lead to emissions reductions that are far larger than emissions from datacenters," the IEA said, "but also far smaller than what is needed to address climate change." The IEA estimates that emissions reductions enabled by the widespread use of existing AI solutions could be equivalent to about five percent of global energy-related emissions by 2035.
It's a dent, but nowhere near enough to seriously move the needle on climate change. The double-edged AI sword hanging over the energy sector - where ramping up AI adoption drives energy demand, even as the tech is touted as a tool to curb emissions - has a whole other element that has to be factored into calculations: Chip production. According to a report from Greenpeace published this week, global electricity consumption from AI chip manufacturing surged by more than 350 percent between 2023 and 2024, and could increase by as much as 170-fold by 2030 compared to 2023 levels. That would be enough for chipmaking - largely concentrated in Taiwan, South Korea, and Japan - to consume more electricity than Ireland does today, the group said. To make matters worse, most chipmakers in East Asia are meeting rising energy demand with fossil fuels. The report cited South Korea's approval of a one-gigawatt liquefied natural gas (LNG) plant for SK hynix and plans to build 3 GW of LNG capacity for Samsung. In Taiwan, the government is using increased power demand from the semiconductor and AI sectors to justify expanding LNG projects and grid infrastructure. "While fabless hardware companies like Nvidia and AMD are reaping billions from the AI boom, they are neglecting the climate impact of their supply chains in East Asia," Greenpeace East Asia supply chain project lead Katrin Wu said of the report. "[This is] demand that could, and should, be met by renewable energy sources." The IEA report noted that while hardware manufacturing for AI is energy intensive, it still "accounts for less energy than the operation phase" over a product's lifecycle. Not exactly reassuring, given the current surge in energy use for chip production - and the growing demand for AI hardware shows no sign of slowing down. But hey, as with COVID-19, if emissions don't get reported, maybe they didn't happen. The US Environmental Protection Agency (EPA) is reportedly taking a familiar approach: Cutting back transparency on greenhouse gas reporting. According to ProPublica, the EPA is reportedly drafting a rule to gut long-standing greenhouse gas reporting requirements, cutting the number of industrial sectors required to report emissions from 41 down to just one. That could make any data on how well AI is helping fight climate change basically worthless - if no one's tracking emissions, there's not much to measure. Great news for chipmakers expanding in the US - after all, it's hard to be blamed for contributing to climate change when no one's collecting the receipts. ®
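On the Greenpeace chip-manufacturing figures above, a rough Python sketch shows what a 170-fold rise over seven years implies on average. The 4.5x reading of "surged by more than 350 percent" is our assumption, and the 170-fold figure is Greenpeace's stated upper bound:

MULTIPLE_2030_VS_2023 = 170   # Greenpeace's upper-bound multiple for AI chipmaking power use
GROWTH_2023_TO_2024 = 4.5     # "more than 350%" in one year, read here as roughly 4.5x (assumption)
YEARS = 2030 - 2023

implied_annual = MULTIPLE_2030_VS_2023 ** (1 / YEARS)
print(f"Average annual growth factor implied by 170x over {YEARS} years: {implied_annual:.2f}x")  # ~2.1x
print(f"Reported first-year jump, for comparison: {GROWTH_2023_TO_2024:.1f}x")

So even the upper-bound scenario implies growth roughly doubling every year, somewhat slower than the jump reported for 2023-2024.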
[5]
AI energy demand to climb in 2025-26 despite efficiency gains | Insights | Bloomberg Professional Services
This analysis is by Bloomberg Intelligence Senior Industry Analysts Rob Barnett and Patricio Alvarez with contributing analysis by Alessio Mastrandrea. It appeared first on the Bloomberg Terminal. US power demand from data centers could surge 20-40% in 2025 with strong double-digit growth likely to persist in 2026-30, based on our analysis, despite efficiency gains from Ant Group, DeepSeek and others. Rapid power demand might lift consensus revenue for First Solar, Enphase and energy peers like RWE that supply solar, gas and battery storage. Efficiency gains unlikely to curb energy appetite Rapid growth in power demand from AI data centers might eventually be curbed if more efficient algorithms and processors are deployed. Ant Group on March 24 said it used Chinese-made semiconductors to develop new AI models that promise to cut costs by about 20%. We believe energy use will decline by a similar amount, based on the company's claims. New algorithms developed by DeepSeek might also deter power demand growth from AI, yet near-term rapid consumer adoption of AI tools may overwhelm promised efficiency gains.
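To illustrate that last point numerically, here is a hedged Python sketch of how a one-off ~20% efficiency gain interacts with sustained demand growth. The 30% and 15% growth rates are assumptions chosen for the illustration (a mid-point of the 20-40% range and a representative "strong double-digit" rate), not Bloomberg Intelligence figures:

baseline_2024 = 100.0      # index of US data-centre power demand, 2024 = 100
growth_2025 = 0.30         # assumed mid-point of the 20-40% surge cited for 2025
growth_2026_30 = 0.15      # assumed "strong double-digit" growth for 2026-2030
efficiency_gain = 0.20     # ~20% less energy per unit of AI work (the Ant Group claim)

demand = baseline_2024 * (1 + growth_2025)
for _ in range(2026, 2031):
    demand *= (1 + growth_2026_30)

print(f"2030 demand index, no efficiency gain: {demand:.0f}")                          # ~262
print(f"2030 demand index, 20% gain applied:   {demand * (1 - efficiency_gain):.0f}")  # ~209

Even with the gain applied, the index ends far above the 2024 baseline, which is the sense in which adoption can overwhelm efficiency improvements.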
[6]
Hyperscale sustainability is looking like a Hail Mary
Carbon capture, SMRs, fusion power - tech titans' climate strategies are packed with moon shots. AI's appetite for power is exploding. Hyperscalers have only just begun to adopt Nvidia's 120 kW-per-rack systems, and the GPU giant is already charting a course toward 600 kW designs. Faced with that kind of surging demand, Big Tech's environmental sustainability pledges are starting to look less like strategy and more like a last-ditch Hail Mary. By 2030, Microsoft has pledged to be carbon negative and Google has committed to a net-zero carbon footprint across its value chain. Meanwhile, e-commerce giant Amazon is taking a slightly longer road, targeting net-zero emissions by 2040. Laudable goals, sure, but as we've previously covered, the cloud titans' greenhouse gas emissions continue to grow year after year. Last year, Microsoft revealed its CO2 emissions had increased nearly 30 percent since 2020, while Amazon's and Google's were up 34.5 and 48 percent from 2019. Just how much of that is attributable to the AI boom, it's hard to say. None of these companies break out their datacenter emissions -- but it's hard to imagine the tens of thousands of power-hungry GPUs they've deployed over the past few years haven't made a sizable contribution. For context, as we reported earlier today, while datacenters are among the fastest-growing sources of emissions, their share of total energy sector emissions through 2035 is expected to be below 1.5 percent, according to the International Energy Agency. In absolute terms, that works out to roughly 300 to 500 megatonnes. With that in mind, and whether or not you see Big Tech's eco-pledges to the world as more PR drives than anything else, the fact is they made those promises, and promises can be broken. And in an attempt to stick to those goals, hyperscalers are increasingly hedging their bets on ever-more exotic methods of offsetting the emissions of their ever-growing datacenter empires. Earlier this month, Microsoft signed up for another bundle of carbon capture credits as part of a tie-up with Terradot. Under the agreement, Terradot has committed to removing 12,000 tons of CO2 from the Earth's atmosphere between 2026 and 2029 through a process called enhanced rock weathering (ERW). Theorized in the 1990s, the process involves using mineral reactions to sequester atmospheric carbon in rock. This process normally happens naturally over the course of millions of years to form minerals, such as calcium carbonate. Startups like Terradot, however, aim to accelerate this process, hence the term "enhanced" rock weathering. As we understand it, Terradot achieves this speed-up by spreading finely ground minerals over large swaths of land. It's believed the larger surface area means the minerals can react with carbon absorbed by rainwater to form new carbon-rich minerals, potentially an order of magnitude faster. The collaboration builds on Microsoft's earlier carbon capture partnership with 1PointFive for 500,000 metric tons of CO2 by 2030, using the company's direct air capture (DAC) approach to pull carbon from the air and store it in subsurface saltwater reservoirs. Microsoft is far from the only hyperscaler piling onto carbon capture. Amazon has also been working with 1PointFive and CarbonCapture Inc going back to 2023. Meanwhile, last year Google tapped startup Holocene's DAC tech to help it clean up its act sometime in the 2030s. However, the tech remains in its infancy.
As climate researchers at MIT wrote in a 2023 article, while the idea behind enhanced rock weathering is sound, it is far from proven. One of the biggest challenges to ERW is that mining, grinding, and transporting rock requires a lot of energy. So, in order for ERW to have a net benefit, it needs to absorb more CO2 than it generates. There's some evidence to suggest that many of the best mineral candidates for ERW, like olivine, could end up adding CO2 to the atmosphere through secondary reactions with iron. DAC-based capture suffers from many of the same core challenges, as the technique is far less effective than capturing carbon at its source, the researchers explained. "The concentration of CO2 in the air is about 300 times less than in the smokestacks of power plants or industrial plants, making it much less efficient to capture. Because of this, DAC is quite expensive today," they wrote in a blog post. However, many believe that given time the cost of DAC carbon capture will come down, with Google hoping to see DAC reach $100 per ton by the early 2030s. However, unless hyperscalers scale up their carbon capture investments, it's unlikely to make a significant dent in their promises. In its most recent sustainability report, Microsoft said in 2023 alone it had contracted to capture 5 million metric tons of CO2 over a 15-year period. That might sound great until you realize the Windows maker generated 17.2 million metric tons of CO2 equivalent in fiscal year 2023 alone. Of course, hyperscalers aren't putting all of their sustainability eggs in the carbon capture basket. They're also betting big on nuclear energy to offset their massive GPU deployments and bail out their net-zero and carbon-negative targets. These investments range anywhere from behind-the-meter deployments at existing power plants and rehabbing old reactors, to hedging their bets on small modular reactors and full-on moonshot fusion power promises. Two of the more realistic nuclear power datacenter projects are being led by Amazon and Microsoft. You may recall that early last year, the e-commerce turned cloud giant acquired Cumulus Data's atomic datacenters for $650 million. The acquisition brought AWS access to up to 960 megawatts of the Susquehanna nuclear power plant's total capacity, though since the announcement, the cloud provider has run into some speed bumps over just how much capacity it can actually claim. Similarly, Microsoft late last summer teamed up with Constellation Energy to reignite the idled Three Mile Island Unit 1 nuclear plant as part of a 20-year power purchase agreement. Before its retirement, TMI Unit 1 had a generating capacity of 837 megawatts, with Microsoft expected to draw a large portion of that capacity to power its increasingly power-hungry AI infrastructure. Analysts believe the plant could curb carbon emissions in the region by as much as 3 million tons a year. Microsoft was also one of the first to throw its weight behind small modular reactors (SMRs) -- tiny nuclear fission systems capable of producing tens to hundreds of megawatts of power depending on the design. The idea is that Redmond could deploy a couple of these in the area surrounding new datacenter builds to offset their impact on the energy grid. In 2023, Microsoft posted positions for a program manager to oversee the deployment of SMRs and microreactors to power its growing cloud and AI businesses. Since then, Amazon and Google have announced plans to employ SMRs from X-Energy and Kairos Power, respectively.
Even Oracle's Larry Ellison is getting in on the fun, boasting plans last year to deploy a trio of SMRs to power a gigawatt-scale AI datacenter. But if the idea of thousands of tiny atomic power plants dotting the countryside sounds far-fetched, SMRs aren't even the weirdest nuclear-themed hedge hyperscalers have made in recent years. Back in 2023, Redmond optimistically signed up to purchase power from Sam Altman-backed fusion startup Helion Energy, which promised to begin delivering clean helium-3-based power in or about 2028. The only rub is that, nearly two years later, Helion still hasn't figured out the whole power part of the equation. When we spoke with them in 2023, they assured us their latest-generation reactor was on track to begin producing power by 2024. Yet, as of January, the startup's reactor has yet to produce a meaningful number of electrons, much less the 50 megawatts of power promised. And herein lies the problem. While fusion energy, and SMRs for that matter, may be a long-term solution to hyperscalers' unrelenting energy demands, the technology is still years and years away from being a viable alternative to, say, fossil fuels. Even Three Mile Island's Unit One fission reactor isn't expected to be fully operational until 2028, and, the way things are going, may only be enough to offset the energy demands of modern GPU bit barns. Large-scale deployments of small modular reactors, meanwhile, aren't expected until the late 2020s or mid-2030s, depending on whom you ask. Google's SMR supplier doesn't expect to have an operational fission reactor until at least 2030, while Amazon and X-Energy are looking at 2039. Even then, the economics of SMRs remain dicey, with some arguing the technology is "too expensive, too slow to build, and too risky to play a significant role in transitioning away from fossil fuels." While the cloud giants wait for the power of the atom to have its second renaissance, they continue to dump considerable capital into solar and wind energy. Last month, Amazon racked up some 870 megawatts of wind and solar power purchase agreements across Spain, while Microsoft in February signed a power purchase agreement, or PPA, for 389 megawatts worth of solar power in the US. Google, meanwhile, celebrated the end of 2024 by forming an alliance to build industrial parks powered by clean energy, presumably so it has a guilt-free place to put its datacenters. The challenge with wind and solar is that they aren't exactly conducive to running a facility requiring tens, hundreds or thousands of megawatts at all hours of the day. In the best-case scenario, solar only generates peak power during daylight hours, and wind only when it's ... well, you know, blowing. Hence the big batteries you see in the news, to store surplus power until it's needed. With that said, Google is also augmenting its power supply with geothermal energy. The plant, developed by Fervo Energy, is expected to deliver up to 115MW of power when complete, which would have sounded like a lot three years ago, but doesn't even raise an eyebrow today. So if wind and solar aren't enough and nuclear fusion and SMRs are still years away, what is there to be done? Well, the AI bubble hasn't burst yet, so the strategy appears quite simple, albeit not exactly sustainable: "Burn baby burn." At the CERAWeek global energy conference in Houston last month, Microsoft made it clear it wasn't above deploying more natural gas plants to meet its energy needs.
Microsoft has previously funded generation plants in power-constrained locales like Dublin, where it operates a 170 megawatt gas plant. Facebook's parent Meta has also embraced natural gas to fuel its own AI model development. The company's forthcoming datacenter campus in Louisiana will be powered by a new 2.2 gigawatt natural gas plant. Meanwhile, developers in Pennsylvania announced plans to convert a former coal plant into a 4.5 gigawatt natural gas-fired datacenter campus. Suffice it to say, given the choice between sticking with their sustainability goals or leading the AI race, we know which we'll be betting on. ®
[7]
An answer to AI's energy addiction? More AI, says the IEA
The International Energy Agency (IEA) has published its first major report on the AI gold rush's impact on global energy consumption -- and its findings paint a worrying, and perhaps contradictory, picture. Electricity use from data centres, including for artificial intelligence applications, is predicted to double over the next five years to around 3% of global electricity use. AI-specific power consumption could drive over half of this growth globally, the report found. Some data centres today consume as much electricity as 100,000 households. The hyperscalers of the future could gobble up 20x that number, according to the IEA. By 2030, data centres are predicted to run on 50% renewable energy, the rest comprising a mix of coal, nuclear power, and new natural gas-fired plants. The findings paint a bleak picture for the climate, but there's a silver lining, the IEA said. While AI is set to gobble up more energy, its ability to unlock efficiencies from power systems and discover new materials could provide a counterweight. "With the rise of AI, the energy sector is at the forefront of one of the most important technological revolutions of our time," said Fatih Birol, IEA's executive director. "AI is a tool, potentially an incredibly powerful one, but it is up to us - our societies, governments, and companies - how we use it." AI can help to optimise power grids, increase the energy output of solar and wind farms through better weather forecasting, and detect leaks in vital infrastructure. The technology could also be used to more effectively plan transport routes or design cities. AI also has the potential to discover new green materials for tech like batteries. However, the IEA warned that the combined impact of these AI-powered solutions would be "marginal" unless governments create the necessary "enabling conditions." "The net impact of AI on emissions - and therefore climate change - will depend on how AI applications are rolled out, what incentives and business cases arise, and how regulatory frameworks respond to the evolving AI landscape," the report said. While AI could, theoretically, curb energy use, major questions remain. Meanwhile, the technology's negative climate impact is already set in. The IEA predicts data centres will contribute 1.4% of global "combustion emissions" by 2030, almost triple today's figure and nearly as much as air travel. While that doesn't sound like much, the IEA's figure doesn't account for the embodied emissions created from constructing all those new data centres and producing all the materials therein. Alex de Vries, a researcher at VU Amsterdam and the founder of Digiconomist, told Nature that he thinks the IEA has underestimated the growth in AI's energy consumption. "Regardless of the exact number, we're talking several percentage of our global electricity consumption," said de Vries. This uptick in data centre electricity use "could be a serious risk for our ability to achieve our climate goals," he added. Claude Turmes, Luxembourg's energy minister, accused the IEA of presenting an overly optimistic view and not addressing the tough realities that policymakers need to hear. "Instead of making practical recommendations to governments on how to regulate and thus minimise the huge negative impact of AI and new mega data centres on the energy system, the IEA and its [executive director] Fatih Birol are making a welcome gift to the new Trump administration and the tech companies which sponsored this new US government," he told the Guardian.
Aside from AI, there are more proven ways to curb energy use from data centres. These include immersion cooling, pioneered by startups like Netherlands-based Asperitas, Spain's Submer, and UK-based Iceotope. Another is repurposing data centre heat for other applications, which is the value proposition of UK venture DeepGreen. All of these weird and wonderful solutions will need to scale up fast if they are to make a dent in data centres' thirst for electricity. Ultimately, we also need to start using computing power more wisely.
[8]
Global AI to gulp 945 TWh power by 2030, with US, China leading charge
The role of nuclear power in supplying electricity to data centers is projected to grow by 2035. Nuclear power's contribution to the data center electricity supply is likely to increase by 2035, particularly in the US and China, according to a report by the IEA. The IEA's comprehensive special report, Energy and AI, projects that electricity demand from data centers will more than double to around 945 TWh by 2030, slightly more than Japan's entire electricity demand today. According to the report, AI will drive that increase, with electricity demand from AI-optimized data centers projected to more than quadruple by 2030.
[9]
Energy demands from AI datacentres to quadruple by 2030, says report
The IEA forecast indicates a sharp rise in the requirements of AI, but said the threat to the climate was 'overstated'. The global rush to AI technology will require almost as much energy by the end of this decade as Japan uses today, but only about half of the demand is likely to be met from renewable sources. Processing data, mainly for AI, will consume more electricity in the US alone by 2030 than manufacturing steel, cement, chemicals and all other energy-intensive goods combined, according to a report from the International Energy Agency (IEA). Global electricity demand from datacentres will more than double by 2030, according to the report. AI will be the main driver of that increase, with demand from dedicated AI datacentres alone forecast to more than quadruple. One datacentre today consumes as much electricity as 100,000 households, but some of those currently under construction will require 20 times more. But fears that the rapid adoption of AI will destroy hopes of tackling the climate crisis have been "overstated", according to the report, which was published on Thursday. That is because harnessing AI to make energy use and other activities more efficient could result in savings that reduce greenhouse gas emissions overall. Fatih Birol, the executive director of the IEA, said: "With the rise of AI, the energy sector is at the forefront of one of the most important technological revolutions of our time. AI is a tool, potentially an incredibly powerful one, but it is up to us - our societies, governments and companies - how we use it." Using AI could make it easier to design electricity grids to take more renewable energy. Most grids were designed for centralised fossil fuel power stations that produce reliable levels of electricity, some of which can be turned off and on relatively quickly. They have to be redesigned to balance demand when more of the supply comes from intermittent and sometimes unpredictable sources, such as wind and solar power. Finding efficiencies within energy systems, and in industrial processes, could also become easier with AI. At present, huge opportunities to increase efficiency are missed, because it is harder for companies to change their processes than to carry on with wasteful practices. AI could also assist with new technologies such as driverless vehicles or detecting threats to vital infrastructure. The technology could also be used to plan public transport to optimise for people's journeys, or to design cities or traffic systems. Mining companies could use AI to discover and exploit reserves of critical minerals, which are crucial to modern renewable energy components such as solar panels, wind turbines and electric vehicles. These uses could offset some of the massive demands that AI will place on the world's energy systems. But that is likely to require greater direction from governments, the IEA report found. Left alone, the rapid growth of AI could prove a severe problem for energy systems and the environment. AI has the potential to reverse all the gains made in recent years in advanced economies to reduce their energy use, mainly through efficiencies. The rapid increase in AI also means companies will seek the most readily available energy - which could come from gas plants, which were on their way out in many developed countries. In the US, the demand could even be met by coal-fired power stations being given a new lease on life, aided by Donald Trump's enthusiasm for them.
Done badly, AI could also suck water from some of the world's driest areas, an investigation by the Guardian revealed, as many AI datacentres use vast quantities of fresh water for cooling their computers. Claude Turmes, a former Green MEP and energy minister for Luxembourg, said the disadvantages of AI were more likely to materialise than the optimistic projections of the IEA, and governments needed much more help to avoid the pitfalls. He accused the IEA of painting too rosy a picture and failing to spell out harsh truths to policymakers. He said: "Instead of making practical recommendations to governments on how to regulate and thus minimise the huge negative impact of AI and new mega datacentres on the energy system, the IEA and its [chief] Fatih Birol are making a welcome gift to the new Trump administration and the tech companies which sponsored this new US government."
[10]
AI surge to double data center electricity demand by 2030: IEA
Electricity consumption by data centers will more than double by 2030, driven by artificial intelligence applications that will create new challenges for energy security and CO2 emission goals, the IEA said Thursday. At the same time, AI can unlock opportunities to produce and consume electricity more efficiently, the International Energy Agency (IEA) said in its first report on the energy implications of AI. Data centers represented about 1.5% of global electricity consumption in 2024, but that has increased by 12% annually over the past five years. Generative AI requires colossal computing power to process information accumulated in gigantic databases. Together, the United States, Europe, and China currently account for about 85% of data center consumption. Big tech companies increasingly recognize their growing need for power. Google last year signed a deal to get electricity from small nuclear reactors to help power its part in the artificial intelligence race. Microsoft is to use energy from new reactors at Three Mile Island, the site of America's worst nuclear accident, which went through a meltdown in 1979. Amazon also signed an accord last year to use nuclear power for its data centers. At the current rate, data centers will consume about 3% of global electricity by 2030, the report said. According to the IEA, data center electricity consumption will reach about 945 terawatt hours (TWh) by 2030. "This is slightly more than Japan's total electricity consumption today. AI is the most important driver of this growth, alongside growing demand for other digital services," said the report. One 100 megawatt data center can use as much power as 100,000 households, the report said. But it highlighted that new data centers, already under construction, could use as much electricity as two million households. The Paris-based energy policy advisory group said that "artificial intelligence has the potential to transform the energy sector in the coming decade, driving a surge in electricity demand from data centers worldwide, while also unlocking significant opportunities to cut costs, enhance competitiveness, and reduce emissions". Hoping to keep ahead of China in the field of artificial intelligence, US President Donald Trump has launched the creation of a "National Council for Energy Dominance" tasked with boosting electricity production. Right now, coal provides about 30% of the energy needed to power data centers, but renewables and natural gas will increase their shares because of their lower costs and wider availability in key markets. The growth of data centers will inevitably increase carbon emissions linked to electricity consumption, from 180 million tonnes of CO2 today to 300 million tonnes by 2035, the IEA said. That remains a minimal share of the 41.6 billion tonnes of global emissions estimated in 2024.
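The 3% figure squares with the other numbers quoted in the piece. A minimal Python check, assuming the reported 12% annual growth continues and the global electricity base stays roughly flat (an assumption of this sketch, not of the IEA):

share_2024 = 1.5       # % of global electricity used by data centres in 2024
annual_growth = 0.12   # reported average annual growth over the past five years
years = 2030 - 2024

share_2030 = share_2024 * (1 + annual_growth) ** years
print(f"Implied 2030 share of global electricity: {share_2030:.1f}%")   # ~3.0%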
[11]
Report estimates AI energy demands will quadruple in the next few years, with some large planned centres estimated to use the equivalent power of 5,000,000 households
It's fairly hopeful about AI's impact on the environment, though. One of the biggest concerns people have about AI is energy consumption, and a recent report suggests that things are only going to get worse in that regard over the next few years. However, the same report also argues there is a silver lining. A study from the International Energy Agency (IEA) titled Energy and AI was recently published (via The Guardian) using data gleaned from global datasets and consultation with "governments and regulators, the tech sector, the energy industry and international experts". In it, the paper suggests energy demand from data centres broadly will double, and demand from AI data centres, specifically, will grow by a factor of four. The energy needed to supply these data centres is reported to grow from 460 TWh in 2024 to more than 1,000 TWh in 2030. This is then predicted to reach 1,300 TWh by 2035. Though traditional data centres are also projected to grow with time, AI-optimised servers account for the biggest share of that growth. As pointed out by IEA director of sustainability, Laura Cozzi, in a full presentation on the paper, the energy demand for individual data centres is growing over time. Hyperscale data centres (effectively the biggest ones) consume the equivalent power of 100,000 households today, with the largest under construction right now set to consume the equivalent of 2 million households. The largest currently announced (but not yet under construction) would consume the energy of 5 million households. However, the paper argues that this increased demand isn't all doom and gloom in regard to climate change. It starts the 'AI and climate change' section by pointing out that over 100 countries have pledged to reach net zero emissions between 2030 and 2070. A forty-year gap is a rather nebulous one when the demand is surging now, and 'net zero' could mean quite a lot of things on the grand scale. The fact that reining in energy demand doesn't appear to be an argument made here suggests that net zero will be achieved with offsets, as opposed to regulation on AI. However, the argument made in 'Energy and AI' is largely that AI models, and their advancement, can be used to rein in inefficiencies in other energy sectors, reducing emissions from methane, the power sector, and heavy industry. The report argues "The adoption of existing AI applications in end-use sectors could lead to 1,400 Mt of CO2 emissions reductions in 2035". The report estimates global fuel combustion emissions equated to 35,000 Mt in 2024. This argued offset assumes a level of adoption that the report itself says isn't currently happening. "It is vital to note that there is currently no momentum that could ensure the widespread adoption of these AI applications. Therefore, their aggregate impact, even in 2035, could be marginal if the necessary enabling conditions are not created." Talking to The Guardian, Claude Turmes, former secretary for sustainable development and infrastructure for Luxembourg, is critical of the report's findings. "Instead of making practical recommendations to governments on how to regulate and thus minimise the huge negative impact of AI and new mega data centres on the energy system, the IEA and its [chief] Fatih Birol are making a welcome gift to the new Trump administration and the tech companies which sponsored this new US government."
The IEA report has a built-in AI chatbot on-page and when I asked it what the paper says about climate change, it told me: "The widespread adoption of existing AI applications could lead to emissions reductions equivalent to around 5% of energy-related emissions in 2035. However, this is still far smaller than what is needed to address climate change."
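The arithmetic behind that "around 5%" answer can be reproduced from the numbers quoted above (1,400 Mt of avoided CO2 in 2035 against roughly 35,000 Mt of fuel-combustion emissions in 2024). Note the report expresses its figure against projected 2035 energy-related emissions, so this Python one-liner is a ballpark check rather than an exact match:

avoided_2035_mt = 1_400        # Mt of CO2 avoided in 2035 via existing AI applications (IEA)
combustion_2024_mt = 35_000    # Mt of CO2 from global fuel combustion in 2024 (IEA estimate)

print(f"Avoided emissions vs 2024 combustion emissions: {avoided_2035_mt / combustion_2024_mt:.0%}")  # ~4%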
[12]
AI surge to double data centre electricity demand by 2030: IEA
Paris (AFP) - Electricity consumption by data centres will more than double by 2030, driven by artificial intelligence applications that will create new challenges for energy security and CO2 emission goals, the IEA said Thursday. At the same time, AI can unlock opportunities to produce and consume electricity more efficiently, the International Energy Agency (IEA) said in its first report on the energy implications of AI. Data centres represented about 1.5 percent of global electricity consumption in 2024, but that has increased by 12 percent annually over the past five years. Generative AI requires colossal computing power to process information accumulated in gigantic databases. Together, the United States, Europe, and China currently account for about 85 percent of data center consumption. Big tech companies increasingly recognise their growing need for power. Google last year signed a deal to get electricity from small nuclear reactors to help power its part in the artificial intelligence race. Microsoft is to use energy from new reactors at Three Mile Island, the site of America's worst nuclear accident, which went through a meltdown in 1979. Amazon also signed an accord last year to use nuclear power for its data centres. At the current rate, data centres will consume about three percent of global electricity by 2030, the report said. According to the IEA, data centre electricity consumption will reach about 945 terawatt hours (TWh) by 2030. "This is slightly more than Japan's total electricity consumption today. AI is the most important driver of this growth, alongside growing demand for other digital services," said the report. One 100 megawatt data centre can use as much power as 100,000 households, the report said. But it highlighted that new data centres, already under construction, could use as much electricity as two million households. The Paris-based energy policy advisory group said that "artificial intelligence has the potential to transform the energy sector in the coming decade, driving a surge in electricity demand from data centers worldwide, while also unlocking significant opportunities to cut costs, enhance competitiveness, and reduce emissions". Hoping to keep ahead of China in the field of artificial intelligence, US President Donald Trump has launched the creation of a "National Council for Energy Dominance" tasked with boosting electricity production. Right now, coal provides about 30 percent of the energy needed to power data centres, but renewables and natural gas will increase their shares because of their lower costs and wider availability in key markets. The growth of data centers will inevitably increase carbon emissions linked to electricity consumption, from 180 million tonnes of CO2 today to 300 million tonnes by 2035, the IEA said. That remains a minimal share of the 41.6 billion tonnes of global emissions estimated in 2024.
[13]
Is AI a force for climate good or a carbon catastrophe?
Data centre electricity demand is predicted to rise to around 945 terawatt hours by 2030, more than the entire electricity consumption of Japan. Love it or hate it, artificial intelligence (AI) is becoming part of our everyday lives. From online shopping to searching the web, AI is evolving into a useful, time-saving tool for people and corporations alike. When it comes to climate change, AI is proving its usefulness there, too. At the UK's Cambridge University, researchers are using AI in everything from climate modelling to land use planning, and see it as a transformative tool for protecting nature. Researchers at Oxford University have created an AI tool that promises to make corporations' environmental conduct more transparent. Even Google has touted the benefits, developing various AI-powered tools to improve climate resilience. Despite all the potential for AI to have a positive impact on the climate crisis, there are concerns over its potentially significant contribution to greenhouse gas emissions. A new report from the International Energy Agency (IEA) shows that AI is driving a massive increase in electricity demand. Data centres, which form the backbone of AI systems, are projected to double their energy demand in the next five years. IEA projects that, by 2030, data centre electricity demand will rise to around 945 terawatt hours - that's more than the entire electricity consumption of Japan. However, the report also points out that AI has the potential to cut emissions elsewhere. It says that if it is adopted in the right ways, the carbon savings it accounts for could offset the additional emissions it generates. "AI is one of the biggest stories in the energy world today," says IEA Executive Director Fatih Birol. "But until now, policy makers and markets lacked the tools to fully understand the wide-ranging impacts." AI requires large amounts of energy to train and run. The huge processing power required to support large language models comes from thousands of servers housed in data centres, some of which consume as much energy as a small country. Data centres are located all over the world, although the US leads with 5,381 facilities, around 40 per cent of the global market. Other countries with significant data centres include the UK, Germany, India, Australia, France and the Netherlands. The power consumption of these facilities is substantial. Some AI-focused data centres use as much electricity as two million households. In 2023, they accounted for around 1.5 per cent of the total global electricity consumption, but are set to consume a lot more in the coming years. Training AI requires a great deal of processor power, and therefore a lot of electricity. Research published in the Journal of Machine Learning found that training the popular OpenAI ChatGPT model consumed 1,287 megawatt hours of electricity, producing as much CO2 as 80 short-haul flights in Europe. "What is different about generative AI is the power density it requires," says Noman Bashir, Computing and Climate Impact Fellow at MIT. "Fundamentally, it is just computing, but a generative AI training cluster might consume seven or eight times more energy than a typical computing workload." Running the software is less energy-intensive per task, but it quickly begins to add up when millions of queries are being submitted every day. The Electric Power Research Institute found that, per query, ChatGPT consumes approximately 2.9 watt-hours.
That's around ten times the amount of energy required for a standard Google search. As of early 2025, ChatGPT is processing more than a billion queries per day, and the number is growing. In early 2025, around 8 per cent of US adults were using ChatGPT as their primary search engine. That's still a fraction of the number who use Google, but given it has grown from just 1 per cent in June 2024, this underscores the rapid shift towards AI-powered tools. There's also the changing face of AI to consider. Current queries are usually limited to text-based interactions. Emerging AI video, image and audio applications have no precedent, but are likely to be even thirstier for power. "When we think about the environmental impact of generative AI, it is not just the electricity you consume when you plug the computer in," says Elsa A. Olivetti, professor in the Department of Materials Science at MIT. "There are much broader consequences that go out to a system level and persist based on actions that we take." According to the IEA, concerns that AI could accelerate climate change are 'overstated.' It says that, despite the growth, emissions caused by data centres will still be a fraction of the world's total energy-related emissions, an estimated 1.5 per cent. It further argues that widespread adoption of AI could make a host of activities more efficient, reducing emissions in other areas. This may be from the optimisation of industrial processes, scientific research or technology innovation. The IEA estimates that the broad application of existing AI-led solutions could lead to emissions reductions of up to 5 per cent by 2035. It claims this will offset the increase in emissions generated by data centre demand. A separate report from Energy Intelligence predicted a doubling of energy demand, but also framed AI as a key enabler of the clean energy transition. It cited smarter grid management, cost reduction in low-carbon technologies and enhanced integration of renewables as benefits AI could bring. It further argued that advances in processor efficiency, cooling technologies, and algorithm optimisation will ultimately curb AI's high energy demands. Although the IEA report looks favourably on the future of AI and its climate impact, it notes that this outcome is not automatic. "It is vital to note that there is currently no momentum that could ensure the widespread adoption of these AI applications," the report states. "Therefore, their aggregate impact, even in 2035, could be marginal if the necessary enabling conditions are not created." Realising AI's potential will require concentrated action on multiple fronts. In particular, it notes the positive impact AI could have in the energy industry through the optimisation of grids and distribution, one area in which AI is woefully underused at present. It also admits that investment in low-carbon electricity generation is crucial, particularly when it comes to supplying energy-hungry data centres. Some players are making strides in this. Amazon is the largest corporate buyer of renewable energy worldwide. It says that over 90 per cent of its operations, including its Amazon Web Services data centres, are already powered by renewables. Digital Realty, with over 300 data centres worldwide, has committed to renewable energy. Today, 100 per cent of its European portfolio's energy needs are matched with renewable energy purchases. But it's not easy going green in the data centre business.
The intermittency of renewable energy sources presents a challenge, as do geographical limitations, which may impact the availability of clean energy sources. With most of the world's biggest data centres in the US, this will be where the largest growth in energy demand will be seen. By the end of the decade, energy consumption from US data centres is projected to outstrip that of all other energy-intensive activities combined (production of aluminium, concrete, chemicals, etc.), according to the IEA report. Today, US data centres rely on fossil fuels, mainly natural gas. IEA doesn't see this changing, particularly with the current administration's focus on dirty fuels. Just this week, President Donald Trump signed an executive order instructing cabinet members to identify regions where coal-powered infrastructure can support AI data centres. In the state of Louisiana, plans are already in place to construct a large-scale gas power plant specifically to cater to a massive new data centre being built by Meta. IEA's report presents a scenario that will only be achievable with concerted efforts and political support. Depending on the priorities at the time, it's just as likely that AI could be used to find new oil and gas reserves as to detect methane leaks or optimise grids. The notion of AI 'offsetting' its own emissions also needs to be taken in context. Carbon dioxide stays in the atmosphere for hundreds of years, so even if AI does eventually find ways to cut more emissions than it produces, it won't cancel out the damage it will do along the way. "The widespread adoption of existing AI applications could lead to emissions reductions that are far larger than emissions from data centres - but also far smaller than what is needed to address climate change," the report concludes.
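To put the per-query figures quoted earlier in this piece on a common scale, a short Python conversion; the one-billion-queries-a-day volume and the 2.9 watt-hours per query are the figures cited above, and everything else is unit arithmetic:

WH_PER_QUERY = 2.9                 # EPRI estimate for ChatGPT, watt-hours per query
QUERIES_PER_DAY = 1_000_000_000    # "more than a billion queries per day"

daily_gwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e9   # Wh -> GWh
annual_twh = daily_gwh * 365 / 1_000               # GWh -> TWh

print(f"Daily:  {daily_gwh:.1f} GWh")    # ~2.9 GWh per day
print(f"Annual: {annual_twh:.2f} TWh")   # ~1.1 TWh per year

On those figures, query-serving alone amounts to roughly a terawatt-hour a year, still a small slice of the 945 TWh projected for all data centres by 2030.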
[14]
AI to double data centre energy demand by 2030
The electricity demand from 'AI-optimised' data centres is projected to more than quadruple by the end of the decade. Electricity demand from data centres is set to more than double from 415 TWh in 2024 to nearly 950 TWh by 2030 - and artificial intelligence is the primary driver, finds the International Energy Agency (IEA) in its 'Energy and AI' report. The comprehensive report finds that AI will be the most significant driver of this surge, with electricity demand from "AI-optimised" data centres projected to more than quadruple by the end of the decade. Data centres in the US will account for nearly half of the growth in electricity demand in the country between 2025 and 2030, the IEA reports. Current estimates suggest that the US is the second-highest electricity consumer, using just over 4,000 TWh annually. This, however, is leagues behind the top consumer, China, which sits at more than 8,300 TWh a year. The US government places significant emphasis on the growth of the AI sector. Earlier this year, President Donald Trump announced $500bn in private sector funding for OpenAI's AI infrastructure, which is said to "secure American leadership in AI". More recently, AI usage has been promoted and encouraged within government departments. "Driven by AI use, the US economy is set to consume more electricity in 2030 for processing data than for manufacturing all energy-intensive goods combined, including aluminium, steel, cement and chemicals," the IEA writes. Advanced economies - which include much of the Global North - will drive more than 20pc of the growth in electricity demand between now and 2030. This development puts some of these countries back on a track of rising energy demand after years of stagnation or decline, the report says. A 2024 Central Statistics Office report showed that data centres accounted for more than a fifth of Ireland's electricity consumption at 21pc, overtaking urban dwellings, which consumed 18pc of the total electricity used in 2023. "AI is one of the biggest stories in the energy world today - but until now, policy makers and markets lacked the tools to fully understand the wide-ranging impacts," said IEA executive director Fatih Birol. "Global electricity demand from data centres is set to more than double over the next five years, consuming as much electricity by 2030 as the whole of Japan does today. The effects will be particularly strong in some countries. For example, in the United States, data centres are on course to account for almost half of the growth in electricity demand - in Japan, more than half and in Malaysia, as much as one-fifth." Innovation is the silver lining: generative AI represents a technological milestone, and with it come both heightened security concerns and innovative opportunities, which could well offset some of those concerns. Cyberattacks on energy utilities have tripled over the last four years and have become more sophisticated; the IEA points to AI as the cause. At the same time, AI has become a critical tool for businesses trying to defend themselves against such attacks. While the increase in electricity demand for data centres led by AI will drive up emissions, this will be small in the context of the overall energy sector, the report finds. Moreover, AI could accelerate innovation in the energy industry as the technology becomes integral to scientific discovery. This could lead to advances in energy technologies such as batteries and solar.
"With the rise of AI, the energy sector is at the forefront of one of the most important technological revolutions of our time," Birol said. "AI is a tool, potentially an incredibly powerful one, but it is up to us - our societies, governments and companies - how we use it. Don't miss out on the knowledge you need to succeed. Sign up for the Daily Brief, Silicon Republic's digest of need-to-know sci-tech news.
[15]
Amount of electricity needed to power world's data centres expected to double in five years
The amount of electricity needed to power the world's data centres is expected to double in the next five years, according to the International Energy Agency (IEA). By then, the racks of servers hosting the latest AI models and cloud computing services will be using three times more electricity than the UK does each year, the agency added. The rise in demand, predicted to be highly concentrated around the world's tech and population hubs, will put pressure on utility companies, grid infrastructure and the planet. "AI is one of the biggest stories in the energy world today," says Fatih Birol, executive director of the IEA. "In the United States, data centres are on course to account for almost half of the growth in electricity demand; in Japan, more than half; and in Malaysia, as much as one-fifth." In the US, data centres, largely being built to train and operate AI, are expected to consume more electricity by 2030 than the manufacturing of all the nation's energy-intensive goods including aluminium, steel, cement and chemicals, a report from the IEA found. But the agency also predicts that AI will be an essential tool in informing how to manage future energy demand, engineer more efficient data centres and accelerate the development of new, cleaner sources of electricity generation. Two main shifts have driven the AI revolution and its incredible demand for power. The cost of "compute" - the processors and associated servers used to build data centres - has fallen by 99% since 2006, while the amount of compute used to train and run state-of-the-art AI models has increased by a mind-boggling 350,000-fold in just a decade (a short arithmetic sketch of the growth rates these figures imply follows this article). Energy demand could outstrip supply: depending on the energy sources used, AI development could drive up carbon emissions and the water consumption needed for cooling servers. American tech firms are already struggling to find enough power for their growing data centre needs, as well as the computing hardware needed to run them. A survey by Reuters of 13 major US power providers found nearly half have received requests from data companies for power that would exceed their current peak demand. It's one of the key uncertainties in the IEA report. Unlikely to risk blackouts to meet AI energy demand, countries aggressively pursuing AI development will need to build far more electricity generation. It's not clear how quickly that might happen, nor how quickly the energy efficiency of data centres and the AI models they run will improve. One of the greatest uncertainties is Donald Trump's tariffs, introduced after the report was completed. Yet the US president's attack on the global trade status quo could directly and significantly impact data centre and AI development in the US and beyond. High tariffs on China are predicted to choke off supplies of raw materials needed to build new energy infrastructure - in particular, low-carbon technologies like solar panels, wind turbine motors and batteries to store renewable electricity. Demand for low-carbon generation was surging in the US before Mr Trump's election - a large chunk of that coming from tech companies wanting to power data centres. The US president has promised to boost US coal production to power AI, but it's far from certain if power companies will choose to build new plants given their high cost relative to some low-carbon alternatives.
The time such plants take to build could also mean electricity supply lags well behind the IEA's forecast for data centre electricity demand. China, on the other hand, already gaining fast on the US in AI development, may find low-carbon electricity becomes cheaper and quicker to build if its clean energy exports to the US dry up because of tariffs.
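To make the pace of those two compute shifts concrete, here is a minimal arithmetic sketch in Python. It is illustrative only and assumes the 99% cost decline "since 2006" is spread evenly over 2006-2024, a window the article does not state explicitly.

```python
# Implied annual rates behind the two compute statistics quoted in the
# article above. Assumption (not stated in the article): the 99% cost
# decline "since 2006" is spread evenly over 2006-2024.

cost_ratio = 0.01            # cost is 1% of its 2006 level (a 99% fall)
cost_years = 2024 - 2006     # assumed window for the decline

compute_multiple = 350_000   # growth in training compute
compute_years = 10           # "in just a decade"

annual_cost_decline = 1 - cost_ratio ** (1 / cost_years)
annual_compute_growth = compute_multiple ** (1 / compute_years)

print(f"Compute cost: falling roughly {annual_cost_decline:.1%} per year")
print(f"Training compute: growing roughly {annual_compute_growth:.2f}x per year")
# ~22.6% per year cheaper, versus ~3.6x more compute per year -- demand for
# compute has far outrun the cost declines, which is why power use is rising.
```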
[16]
IEA warns: AI could double global data center energy use by 2030
Artificial intelligence might be the next big thing in clean energy or the next big problem. The International Energy Agency (IEA) has released its first major report on how the AI boom is reshaping global electricity demand, and the numbers are staggering. The headline? By 2030, data centers, supercharged by AI workloads, are projected to consume as much electricity as Japan does today. That's roughly 945 terawatt-hours annually, with AI responsible for more than half of that growth. The IEA's report confirms what many feared: data centers, once a relatively quiet corner of energy consumption, are becoming global power hogs. Some of today's facilities already use as much electricity as 100,000 households. The hyperscalers of the next decade? Expect 20 times that. In the US alone, data centers could account for almost half of all new electricity demand through 2030 -- eclipsing even the power needs of traditional heavy industries like steel and cement combined. But the report isn't all doom. The IEA argues that AI might also be key to solving energy efficiency challenges. From optimizing power grids and weather forecasting for renewables to detecting infrastructure leaks or designing energy-saving materials, AI could become a critical climate tool -- if governments set the right conditions. "AI is a tool, potentially an incredibly powerful one, but it is up to us - our societies, governments, and companies - how we use it," said IEA Executive Director Fatih Birol. Still, some experts aren't buying the optimism wholesale. Claude Turmes, Luxembourg's former energy minister, slammed the IEA's framing as "a welcome gift" to US tech giants, accusing it of downplaying the scale of the problem and avoiding hard policy recommendations. And researchers like Alex de Vries from VU Amsterdam believe the IEA is underestimating AI's energy hunger. He told Nature that the rise of AI could soon pose a "serious risk" to climate goals, suggesting AI's share of global electricity use will hit several percentage points -- a significant burden on energy systems already struggling with decarbonization. In the meantime, some cooling and efficiency innovations are gaining attention. Startups like Asperitas (Netherlands), Submer (Spain), and Iceotope (UK) are experimenting with immersion cooling to reduce heat waste. Others, like the UK's Deep Green, are exploring ways to reuse data center heat for district heating or industrial processes. But these solutions, while promising, are nowhere near the scale needed to offset the projected surge in energy use from AI-driven workloads. The IEA's report makes one thing clear: the world is heading toward an energy future where AI both worsens and potentially solves its own emissions problem. By 2030, data centers are expected to source about 50% of their power from renewables -- with the rest coming from a mix of coal, nuclear, and natural gas. But even that transition depends heavily on investments in grids, new power plants, and smarter regulations. At the same time, AI itself is becoming integral to the energy sector's operations. Power companies are using AI not just to balance demand or integrate renewables, but also to defend against increasingly AI-powered cyberattacks -- which the IEA says have tripled in the past four years. The future impact of AI on emissions, the IEA concludes, won't be determined by technology alone.
It'll depend on whether governments, industry, and regulators can guide its rollout intelligently -- incentivizing efficiency over waste, innovation over unchecked growth. Otherwise, the world might end up with AI powerful enough to design new clean energy systems -- while simultaneously burning through more electricity than entire nations just to answer your next prompt.
[17]
AI surge to double data centre electricity demand by 2030: IEA
Electricity consumption by data centres will more than double by 2030, driven by artificial intelligence applications that will create new challenges for energy security and CO2 emission goals, the IEA said Thursday. At the same time, AI can unlock opportunities to produce and consume electricity more efficiently, the International Energy Agency (IEA) said in its first report on the energy implications of AI. Data centres represented about 1.5 percent of global electricity consumption in 2024, but that has increased by 12 percent annually over the past five years. Generative AI requires colossal computing power to process information accumulated in gigantic databases. Together, the United States, Europe, and China currently account for about 85 percent of data centre consumption. Big tech companies increasingly recognise their growing need for power. Google last year signed a deal to get electricity from small nuclear reactors to help power its part in the artificial intelligence race. Microsoft is to use energy from new reactors at Three Mile Island, the site of America's worst nuclear accident, which suffered a partial meltdown in 1979. Amazon also signed an accord last year to use nuclear power for its data centres. At the current rate, data centres will consume about three percent of global electricity by 2030, the report said. According to the IEA, data centre electricity consumption will reach about 945 terawatt hours (TWh) by 2030. "This is slightly more than Japan's total electricity consumption today. AI is the most important driver of this growth, alongside growing demand for other digital services," said the report. One 100-megawatt data centre can use as much power as 100,000 households, the report said. But it highlighted that new data centres, already under construction, could use as much power as two million households. The Paris-based energy policy advisory group said that "artificial intelligence has the potential to transform the energy sector in the coming decade, driving a surge in electricity demand from data centers worldwide, while also unlocking significant opportunities to cut costs, enhance competitiveness, and reduce emissions". Hoping to keep ahead of China in the field of artificial intelligence, US President Donald Trump has launched the creation of a "National Energy Dominance Council" tasked with boosting electricity production. Right now, coal provides about 30 percent of the energy needed to power data centres, but renewables and natural gas will increase their shares because of their lower costs and wider availability in key markets. The growth of data centres will inevitably increase carbon emissions linked to electricity consumption, from 180 million tonnes of CO2 today to 300 million tonnes by 2035, the IEA said. That remains a minimal share of the 41.6 billion tonnes of global emissions estimated in 2024.
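As a rough check on that "minimal share" framing, the sketch below simply divides the data-centre emissions figures quoted above by the 2024 global total from the same paragraph. Measuring the 2035 projection against a 2024 baseline is a simplification, since global emissions will not stand still.

```python
# Rough share of global energy-related CO2 emissions from data centres,
# using only the figures quoted in the article above.

GLOBAL_EMISSIONS_MT = 41_600          # ~41.6 billion tonnes CO2 (2024), in Mt
dc_emissions_mt = {"today": 180, "2035 (projected)": 300}

for label, mt in dc_emissions_mt.items():
    share = mt / GLOBAL_EMISSIONS_MT
    print(f"{label}: {mt} Mt CO2 -> about {share:.1%} of the 2024 global total")
# ~0.4% today, ~0.7% in 2035. Note the 2035 figure is compared against the
# 2024 global total, since the article gives no projection for 2035 emissions.
```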
[18]
AI surge to double data center electricity demand by 2030, report says
Electricity consumption by data centers will more than double by 2030, driven by artificial intelligence applications that will create new challenges for energy security and CO2 emission goals, the International Energy Agency (IEA) said Thursday. At the same time, AI can unlock opportunities to produce and consume electricity more efficiently, the IEA said in its first report on the technology's energy implications. Data centers represented about 1.5% of global electricity consumption in 2024, but that has increased by 12% annually over the past five years. Generative AI requires colossal computing power to process information accumulated in gigantic databases. Together, the United States, Europe, and China currently account for about 85% of data center consumption. Big tech companies increasingly recognise their growing need for power. Google last year signed a deal to get electricity from small nuclear reactors to help power its part in the artificial intelligence race. Microsoft is to use energy from new reactors at Three Mile Island, the site of America's worst nuclear accident, which suffered a partial meltdown in 1979. Amazon also signed an accord last year to use nuclear power for its data centers. At the current rate, data centers will consume about 3% of global electricity by 2030, the report said. According to the IEA, data center electricity consumption will reach about 945 terawatt hours (TWh) by 2030. "This is slightly more than Japan's total electricity consumption today. AI is the most important driver of this growth, alongside growing demand for other digital services," said the report. One 100-megawatt data center can use as much power as 100,000 households, the report said. But it highlighted that new data centers, already under construction, could use as much power as 2 million households. The Paris-based energy policy advisory group said that "artificial intelligence has the potential to transform the energy sector in the coming decade, driving a surge in electricity demand from data centers worldwide, while also unlocking significant opportunities to cut costs, enhance competitiveness, and reduce emissions." Hoping to keep ahead of China in the field of artificial intelligence, U.S. President Donald Trump has launched the creation of a "National Energy Dominance Council" tasked with boosting electricity production. Right now, coal provides about 30% of the energy needed to power data centers, but renewables and natural gas will increase their shares because of their lower costs and wider availability in key markets. The growth of data centers will inevitably increase carbon emissions linked to electricity consumption, from 180 million tonnes of CO2 today to 300 million tonnes by 2035, the IEA said. That remains a minimal share of the 41.6 billion tonnes of global emissions estimated in 2024.
[19]
Google Uses AI to Fix Grid Backlog as AI Power Demand Set To Quadruple by 2030
As tech giants race to build more data centers to train and run AI models, energy consumption is expected to surge significantly. Alphabet's Google is partnering with PJM Interconnection, a leading electrical grid operator, to use artificial intelligence (AI) to accelerate the connection of new electricity supplies to the grid. The move comes amid soaring electricity demand driven by the rapid expansion of AI technologies. According to the International Energy Agency (IEA), electricity demand from AI-optimized data centers is projected to more than quadruple by 2030, with US data centers set to use more electricity than the manufacturing of all other energy-intensive goods combined. Google To Utilize AI for Power: On Thursday, April 10, Google and PJM announced a partnership to reduce the waiting times for connecting the country's grid to new electricity supplies. The partnership initiative marks the first use of AI to manage an interconnection queue. Over the past few years, waiting times for connecting grids have grown to record lengths throughout the U.S. "The industry has been talking about building smarter grids for well over a decade, and now with AI, we have a real opportunity to turn discussion into action," Amanda Peterson Corio, Google's data center energy lead, said at a press conference. The project is being developed in partnership with Tapestry, an Alphabet-backed startup focused on modernizing the electrical grid to make it more "reliable, affordable, and sustainable." "This is really about automating a lot of the things that are being laboriously reviewed," said Page Crahan, General Manager of Tapestry. Eventually, the companies will be able to develop an interactive, detailed model of the PJM grid. "Because the grid wasn't built to sustain today's demands, there aren't adequate tools to see, manage, or plan it," Tapestry said on its website. "Information is siloed between dozens of different organizations, and no one has a complete picture of the grid." Crahan added that the eventual map will bring in "different layers that planners might need to see in a single toggle on-and-off view to guide faster decisions, introduce new insights, hopefully find efficiencies in those ways." PJM is North America's largest electrical grid operator, covering 67 million people. AI Power Demand Set to Quadruple by 2030: According to a report from the IEA, data centers driven by AI will require as much power as Japan uses today by 2030, with only about half of that demand likely to be met by renewable energy. Despite advances in energy-efficient hardware and cooling systems, the sheer scale of AI deployment across industries is outpacing efficiency gains. However, the report claims that fears of AI derailing climate goals have been "overstated." This is because AI can be used to optimize other activities and boost the efficiency of energy systems, lowering the world's carbon footprint. "With the rise of AI, the energy sector is at the forefront of one of the most important technological revolutions of our time," Fatih Birol, the executive director of the IEA, said. "AI is a tool, potentially an incredibly powerful one, but it is up to us - our societies, governments, and companies - how we use it." In October, ex-Google CEO Eric Schmidt claimed industries should scrap their climate goals because AI will outpace any measures put in place and will likely be able to solve the problem once it matures.
Speaking at a Washington AI summit, Schmidt said that companies will not meet their climate goals "because we're not organized to do it." "Yes, the needs in this area will be a problem, but I'd rather bet on AI solving the problem than constraining it and having the problem," Schmidt said.
The International Energy Agency reports that data center electricity consumption is projected to more than double by 2030, largely due to AI, raising concerns about energy infrastructure and climate goals.
The International Energy Agency (IEA) has released a report projecting that global data center electricity consumption will more than double by 2030, primarily driven by the rapid adoption of Artificial Intelligence (AI) technologies [1]. This surge in energy demand poses significant challenges for infrastructure planning and climate goals.
According to the IEA's models, data centers are expected to consume 945 terawatt-hours (TWh) of electricity by 2030, equivalent to Japan's current annual electricity usage [2]. This marks a substantial increase from the 415 TWh consumed in 2024, which represented approximately 1.5% of global electricity consumption.
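For readers who want the growth rate implied by those two headline figures, here is a minimal arithmetic sketch in Python, illustrative only and using just the 415 TWh and 945 TWh values above.

```python
# Compound annual growth rate implied by the IEA's headline projection:
# 415 TWh in 2024 rising to 945 TWh in 2030.

twh_2024, twh_2030 = 415, 945
years = 2030 - 2024

multiple = twh_2030 / twh_2024
cagr = multiple ** (1 / years) - 1

print(f"Overall increase: {multiple:.2f}x over {years} years")
print(f"Implied growth rate: roughly {cagr:.1%} per year")
# ~2.3x overall, or about 15% a year -- a step up from the ~12% annual growth
# reported for data centres over the past five years.
```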
The United States, Europe, and China collectively account for 85% of current data center energy consumption. By 2030, advanced economies are projected to contribute more than 20% to the growth in consumption, while developing economies will account for around 5% [1].
The IEA estimates that AI-specific servers accounted for 24% of server electricity demand and 15% of total data center energy demand in 2024 [2]. However, some experts, like Alex de Vries from VU Amsterdam, suggest this might be an underestimate.
In the United States, data centers are expected to drive nearly half of the growth in electricity demand over the next five years [4]. By 2030, American data centers could consume more electricity than the country's entire energy-intensive manufacturing sector.
The projected energy demand raises concerns about infrastructure readiness and climate goals. Countries are building power plants and upgrading electricity grids to meet the forecasted demand, but the IEA estimates that 20% of planned centers could face delays in grid connection [1].
While two-thirds of planned electricity capacity is set to come from renewable sources, new gas-fired plants in the US will also contribute to an expansion in natural gas-fired capacity [1]. This reliance on fossil fuels could pose challenges to achieving climate goals.
Despite the challenges, there are potential solutions on the horizon:
AI-driven optimization: The IEA suggests that AI could help solve energy challenges by improving solar panels, batteries, and other clean energy technologies [3].
Flexible load strategies: Researchers at Duke University propose that AI data centers can be more easily turned on and off to adjust to electrical system needs [3].
Efficiency gains: Companies like Ant Group and DeepSeek are developing more efficient algorithms and processors that could potentially reduce energy consumption [5].
While the energy demand from data centers is growing rapidly, the IEA notes that their share of total energy sector emissions through 2035 remains below 1% even in worst-case scenarios [4]. However, the agency emphasizes the need for policy and regulatory shifts to unlock the potential benefits of AI in optimizing energy systems and addressing climate change challenges.