4 Sources
[1]
How much energy does AI really use? The answer is surprising - and a little complicated
AI feels inescapable. It's everywhere: your smartphone, Google, work tools. AI features promise to make life easier and more productive -- but what exactly is the environmental impact of a quick chatbot query? As AI adoption continues to grow, so do the technology's energy costs. AI is built on high-compute systems that require vast amounts of data, which must be stored and processed on large networks of computers known as data centers. Just like your personal computer, those gigantic centers need electricity -- as does the process of training an AI model, which relies on more compute than traditional computing tasks.

But in the context of the energy we already use every day, from office lights and laptops to social media, how does that consumption actually compare? Can the technology's resource needs change or be improved over time? Is the time it supposedly saves worth the extra emissions? And what should you know about your personal AI footprint? We spoke with experts and researchers to explain how AI really uses energy and to answer your sustainability questions, complete with tips on what you can do.

AI needs more resources to function than other kinds of technology. The amount of data AI systems ingest and the computing power required to run them set them apart from simpler computer tasks. An AI system is effectively a synthetic brain that needs to be fed billions of pieces of data in order to find the patterns between them. This is why larger-parameter models tend to be better at certain tasks -- an image model trained on four billion images of cats, for example, should produce a more realistic image of a cat than one trained on just 100 million.

But all that knowledge needs to live somewhere. What you've heard described as "the cloud" is not an airy name for storage but a physical data center: a large campus that houses expansive networks of computers that process and store huge amounts of data and run complex queries.

While these large computing farms have always existed, primarily for enterprise cloud services, they're in more demand than ever as the AI race intensifies -- and as the tools themselves get cheaper and more accessible. "You have big companies that have been managing those as real estate assets," said John Medina, an SVP at Moody's. "Everyone only needed a little bit; they didn't need a ton of capacity." Now, he said, the pressure is on to serve a rapidly growing customer base.

That demand is driving up energy use, and the more parameters a model has, the more compute it uses, said Vijay Gadepally, a senior staff member at MIT's Lincoln Laboratory and CTO at Radium, an AI infrastructure company. "You need more computing just to even store the model and be able to process it."

With investment in AI only gaining speed, data center growth shows no signs of stopping. Shortly after taking office in January, President Donald Trump announced Project Stargate, a $500-billion initiative supported by companies including OpenAI, SoftBank, and Oracle to build "colossal," 500,000-square-foot data centers. Much of this infrastructure is being built by hyperscalers: a small but dominant group of corporations, including Microsoft, Google, Meta, and AWS, that account for the lion's share of construction.
However, Medina noted that the hype cycle may be inflating how much data center growth is AI-specific. "When we talk about hyperscalers, large data centers, AI data centers, we get confused. Most of it is for the cloud," he said, referring to services like storage and data processing. Despite all the chatter, he noted, data centers are processing only a relatively small number of AI-related tasks. That said, the AI boom is shifting baseline standards in ways that make relative comparisons harder to pin down. "In the past, you didn't have a huge need like this. Four megawatts were considered hyperscale," Medina said. "Now, 50, 100 megawatts is that minimum."

As Sasha Luccioni, PhD, AI and climate lead at developer platform Hugging Face, admitted in a recent op-ed, we still don't really know how much energy AI consumes, because so few companies publicize data about their usage. However, several studies indicate energy consumption is on the rise, nudged along by growing demand for AI. A 2024 Berkeley Lab analysis found that electricity consumption has grown exponentially in tandem with AI in recent years. GPU-accelerated servers -- hardware specifically used for AI -- multiplied in 2017; a year later, data centers made up nearly 2% of total annual US electricity consumption, and that consumption was growing by 7% annually. By 2023, that growth rate had jumped to 18%, and it is projected to hit as much as 27% by 2028. Even if we can't isolate how much data center energy is being spent on AI, the link between rising consumption and AI expansion is clear.

Boston Consulting Group estimates that data centers will account for 7.5% of all US electricity consumption by 2030, or the equivalent of 40 million US homes. Mark James, interim director of the Institute for Energy and the Environment at Vermont Law and Graduate School, offered another comparison. A large facility running at full capacity draws 1,000 megawatts -- "the same size as the peak demand of the state of Vermont -- 600,000+ people -- for months," he noted.

Currently, global data centers use about 1.5% of the world's electricity, roughly the same as the entire airline industry -- and they are likely to surpass it. An April 2025 IEA report found that, globally, data center electricity use has grown 12% every year since 2017, "more than four times faster than the rate of total electricity consumption." Data centers, directly or indirectly propelled by AI, are starting to take up more space in the world's energy landscape, even as other energy usage appears to stay mostly flat. For some, that's reason to worry. "This is going to be a carbon problem very quickly if we're scaling up power generation," Gadepally warned.

Others aim to put these numbers in context. While there's evidence AI is driving up energy costs, research also shows global energy consumption overall is on the rise. Newer data centers and GPUs are also more energy efficient than their predecessors, meaning they may create relatively less carbon. "These 100, 200-megawatt massive builds are using the most efficient technology -- they're not the old power guzzlers that the older ones are," Medina said. Even as data centers multiply, their predicted consumption curve may start to level out thanks to modern technology.
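Those growth rates compound quickly. A rough back-of-envelope sketch, assuming (our simplification, not the report's) that the roughly 2% share from 2018 grows at the reported rates while total US consumption stays flat:

```python
# Back-of-envelope compounding of the Berkeley Lab growth rates cited above.
# Assumptions (ours, for illustration only): a ~2% share of US electricity
# in 2018, 7% annual growth through 2023, then the projected 27% annual
# growth through 2028, with total US consumption held flat.
share = 0.02
for _ in range(2018, 2024):   # six years at ~7% annual growth
    share *= 1.07
for _ in range(2024, 2029):   # five years at the projected 27% ceiling
    share *= 1.27
print(f"Illustrative 2028 share of US electricity: {share:.1%}")  # ~9.9%
```

Even this crude compounding lands near 10% of US electricity, in the same range as the Boston Consulting Group estimate cited above.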
Within AI energy use, not all types of AI share the same footprint. We don't have access to energy consumption data for proprietary models from companies like OpenAI and Anthropic (as opposed to open-source models). However, across all models, generative AI -- especially image generation -- appears to use more compute (and therefore create more emissions) than standard AI systems. An October 2024 Hugging Face study of 88 models found that generating and summarizing text uses more than 10 times the energy of simpler tasks like classifying images and text. It also found that multimodal tasks, in which models use image, audio, and video inputs, are "on the highest end of the spectrum" for energy use.

When it comes to specific comparisons, research is all over the map on the resources AI uses. One study determined that asking ChatGPT to write a 100-word email uses an entire bottle of water -- a claim that quickly circulated on social media. But is it true? "It's possible," said Gadepally. He pointed out that GPUs generate a lot of heat; even when cooled by other methods, they still require water cooling as well. "You're using something like 16 to 24 GPUs for that model that may be running for 5 to 10 minutes, and the amount of heat that's generated, you can start to kind of do the math," he said. These systems don't just use any kind of water, either -- they need clean, high-quality, potable water running through them. "These pipes, they don't want to clog them up with anything," Gadepally explained. "Many data centers are in areas with stressed watersheds, so that's something to keep in mind."

New methods like immersion cooling, in which processors are submerged in a nonconductive liquid such as mineral oil, show some promise for reducing water use and energy consumption compared to other cooling methods like fans. But the tech is still developing and would need to be widely adopted to make an impact.

With proprietary data still murky, there are several other comparisons out there for how much energy chatbot queries use. Jesse Dodge, a researcher at the nonprofit institute Ai2, has compared one ChatGPT query to the electricity used to power one light bulb for 20 minutes. The Hugging Face study noted that "charging the average smartphone requires 0.022 kWh of energy, which means that the most efficient text generation model uses as much energy as 9% of a full smartphone charge for 1,000 inferences, whereas the least efficient image generation model uses as much energy as 522 smartphone charges (11.49 kWh), or around half a charge per image generation."

According to Gadepally, an AI model processing a million tokens -- roughly a dollar in compute costs -- emits about as much carbon as a gas-powered car does while driving five to 20 miles. But energy use also varies widely depending on the complexity of the prompt. "Saying 'I want a short story about a dog' will likely use less compute than 'I would like a story about a dog that's sitting on a unicorn written in Shakespearean verse,'" he said. If you're curious about how your individual chatbot queries use energy, Hugging Face designed a tool that estimates the energy consumption of queries to different open-source models. Green Coding, an organization that works with companies to track the environmental impact of their tech, designed a similar tool.
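As a sanity check, the study's smartphone comparison can be reproduced from the numbers quoted above; the per-query conversions at the end are our own arithmetic.

```python
# Unit check on the Hugging Face figures quoted above. Only the 0.022 kWh
# charge, the 9% ratio, and the 11.49 kWh total come from the study.
phone_charge_kwh = 0.022                  # one full average-smartphone charge

image_kwh_per_1000 = 11.49                # least efficient image model
print(image_kwh_per_1000 / phone_charge_kwh)         # ~522 phone charges
print(image_kwh_per_1000 / 1000 / phone_charge_kwh)  # ~0.52 charge per image

text_kwh_per_1000 = 0.09 * phone_charge_kwh  # most efficient text model
print(text_kwh_per_1000 / 1000 * 1e6)        # ~2 mWh, i.e. ~0.002 Wh/query
```

The 522-charge and half-charge-per-image figures both fall out of the same 0.022 kWh baseline, which is why the text and image numbers look so far apart: the gap is roughly four orders of magnitude per query.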
While it's true that overall energy consumption appears to be increasing partly due to AI investment, researchers urge users to see energy consumption as relative. The claim that one ChatGPT query uses 10 times as much energy as a Google search has become standard, but it is based on a now-outdated 2009 Google estimate that one search consumes 0.3 watt-hours (Wh) of energy. It's hard to say whether that number has gone up or down since, given changes to the complexity of Google searches and improvements in chip efficiency. Either way, as data scientist and climate researcher Hannah Ritchie pointed out, that 0.3 Wh needs to be put in perspective -- it's relatively small. She noted that in the US, average daily electricity usage is about 34,000 Wh per person. Using the outdated Google metric, a ChatGPT prompt is just 3 Wh; even with multiple queries a day, that's still not a huge percentage.

Plus, tech that doesn't explicitly use AI already uses plenty of data center bandwidth. "What are the hottest digital applications today? TikTok, Instagram Reels, YouTube searches, streaming, gaming -- all of these things are hosted from the cloud," said Raj Joshi, another analyst and SVP at Moody's. He and Medina added that as AI features integrate with everything from gaming to enterprise tech, it's becoming increasingly hard to attribute specific energy demands to AI or non-AI applications.

Within AI, however, model needs are evolving. "It's quite significant," Gadepally said of the energy increase compared to earlier in the technology's history. He noted that inference -- when a model makes predictions after it's been trained -- now accounts for much more of a model's lifetime cost. "That wasn't the case with some of the original models, where you might spend a lot of your effort training this model, but the inference is actually pretty easy -- there wasn't much compute that needed to happen."

Because AI has become inextricably tied up in existing technology, experts say it's difficult to determine its specific impact. Whether to use it or not may come down to individual judgment more than hard numbers. "From a sustainability perspective, you have to balance the output of the AI with the use of the AI," Medina said. "If that output is going to save you time that you would have your lights on, your computer on, and you're writing something that takes you an hour, but [AI] can do it in five minutes, what's the trade-off there? Did you use more energy taking 30 minutes to write something that they can write you in one minute?"

To Medina's point, AI can also be used to advance research and technology that tracks climate change in faster, more efficient ways. Ai2 has launched several AI tools that help collect planetary data, improve climate modeling, preserve endangered species, and restore oceans. Referencing data from the Sustainable Production Alliance, AI video company Synthesia argues that AI-generated video produces less carbon than traditional video production, which relies on travel, lighting, and other resource-intensive infrastructure. Regardless, parts of the industry are responding to concerns: in February, Hugging Face released the AI Energy Score Project, which features standardized energy ratings and a public leaderboard showing where each model stands in estimated consumption.
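Ritchie's point is easy to check with the figures above. A quick sketch, using the outdated 10x-Google estimate the article itself flags as shaky:

```python
# Putting the per-query figures above in personal context. This relies on
# the outdated 10x-Google estimate, so treat the result as an
# order-of-magnitude illustration only.
google_search_wh = 0.3                    # 2009 Google estimate, per search
chatgpt_query_wh = 10 * google_search_wh  # the common "10x" claim: 3 Wh
daily_per_person_wh = 34_000              # average daily US use per person

queries_per_day = 10
share = queries_per_day * chatgpt_query_wh / daily_per_person_wh
print(f"{share:.2%} of average daily per-person electricity use")  # ~0.09%
```

Even ten prompts a day comes out to well under a tenth of a percent of an average American's daily electricity use under these assumptions.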
Across the industry, organizations are exploring ways to improve AI sustainability over time. At MIT's Lincoln Lab, Gadepally's team is experimenting with "power-capping," or strategically limiting the power each processor uses to below 100% of its capacity, which reduces both consumption and GPU temperature. Chinese AI startup DeepSeek achieved a similar outcome by being more efficient in how it runs and trains its models, though they are still quite large. That approach can only go so far, though. "No one's figured out how to make a smaller model suddenly do better on high-quality image generation at scale," Gadepally said.

Because he doesn't see demand for AI waning -- especially with on-device phone features multiplying -- Gadepally said efficiency and optimization are the solutions for now. "Can I improve my accuracy by one and a half percent instead of one percent for that same kilowatt hour of energy that I'm pumping into my system?" He added that simply switching data centers to renewable energy isn't easy, because those sources can't be turned on and off as readily as natural gas, a requirement for large-scale computing. But by slowing the growth curve of AI's consumption with tactics like power capping, it becomes easier to eventually replace those energy sources with renewable ones -- like replacing your home lightbulbs with LEDs.

To move toward sustainability, he suggested companies consider being flexible about where they do their compute, since some areas may be more energy efficient than others, or training models during colder seasons, when demands on a local energy grid are lower. An added benefit of power capping is that it lowers processor temperatures without significantly impacting model performance, which can make outputs more reliable, and it reduces the need for cooling with potable water. Benefits like these, along with the resulting cost savings, give companies an incentive to make sustainability-forward changes. Gadepally believes companies have the right intentions toward sustainability; he thinks it's a question of whether they can implement changes fast enough to slow environmental damage.

If you're worried about how your AI use affects your carbon footprint, it's not so simple to untangle. Avoiding AI tools might not reduce your footprint the way other lifestyle choices can. Andy Masley, director of advocacy group Effective Altruism DC, compared the impact of asking ChatGPT 50,000 fewer questions (10 questions every day for 14 years) to other climate-forward actions from philanthropic network Founders Pledge. The results are pretty minuscule. "If individual emissions are what you're worried about, ChatGPT is hopeless as a way of lowering them," Masley wrote. "It's like seeing people who are spending too much money, and saying they should buy one fewer gumball per month." "It saves less than even the 'small stuff' that we can do, like recycling, reusing plastic bags, and replacing our lightbulbs," Ritchie added in a Substack post referencing Masley. "If we're fretting over a few queries a day while having a beef burger for dinner, heating our homes with a gas boiler, and driving a petrol car, we will get nowhere."
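For a concrete sense of what power-capping looks like in practice, here is a minimal sketch using NVIDIA's nvidia-smi utility. This illustrates the general technique, not the Lincoln Lab team's actual tooling; the GPU index and the 250 W cap are arbitrary example values, and changing the limit normally requires administrator privileges.

```python
# Minimal GPU power-capping sketch via nvidia-smi (our illustration of the
# general technique described above; values are arbitrary examples).
import subprocess

def cap_gpu_power(gpu_index: int, watts: float) -> None:
    # Ask the driver for the board's supported power-limit range first.
    query = subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index),
         "--query-gpu=power.min_limit,power.max_limit",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True)
    lo, hi = (float(v) for v in query.stdout.strip().split(","))
    if not lo <= watts <= hi:
        raise ValueError(f"cap must be between {lo} W and {hi} W")
    # Apply the cap; the driver then throttles clocks to stay under it,
    # trading a little throughput for lower draw and cooler chips.
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)], check=True)

if __name__ == "__main__":
    cap_gpu_power(0, 250.0)  # e.g., hold GPU 0 below a 250 W budget
```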
In the big picture, Masley and Ritchie are concerned that focusing on AI energy consumption could distract well-intentioned users from larger, more pressing climate stressors. Gadepally agreed that abstaining from AI only gets you so far. "In this day and age, it's almost like saying, 'I'm not going to use a computer,'" he said. Still, he has a few suggestions for improving the future of AI energy use and creating more transparency around the subject. Here are a few approaches you can try.

Push for transparency. With the right data, firms like Gadepally's can at least generate estimates of how much energy AI is using. Individuals can organize to ask AI companies to make this information public. The AI playing field is only getting more competitive; he said that, theoretically, as with any other social value, if enough users indicate they care about the sustainability of their tools, it could become a market mover.

Ask vendors for the numbers. Sustainability is often already a consideration in corporation-level decisions, especially when businesses are weighing vendors and services. Gadepally believes in the power of applying that culture to AI. If your business is licensing AI tools, he suggests asking for energy usage and sustainability data during negotiations. "If large companies demand this on multi-million dollar contracts that are working with account executives, that can get very far," he pointed out, as they already do for other line items like work travel. "Why wouldn't you ask about this, where it really does add up pretty quickly?"

Match the model to the task. Be intentional about the quality of the model you choose for a query relative to your needs. "Almost every provider has multiple versions of the model -- we tend to use probably the highest quality one that we have access to," which can be wasteful, Gadepally noted. "If you're able to get away with something smaller, do that." As part of this, Gadepally encourages users to accept imperfect results more often. Back-and-forth prompt refinement, for example, can be done with a lower-quality model; once you perfect your prompt, you can try it with a more expensive, higher-parameter model to get the best answer (see the sketch below for what that workflow can look like).

In addition to these goals, Michelle Thorne, director of strategy at The Green Web Foundation -- a nonprofit "working towards a fossil-free internet" -- urged tech companies to phase out fossil fuels across their supply chains and take steps to reduce harms when mining raw materials. The industry at large is responding to sustainability questions with initiatives like the Frugal AI Challenge, a hackathon at the 2025 AI Action Summit held in Paris this past February. Google said in its sustainability goals that it intends to replenish 120% of the freshwater it consumes across its offices and data centers by 2030. Some argue that the bigger-is-better approach in AI may not actually yield more value or better performance, citing diminishing returns.

Ultimately, however, regulation will likely prove more effective in standardizing expectations and requirements for tech companies to manage their environmental impact, within and beyond their use of AI. Long-term, AI expansion (and the costs that come with it) shows no signs of stopping.
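A hypothetical sketch of that model-tiering workflow: iterate on a prompt with a small model, then send only the final version to a larger one. The call_model function and model names below are stand-ins for whatever chat API you use, not references to real products.

```python
# Hypothetical draft-cheap / finalize-big workflow (our illustration of the
# advice above; call_model stubs out a real chat-API request).
def call_model(model: str, prompt: str) -> str:
    return f"[{model}] response to: {prompt!r}"  # stand-in for an API call

def draft_then_finalize(prompt_versions: list[str]) -> str:
    for prompt in prompt_versions[:-1]:
        print(call_model("small-efficient-model", prompt))  # cheap iterations
    # One expensive call, once the prompt is already refined.
    return call_model("large-flagship-model", prompt_versions[-1])

print(draft_then_finalize([
    "a short story about a dog",
    "a short story about a dog on a unicorn, in Shakespearean verse",
]))
```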
"We have sort of an insatiable appetite for building more and more technology, and the only thing that keeps you limited has been cost," Gadepally said -- a nod to Jevons Paradox, or the idea that efficiency only begets more consumption, rather than satisfaction. For now, AI's energy future is unclear, but the tech industry at large is an increasingly significant player in a climate landscape marked by skyrocketing demand and very little time.
[2]
AI could soon consume more electricity than Bitcoin mining and...
A hot potato: The global AI industry is quietly crossing an energy threshold that could reshape power grids and climate commitments. New findings reveal that the electricity required to run advanced AI systems may surpass Bitcoin mining's notorious energy appetite by late 2025, with implications that extend far beyond tech boardrooms.

The rapid expansion of generative AI has triggered a boom in data center construction and hardware production. As AI applications become more complex and more widely adopted, the specialized hardware that powers them - accelerators from the likes of Nvidia and AMD - has proliferated at an unprecedented rate. This surge has driven a dramatic escalation in energy consumption, with AI expected to account for nearly half of all data center electricity usage by next year, up from about 20 percent today.

This transformation has been meticulously analyzed by Alex de Vries-Gao, a PhD candidate at Vrije Universiteit Amsterdam's Institute for Environmental Studies. His research, published in the journal Joule, draws on public device specifications, analyst forecasts, and corporate disclosures to estimate the production volume and energy consumption of AI hardware.

[Figure: The estimated power demand of AI accelerator modules and AI systems manufactured in 2023 and 2024, along with their cumulative power demand by 2025. Power demand is estimated assuming a utilization rate of 65% and a PUE of 1.2, with error bars indicating the impact of varying PUE between 1.1 and 1.3 and utilization between 55% and 75%.]

Because major tech firms rarely disclose the electricity consumption of their AI operations, de Vries-Gao used a triangulation method, examining the supply chain for advanced chips and the manufacturing capacity of key players such as TSMC. The numbers tell a stark story. Each Nvidia H100 AI accelerator, a staple in modern data centers, consumes 700 watts continuously when running complex models. Multiply that by millions of units, and the cumulative energy draw becomes staggering. De Vries-Gao estimates that hardware produced in 2023 - 2024 alone could ultimately demand between 5.3 and 9.4 gigawatts, enough to eclipse Ireland's entire national electricity consumption.

[Figure: The estimated power demand of AI hardware by 2025 compared to the power demand of Ireland (2023), Switzerland (2023), Austria (2023), Finland (2022), the Netherlands (2023), Bitcoin mining (March 2025), the United Kingdom (2023), France (2023), and total data center power demand excluding cryptocurrency mining (2024).]

But the real surge lies ahead. TSMC's CoWoS packaging technology allows powerful processors and high-speed memory to be integrated into single units - the core of modern AI systems. De Vries-Gao found that TSMC more than doubled its CoWoS production capacity between 2023 and 2024, yet demand from AI chipmakers like Nvidia and AMD still outstripped supply. TSMC plans to double CoWoS capacity again in 2025. If current trends continue, de Vries-Gao projects that total AI system power needs could reach 23 gigawatts by the end of the year - roughly equivalent to the UK's average national power consumption. This would give AI a larger energy footprint than global Bitcoin mining. The International Energy Agency warns that this growth could single-handedly double the electricity consumption of data centers within two years.
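To make the triangulation concrete, the sketch below applies the stated assumptions to a hypothetical production volume. Only the 700 W draw, the 65% utilization rate, and the 1.2 PUE come from the text; the four-million-module count is our illustrative assumption, not a figure from the paper.

```python
# Rough reproduction of the triangulation logic described above.
modules = 4_000_000      # hypothetical count of H100-class accelerators
tdp_watts = 700          # continuous draw per module under load
utilization = 0.65       # assumed average utilization
pue = 1.2                # facility overhead (cooling, power delivery)

demand_gw = modules * tdp_watts * utilization * pue / 1e9
print(f"Estimated continuous demand: {demand_gw:.1f} GW")  # ~2.2 GW
```

De Vries-Gao's 5.3 to 9.4 GW range presumably reflects larger estimated production volumes plus the uncertainty bands on utilization and PUE shown in the figure above.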
[Figure: The percentage change in the estimated power demand of AI systems by 2025 when each assumption behind the estimate is varied by a given percentage. The default assumption for CoWoS-S yield is 100%, so further increases are not possible.]

While improvements in energy efficiency and increased reliance on renewable power have helped somewhat, these gains are being rapidly outpaced by the scale of new hardware and data center deployment. The industry's "bigger is better" mindset - where ever-larger models are pursued to boost performance - has created a feedback loop of escalating resource use. Even as individual data centers become more efficient, overall energy use continues to rise.

Behind the scenes, a manufacturing arms race complicates any efficiency gains. Each new generation of AI chips requires increasingly sophisticated packaging. TSMC's latest CoWoS-L technology, while essential for next-gen processors, struggles with low production yields. Meanwhile, companies like Google report "power capacity crises" as they scramble to build data centers fast enough. Some projects are now repurposing fossil fuel infrastructure, with one securing 4.5 gigawatts of natural gas capacity specifically for AI workloads.

The environmental impact of AI depends heavily on where these power-hungry systems operate. In regions where electricity is primarily generated from fossil fuels, the associated carbon emissions can be significantly higher than in areas powered by renewables. A server farm in coal-reliant West Virginia, for example, generates nearly twice the carbon emissions of one in renewable-rich California. Yet tech giants rarely disclose where or how their AI operates - a transparency gap that threatens to undermine climate targets. This opacity makes it challenging for policymakers, researchers, and the public to fully assess the environmental implications of the AI boom.
[3]
How to cut the environmental impact of your company's AI use
By adopting practical strategies, organizations can take actionable steps to align their AI strategies with their sustainability goals. Conversations around the use of artificial intelligence (AI) in business typically centre on data security, ethical uses and the risk of over-reliance on the emerging technology. But there is also growing discussion and concern around AI's environmental footprint - and rightly so. With AI having a significant impact on energy and water consumption, as well as global greenhouse gas (GHG) emissions, it is vital that environmental sustainability is included in discussions around responsible AI. Simultaneously, there's an undeniable sense of renewed hope that AI can positively contribute to the global transition to a lower-carbon economy. As AI's impact rises alongside growing use of the technology, we must reflect on how it contributes to - and fights against - climate change.

Over the past decade, demand for cloud computing and digital services has focused minds on energy optimization, with modern data centres increasingly adopting energy-efficient hardware and advanced cooling systems, leading to improved power usage effectiveness (PUE) averages across the industry. However, the reality is that the energy demands of AI, powered by data centres, significantly outpace current efficiency gains. A single ChatGPT inquiry consumes about five times more electricity than a web search, while training a single language model such as GPT-3 is estimated to use electricity equivalent to the power consumed annually by 130 US homes. Then there's the energy consumption of AI image generation: while an AI model generating text in response to 1,000 prompts requires as much as 16% of a full smartphone charge, generating an image requires the equivalent of 100%.

AI operations also come with high water demands, as data centres require cooling systems to maintain optimal operating temperatures. GPT-3 is estimated to use about one 16-ounce bottle of water for every 10-50 responses it prepares, and this quickly adds up when factoring in billions of queries - with AI's annual water withdrawal projected to reach 6.6 billion cubic metres by 2027. Lastly, operational logistics, including construction and maintenance, account for up to two-thirds of a data centre's lifetime emissions. Emissions from building infrastructure, manufacturing IT equipment and chip manufacturing contribute significantly to AI's environmental footprint.

As AI workloads expand, so do their power demands. The World Bank estimates that AI's wider industry category of information and communications technology currently generates at least 1.7% of global emissions. While this percentage may seem modest, current figures only reflect today's reported AI usage, not tomorrow's actual consumption. As AI adoption continues to accelerate - and considering increased global digital penetration, the expansion of cloud storage, internet-of-things uptake and the growing prevalence of cryptocurrencies and blockchain technologies - the collective impact on climate change could become significantly larger, even once environmental efficiencies are factored in. Under current growth trajectories, the International Energy Agency (IEA) predicts global data centres may consume up to 1,000 TWh of electricity in 2026 - an increase of 400% from 2022.
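To see how that bottle-per-responses figure scales, a quick sketch; the bottle size and the 10-50 range come from the article, while the one-billion-query volume is our assumption for illustration.

```python
# Scaling the GPT-3 water figure quoted above to a large query volume.
bottle_litres = 0.473            # one 16-ounce bottle
responses_per_bottle = (10, 50)  # range quoted for GPT-3

queries = 1_000_000_000          # illustrative assumption
for rpb in responses_per_bottle:
    million_litres = queries / rpb * bottle_litres / 1e6
    print(f"{million_litres:,.1f} million litres at {rpb} responses/bottle")
```

A billion responses works out to roughly 9 to 47 million litres of potable water under these figures, which is why the projected industry-wide withdrawals run into billions of cubic metres.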
Meanwhile, reports from major tech giants support these projections, with Google and Microsoft recently disclosing year-on-year increases in emissions and warning that these trends challenge their climate commitments. That said, many businesses are still innovating new ways to respond to the impacts of their energy use, and many prominent organizations associated with cloud computing and AI development have taken strong stances on tackling emissions and climate. Amid continued investment in computing infrastructure and AI development, looking to industry leaders can provide inspiration for improving energy management and reducing emissions. As we consider the need to address the environmental impact of AI use as part of global efforts to build a sustainable future, organizations of all sizes can pull their weight.

Amid the public discourse around AI and its undeniable environmental impact, many argue that it can also be a powerful tool in the fight against climate change - and they're right. AI can quickly and effectively analyse significant volumes of data to provide insights into climate trends and the effectiveness of emissions-reduction strategies, which can lead to faster action.

Maintaining awareness of the sustainability expectations of investors, suppliers and consumers, and tracking and reporting energy use and emissions, is best practice for organizations wanting to tackle AI-driven emissions. Publishing an annual emissions inventory that is subject to third-party assurance, or becoming ISO 14001 certified, gives stakeholders confidence that you are committed to sustainability, while ISO 42001 certification also demonstrates a commitment to using ethical, trustworthy and accountable AI systems. It's also vital to keep up with emerging sustainability and AI regulations as they evolve.

AI's energy and water consumption remains a hot topic - be it in the media or among large ICT corporations seeing it affect their ambitious climate goals. From startups to global enterprises, everyone must do what they can to curb the resource consumption and emissions driven by AI use and make efforts to preserve our planet. By learning from the measures being taken by leading businesses and adopting practical strategies, there are actionable steps your organization can take today to align your AI strategies and sustainability goals. The time to act is now.
[4]
Tomorrow's Energy Crisis? MIT's Alarming Analysis of AI Power Consumption
When you ask an AI model to create a social media post or generate a Picasso-inspired animal graphic, there is a small but quantifiable energy cost associated with it, accompanied by emissions released into the atmosphere. Because each individual query typically consumes less energy than briefly running a deep fryer, the effect might seem negligible. However, as more people rely on AI tools, these cumulative impacts become increasingly significant. According to Morgan Stanley, power demand from generative AI will increase at an annual average of 70% through 2027, mostly from the growth of AI-dependent data centers.

AI's growing dependency on energy-intensive processes is not only reshaping infrastructure at an unprecedented scale but also challenging the global push toward sustainability. As AI systems become more pervasive, their energy consumption is accelerating, and with it, questions about the long-term implications for the environment, energy resources, and technology itself. From data centers to inference models, the power demands of AI are poised to grow exponentially in the coming years, and experts are taking note.

Some of AI's best minds gathered in Cambridge, Massachusetts, May 5-7, 2025, at the EmTech AI Conference to discuss the sustainability implications of artificial intelligence. Editorial members of the esteemed MIT Technology Review explored the scope of AI's energy impact, where it's heading, and why it's imperative that decisions made today balance innovation with responsibility. Ultimately, these experts warn that we are at a critical juncture that will determine whether future AI innovations can coexist with global efforts to combat climate change.

[Photo: MIT Technology Review, from left to right: Will Douglas Heaven, senior editor; James O'Donnell, AI reporter; Mat Honan, editor in chief]

The Growing Energy Demands of AI

AI's energy usage comes from two primary operations: training models and inference tasks. Training complex AI models, such as large language models and image generation systems like ChatGPT, Gemini, Llama, and Claude, requires massive computational power and energy resources. Training runs for advanced models can require weeks or even months of operating fleets of high-performance GPUs processing vast datasets. The carbon costs of this process are significant, and they represent only part of the challenge.

Inference, the step in which AI models respond to user inputs or requests (e.g., generating text, answering search queries, or creating art), is set to consume an even greater share of energy in the near future. Unlike training, inference happens on demand and at scale. Every interaction with an AI system, whether a personal assistant like ChatGPT or a visual design tool, requires computing resources, creating a cumulative energy burden as these interactions multiply. "We're moving to this world where multiple AI models are answering billions of queries daily, and those inferences are already a massive and growing energy hog," explained Mat Honan, MIT Technology Review's editor in chief. "And if we want to keep up with the expected demand, we're going to need a lot more power than we currently have."
The magnitude of this shift cannot be overstated: as AI tools become more integrated into consumer and enterprise systems, the energy used to power them will continue to grow, raising complex questions about how much power our infrastructure can handle and what the implications are for the environment. So, where will this power come from?

Just How Much Energy Do We Need?

Establishing just how much energy will be required to meet future AI demands is tricky; experts are largely speculating. "I can't emphasize enough, we truly don't know anything about how much energy is used up when you ask a query or generate a video," MIT Technology Review AI reporter James O'Donnell said. "We make estimates about how much energy is used specifically in inferencing, and there's all sorts of ways that you can do that, but it's all very indirect." And that's at the functional level; at the data center level, it's still being worked out what energy sources are going to power this AI revolution.

The Role of Data Centers: Feeding AI's Energy Hunger

Data centers serve as the backbone of AI systems. These sprawling facilities house the servers and processors that enable AI models to function. However, their scale, complexity, and energy consumption are raising alarms globally. Take Nevada, for example, where tech giants like Google and Apple have built vast data centers in what is already one of America's driest states. These facilities consume massive amounts of energy to keep high-performance servers operating at optimum temperatures. Cooling technologies are vital in these desert locations, requiring significant water use and power to sustain operations. Nearly half of a data center's energy demand comes from the need to cool it.

[Photo: Apple solar data center, Nevada]

The problem isn't limited to Nevada. In Louisiana, Meta is constructing its largest-ever data center, a project geared to support its AI ambitions. However, this venture relies heavily on power from natural gas plants, undermining clean energy goals. Such developments reveal troubling contradictions: while companies acknowledge the importance of sustainability, they remain tied to energy sources that perpetuate dependence on fossil fuels. "Renewables, for the most part, lack the kind of consistency a data center requires," Honan pointed out. A surge in AI-driven electricity demand also places additional pressure on power grids, inadvertently accelerating the buildout of natural gas-based energy facilities. This reliance on fossil fuels, driven in part by AI's electricity needs, presents a serious challenge to climate objectives, as industries struggle to scale renewable solutions quickly enough to meet demand.

Can Nuclear Power Solve AI's Energy Crisis?

To meet the mounting energy demand posed by AI, many are turning to nuclear power as a potential solution. Viewed as a cleaner alternative to fossil fuels, nuclear energy offers a steady, reliable source of electricity that could, in theory, power future AI operations sustainably. However, there are significant barriers to this transition. "While nuclear may sound great, it takes years, sometimes a decade or more to bring a new plant online," Honan remarked. Additionally, policy hurdles, public resistance, and high costs further complicate nuclear adoption. As a result, it may be years -- if not decades -- before nuclear energy can meaningfully contribute to fueling AI's growth at scale. While the idea holds promise, it is far from an immediate solution to AI's energy dilemma.
[Chart: Nuclear Innovation Alliance]

Companies are investing heavily in exploring nuclear options as part of their long-term plans to decarbonize data infrastructure, particularly overseas. China's first nuclear power plant was connected to the grid in 1991, and within a few decades the country has developed the third-largest nuclear fleet globally, behind only France and the United States. This year, China is expected to bring four large reactors online, with several more planned for commissioning in 2026.

Why AI's Energy Problem Matters

At present, AI's energy footprint constitutes a relatively small portion of global electricity use. However, its trajectory is concerning. Analysts warn that, if unchecked, AI's growing power consumption could have a cascading effect on broader electrification plans and efforts to combat climate change. The energy requirements of data centers represent a microcosm of the broader electrification challenges humanity faces: as society adopts new technologies reliant on electricity, ensuring sustainable energy practices becomes increasingly critical. Moreover, AI's energy needs expose existing inequities in resource allocation. Communities hosting energy-intensive data centers may bear disproportionate environmental impacts, from water shortages to higher emissions associated with local power generation. This creates a moral imperative for policymakers and corporations to adopt equitable, sustainable strategies.

The Case for Optimism: Toward Energy-Efficient AI

Despite these challenges, there are reasons to feel optimistic about AI's energy future. Emerging technologies and innovations could significantly reduce the energy intensity of AI systems in the years to come. Improvements in software and hardware, including the development of more efficient processors, are already underway. "On the software side, there's always going to be efficiencies made in the way AI models themselves are made," MIT Technology Review senior editor Will Douglas Heaven said. "We can be smarter about training with more curated, specialized data where you don't need to throw everything in it and then run models for weeks and weeks on it."

Companies can also adopt better monitoring and reporting practices to enhance accountability. Transparent reporting on AI's energy use, coupled with clear targets for reducing environmental impact, could represent a critical step toward aligning technological innovation with climate goals. Renewable energy investments and technologies like liquid cooling are further reasons for cautious optimism, as they could minimize the environmental strain of data centers. Lastly, the increasing emphasis on sustainable infrastructure aligns AI innovation with global electrification policies. Governments and the private sector alike are exploring pathways to design AI systems that contribute to renewable energy networks rather than strain them. According to AON, the future of AI in renewable energy is promising, with numerous advancements and applications set to reshape the sector.

Looking Forward

AI holds the potential to transform society in unprecedented ways. Yet its rapid growth is a double-edged sword. The power-hungry nature of AI systems presents a litmus test for humanity's ability to manage technological progress responsibly while safeguarding our planet. As the energy demands of AI mount, urgent steps must be taken to align innovation with sustainability.
Solutions such as energy-efficient systems, equitable infrastructure planning, and investments in renewable energy are critical to mitigating AI's environmental impact. Importantly, collaboration between industry leaders, governments, and scientists will be essential to ensuring that AI's immense potential benefits humanity without undermining our shared future. The coming years will define not just the trajectory of AI but also the legacy of our response to the challenges it poses to our planet. By addressing AI's energy consumption today, we can ensure that the technology powering the future is as sustainable as it is transformative.
An in-depth look at the increasing energy demands of artificial intelligence, its environmental impact, and the challenges it poses for sustainability efforts worldwide.
As artificial intelligence (AI) becomes increasingly ubiquitous in our daily lives, its energy consumption is growing at an alarming rate. Recent studies and expert analyses reveal that the power demands of AI systems could soon surpass those of entire countries, raising significant concerns about sustainability and environmental impact [1][2].
AI's energy usage primarily stems from two operations: training complex models and performing inference tasks. Training large language models like ChatGPT or Gemini requires massive computational power, often taking weeks or months of continuous GPU operation [4]. However, it's the inference tasks -- the on-demand interactions with AI systems -- that are projected to consume an even greater share of energy in the near future [4].
The scale of AI's energy consumption is staggering. A single ChatGPT query uses about five times more electricity than a standard web search, while training a model like GPT-3 consumes electricity equivalent to the annual power usage of 130 US homes [3]. By 2025, the total power needs of AI systems could reach 23 gigawatts -- roughly equivalent to the UK's average national power consumption [2].
Data centers play a crucial role in supporting AI operations, but their rapid expansion is raising environmental concerns. These facilities require enormous amounts of energy not only for computing but also for cooling systems. In Nevada, tech giants like Google and Apple have built vast data centers in arid regions, straining local water resources [4]. Similarly, Meta's largest-ever data center in Louisiana relies heavily on natural gas, contradicting clean energy goals [4].
The environmental footprint of AI extends beyond energy consumption. Water usage for cooling data centers is projected to reach 6.6 billion cubic meters annually by 2027 [3]. Additionally, the construction and maintenance of data center infrastructure contribute significantly to lifetime emissions [3].
According to the International Energy Agency, global data centers may consume up to 1,000 TWh of electricity in 2026 -- a 400% increase from 2022 [3]. This surge in demand is challenging tech companies' climate commitments, with giants like Google and Microsoft reporting year-on-year increases in emissions [3].
As AI's impact on energy consumption grows, experts emphasize the need for responsible development. At the EmTech AI Conference in Cambridge, Massachusetts, leading minds in AI discussed the critical juncture we face in balancing innovation with environmental responsibility [4].
Despite these challenges, AI also offers potential solutions to climate change. It can analyze vast amounts of data to provide insights into climate trends and emissions-reduction strategies [3]. Organizations can also take practical steps to reduce AI's environmental impact, from matching model size to the task at hand to demanding energy-usage transparency from vendors [1].
As the AI industry continues to evolve, maintaining awareness of sustainability expectations and adhering to emerging regulations will be crucial for organizations looking to balance technological advancement with environmental stewardship [3].
In conclusion, while AI presents unprecedented opportunities for innovation, its growing energy demands pose significant challenges to global sustainability efforts. As we move forward, finding ways to harness AI's potential while minimizing its environmental impact will be essential for ensuring a sustainable future for both technology and our planet.