21 Sources
[1]
Google says it dropped the energy cost of AI queries by 33x in one year
So far this year, electricity use in the US is up nearly 4 percent compared to the same period the year prior. That comes after decades of essentially flat use, a change that has been associated with a rapid expansion of data centers. And a lot of those data centers are being built to serve the boom in AI usage. Given that some of this rising demand is being met by increased coal use (as of May, coal's share of generation is up about 20 percent compared to the year prior), the environmental impact of AI is looking pretty bad. But it's difficult to know for certain without access to the sorts of details that you'd only get by running a data center, such as how often the hardware is in use, and how often it's serving AI queries. So, while academics can test the power needs of individual AI models, it's hard to extrapolate that to real-world use cases.

By contrast, Google has all sorts of data available from real-world use cases. As such, its release of a new analysis of AI's environmental impact is a rare opportunity to peer a tiny bit under the hood. But the new analysis suggests that energy estimates are currently a moving target, as the company says its data shows the energy drain of a search has dropped by a factor of 33 in just the past year.

What's in, what's out

One of the big questions when doing these analyses is what to include. There's obviously the energy consumed by the processors when handling a request. But there's also the energy required for memory, storage, cooling, and more needed to support those processors. Beyond that, there's the energy used to manufacture all that hardware and build the facilities that house them. AIs also require a lot of energy during training, a fraction of which might be counted against any single request made to the model post-training. Any analysis of energy use needs to make decisions about which of these factors to consider.
For many of the ones that have been done in the past, various factors have been skipped largely because the people performing the analysis don't have access to the relevant data. They probably don't know how many processors need to be dedicated to a given task, much less the carbon emissions associated with producing them. But Google has access to pretty much everything: the energy used to service a request, the hardware needed to do so, the cooling requirements, and more. And, since it's becoming standard practice to follow both Scope 2 and Scope 3 emissions that are produced due to the company's activities (either directly, through things like power generation, or indirectly through a supply chain), the company likely has access to those, as well. For the new analysis, Google tracks the energy of CPUs, dedicated AI accelerators, and memory, both when active on handling queries and while idling in between queries. It also follows the energy and water use of the data center as a whole and knows what else is in that data center so it can estimate the fraction that's given over to serving AI queries. It's also tracking the carbon emissions associated with the electricity supply, as well as the emissions that resulted from the production of all the hardware it's using. Three major factors don't make the cut. One is the environmental cost of the networking capacity used to receive requests and deliver results, which will vary considerably depending on the request. The same applies to the computational load on the end-user hardware; that's going to see vast differences between someone using a gaming desktop and someone using a smartphone. The one thing that Google could have made a reasonable estimate of, but didn't, is the impact of training its models. At this point, it will clearly know the energy costs there and can probably make reasonable estimates of a trained model's useful lifetime and number of requests handled during that period. 
But it didn't include that in the current estimates. To come up with typical numbers, the team that did the analysis tracked requests and the hardware that served them for a 24-hour period, as well as the idle time for that hardware. This gives them an energy-per-request estimate, which differs based on the model being used. For each day, they identify the median prompt and use that to calculate the environmental impact.

Going down

Using those estimates, they find that the impact of an individual text request is pretty small. "We estimate the median Gemini Apps text prompt uses 0.24 watt-hours of energy, emits 0.03 grams of carbon dioxide equivalent (gCO2e), and consumes 0.26 milliliters (or about five drops) of water," they conclude. To put that in context, they estimate that the energy use is similar to about nine seconds of TV viewing.

The bad news is that the volume of requests is undoubtedly very high. The company has chosen to execute an AI operation with every single search request, a compute demand that simply didn't exist a couple of years ago. So, while the individual impact is small, the cumulative cost is likely to be considerable. The good news? Just a year ago, it would have been far, far worse.

Some of this is just down to circumstances. With the boom in solar power in the US and elsewhere, it has gotten easier for Google to arrange for renewable power. As a result, the carbon emissions per unit of energy consumed saw a 1.4x reduction over the past year. But the biggest wins have been on the software side, where different approaches have led to a 33x reduction in energy consumed per prompt. The Google team describes a number of optimizations the company has made that contribute to this. One is an approach termed Mixture-of-Experts, which involves figuring out how to only activate the portion of an AI model needed to handle specific requests, which can drop computational needs by a factor of 10 to 100.
They've developed a number of compact versions of their main model, which also reduce the computational load. Data center management also plays a role, as the company can make sure that any active hardware is fully utilized, while allowing the rest to stay in a low-power state.

The other thing is that Google designs its own custom AI accelerators, and it architects the software that runs on them, allowing it to optimize both sides of the hardware/software divide to operate well with each other. That's especially critical given that activity on the AI accelerators accounts for over half of the total energy use of a query. Google also has lots of experience running efficient data centers, which carries over to AI.

The result of all this is that it estimates that the energy consumption of a typical text query has gone down by 33x in the last year alone. That has knock-on effects, since things like the carbon emissions associated with, say, building the hardware get diluted out by the fact that the hardware can handle far more queries over the course of its useful lifetime.

Given these efficiency gains, it would have been easy for Google to simply use the results as a PR exercise; instead, the company has detailed its methodology and considerations in something that reads very much like an academic publication. It's taking that approach because the people behind this work would like to see others in the field follow suit. "We advocate for the widespread adoption of this or similarly comprehensive measurement frameworks to ensure that as the capabilities of AI advance, their environmental efficiency does as well," they conclude.
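The Mixture-of-Experts idea described above can be sketched in a few lines: a gating function scores a pool of experts for each request, and only the top-k highest-scoring experts actually run. This is a toy illustration, not Google's implementation; the expert count, the scoring function, and the top-k value are all assumptions for the sketch.

```python
# Toy Mixture-of-Experts routing sketch (illustrative only, not
# Google's implementation). A gate scores every expert for a given
# request and only the top-k highest-scoring experts are activated,
# so most of the model's expert parameters sit idle for any one prompt.

def route(score_fn, num_experts=64, top_k=2):
    """Return indices of the experts activated for one request."""
    scores = [score_fn(e) for e in range(num_experts)]
    ranked = sorted(range(num_experts), key=lambda e: scores[e], reverse=True)
    return ranked[:top_k]

# Trivial gate that prefers low indices, just to make the routing visible.
active = route(lambda e: -e)          # activates experts 0 and 1
fraction_active = len(active) / 64    # 2 of 64 experts -> 0.03125
```

Real gates are small learned networks and balancing load across experts is a significant engineering problem, but the energy argument is visible even here: per-request compute scales with `top_k`, not with `num_experts`.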
[2]
In a first, Google has released data on how much energy an AI prompt uses
It's the most transparent estimate yet from a Big Tech company with a popular AI product, and the report includes detailed information about how the company calculated its final estimate. As AI has become more widely adopted, there's been a growing effort to understand its energy use. But public efforts attempting to directly measure the energy used by AI have been hampered by a lack of full access to the operations of a major tech company. Earlier this year, MIT Technology Review published a comprehensive series on AI and energy, at which time none of the major AI companies would reveal their per-prompt energy usage. Google's new publication, at last, allows for a peek behind the curtain that researchers and analysts have long hoped for. The study focuses on a broad look at energy demand, including not only the power used by the AI chips that run models but also by all the other infrastructure needed to support that hardware. "We wanted to be quite comprehensive in all the things we included," said Jeff Dean, Google's chief scientist, in an exclusive interview with MIT Technology Review about the new report. That's significant, because in this measurement, the AI chips -- in this case, Google's custom TPUs, the company's proprietary equivalent of GPUs -- account for just 58% of the total electricity demand of 0.24 watt-hours. Another large portion of the energy is used by equipment needed to support AI-specific hardware: The host machine's CPU and memory account for another 25% of the total energy used. There's also backup equipment needed in case something fails -- these idle machines account for 10% of the total. The final 8% is from overhead associated with running a data center, including cooling and power conversion. This sort of report shows the value of industry input to energy and AI research, says Mosharaf Chowdhury, a professor at the University of Michigan and one of the heads of the ML.Energy leaderboard, which tracks energy consumption of AI models. 
Estimates like Google's are generally something that only companies can produce, because they run at a larger scale than researchers are able to and have access to behind-the-scenes information. "I think this will be a keystone piece in the AI energy field," says Jae-Won Chung, a PhD candidate at the University of Michigan and another leader of the ML.Energy effort. "It's the most comprehensive analysis so far." Google's figure, however, is not representative of all queries submitted to Gemini: The company handles a huge variety of requests, and this estimate is calculated from a median energy demand, one that falls in the middle of the range of possible queries.
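The component shares reported above are easy to turn back into absolute numbers. A quick back-of-envelope split of the 0.24 Wh median prompt (note the quoted shares sum to 101 percent because of rounding in the source figures):

```python
# Splitting Google's reported 0.24 Wh median text prompt by the
# component shares quoted above (58% TPUs, 25% host CPU and memory,
# 10% idle backup machines, 8% data-center overhead). The shares sum
# to 101% due to rounding in the reported figures.
TOTAL_WH = 0.24
shares = {
    "tpu": 0.58,
    "host_cpu_memory": 0.25,
    "idle_backup": 0.10,
    "overhead": 0.08,
}
breakdown_wh = {name: TOTAL_WH * s for name, s in shares.items()}
# TPUs alone account for roughly 0.14 Wh of the 0.24 Wh total.
```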
[3]
How Much Energy Do Your AI Prompts Consume? Google Just Shared Its Gemini Numbers
The use of AI tools across the world is exploding, but the environmental impact isn't often detailed by the companies that make these tools. Google, however, has just released a technical paper detailing measurements for energy, emissions and water use of its Gemini AI prompts -- and the impact of a single prompt is, it says, minuscule. According to its methodology for measuring AI's impact, a single prompt's energy consumption is about the equivalent of watching TV for less than 9 seconds. That is quite low, but consider the variety of chatbots in use, and that billions of prompts are easily sent every day. The good news is that the technology behind these prompts has become more efficient in the past 12 months. Google says that the energy of a single Gemini text prompt has dropped 33x and its total carbon footprint 44x. That's not insubstantial, and that type of momentum will need to be maintained going forward, Google says. Google did not immediately respond to CNET's request for further comment. The search giant says the typical calculation for the energy cost of an AI prompt ends at the active machine it's been run on, which yields a much smaller per-prompt footprint. But Google's method for measuring the impact of a prompt spans a much wider range of factors that paint a clearer picture, including full system dynamic power, idle machines, data center overhead, water consumption and more. For comparison, counting only active TPU and GPU consumption, a single Gemini prompt uses 0.10 Wh of energy, 0.12 mL of water and emits 0.02 gCO2e. That's a minute and promising number, but Google's wider methodology tells a different story. With more considerations in place, a Gemini text prompt uses 0.24 Wh of energy, 0.26 mL of water and emits 0.03 gCO2e -- substantially higher across the board, and more than double for energy and water.
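The gap between the narrow and comprehensive accounting is simple to quantify from the figures above:

```python
# Ratio of Google's comprehensive per-prompt figures to the narrow
# (active TPU/GPU only) figures quoted above.
narrow = {"energy_wh": 0.10, "water_ml": 0.12, "co2_g": 0.02}
comprehensive = {"energy_wh": 0.24, "water_ml": 0.26, "co2_g": 0.03}
ratios = {k: comprehensive[k] / narrow[k] for k in narrow}
# energy 2.4x, water ~2.2x, emissions 1.5x once the full stack is counted
```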
Through a multi-layered series of efficiencies, Google is continually working to shrink AI's impact. From more efficient model architectures and data centers to custom hardware, Google's approach to addressing AI's impact on the world is a full-stack one. With smarter models, use cases and tools coming out by the day, those efficiencies will be much needed everywhere as we are steeped further into our new AI reality.
[4]
Worried about AI's soaring energy needs? Avoiding chatbots won't help - but 3 things could
Businesses and individuals have several options for managing their AI footprint. AI feels inescapable. It's everywhere: your smartphone, Google, even your work tools. AI features promise to make life easier and more productive, but what does all this new tech mean for the environment? As AI investment grows -- as does user adoption -- so do the technology's energy costs. Made up of high-compute systems, AI requires a lot of data, which needs to be stored on large networks of computers known as data centers. Just like your personal computer, those gigantic centers need electricity -- as does the process of training an AI model, which relies on more compute than traditional computer functions. Also: Google reveals how much energy a Gemini query uses - in industry first But in the context of the energy we already use every day, from office lights and laptops to social media, how does that consumption actually compare? Can the technology's resource needs change or be improved over time? Is the time it supposedly saves worth the extra emissions? And what should you know about your personal AI footprint? We spoke with experts and researchers to help explain how AI really uses energy and answer your sustainability questions, complete with tips on what you can do. Here's what you need to know. AI needs more resources to function than other kinds of technology. The amount of data AI systems ingest and the computing power required to run them set them apart from simpler computer tasks. An AI system is effectively a synthetic brain that needs to be fed billions of pieces of data in order to find the patterns between them. This is why models trained on more data, with more parameters to store what they learn, tend to be better at certain tasks -- an image model trained on four billion images of cats, for example, should produce a more realistic image of a cat than one trained on just 100 million. But all that knowledge needs to live somewhere.
What you've heard described as "the cloud" is not an airy name for storage, but a physical data center, or a large campus that houses expansive networks of computers that process and store huge amounts of data and run complex queries. Also: AI data centers are becoming 'mind-blowingly large' While these large computing farms have always existed, primarily for enterprise cloud services, they're in more demand than ever as the AI race intensifies -- and as the tools themselves get cheaper and more accessible. "You have big companies that have been managing those as real estate assets," said John Medina, an SVP at Moody's. "Everyone only needed a little bit; they didn't need a ton of capacity." Now, he said, the pressure is on to serve a rapidly growing customer base. That demand is driving up energy use, and the more parameters a model has, the more compute it uses, said Vijay Gadepally, a senior staff member at MIT's Lincoln Laboratory and CTO at Radium, an AI infrastructure company. Also: Scammers have infiltrated Google's AI responses - how to spot them "You need more computing just to even store the model and be able to process it," he explained. These centers draw so much power that surrounding areas are experiencing distorted grid function and increased blackouts. To mitigate that, Google has agreed to scale back power usage at its data centers when asked to by utility companies during critical times, in order to keep space free on the grid. However, Thar Casey, founder and CEO of electrical provider Amber Semiconductor, thinks the move isn't enough. "It's a band-aid, because the energy consumption is only going to go up," he says. "There's nothing you can do about it." With investment in AI only gaining speed, data center growth shows no signs of stopping.
Shortly after taking office in January, President Donald Trump announced Project Stargate, a $500-billion initiative supported by companies including OpenAI, Softbank, and Oracle to build "colossal," 500,000-square-foot data centers. These companies are known as hyperscalers, a small but dominant group of corporations like Microsoft, Google, Meta, and AWS that are building the lion's share of infrastructure. Also: The future of computing must be more sustainable, even as AI demand fuels energy use However, Medina noted that the hype cycle may be inflating the extent to which data center growth is AI-specific. "When we talk about hyperscalers, large data centers, and AI data centers, we get confused. Most of it is for the cloud," he said, referring to services like storage and data processing. He noted that despite all the chatter, data centers are only processing a relatively small number of AI-related tasks. That said, the AI boom is shifting baselines in ways that make such comparisons harder to pin down. "In the past, you didn't have a huge need like this. Four megawatts were considered hyperscale," Medina said. "Now, 50, 100 megawatts is that minimum." As Sasha Luccioni, Ph.D., AI and climate lead at developer platform Hugging Face, admitted in a recent op-ed, we still don't really know how much energy AI consumes, because so few companies publicize data about their usage. Google just published estimates of its Gemini chatbot's emissions, energy, and water usage, which are lower than predicted; though they haven't been vetted by a third party, and the data itself is not public, the company says its methodology takes into account factors that inflated calculations overlooked. Also: Every AI model is flunking medicine - and LMArena proposes a fix Still, resources used by an individual query are just the tip of the iceberg, considering there are now hundreds of millions of monthly chatbot users, and 30% of Americans are actively using AI.
Several studies indicate energy consumption is on the rise, nudged along by a growing demand for AI. A 2024 Berkeley Lab analysis found that electricity consumption has grown exponentially in tandem with AI in recent years. GPU-accelerated servers -- hardware specifically used for AI -- multiplied in 2017; a year later, data centers made up nearly 2% of total annual US electricity consumption, and that consumption was growing annually by 7%. By 2023, that growth rate had jumped to 18%, and it is projected to hit as much as 27% by 2028. Even if we can't isolate how much data center energy is being spent on AI, the link between rising consumption and AI expansion is clear. Boston Consulting Group estimates that data centers will account for 7.5% of all US electricity consumption by 2030, or the equivalent of 40 million US homes. Mark James, interim director of the Institute for Energy and the Environment at Vermont Law and Graduate School, offered another comparison. "These sort of facilities are going to run close to their full capacity -- per hour, it's 1,000 megawatts," he said. "That's the same size as the peak demand of the state of Vermont -- 600,000 plus people -- for months." Also: How your inefficient data center hampers sustainability - and AI adoption Currently, global data centers use about 1.5% of the world's electricity, which is about the same as the entire airline industry. Data centers are likely to surpass it; an April 2025 IEA report found that globally, data center electricity use has gone up 12% every year since 2017, which is "more than four times faster than the rate of total electricity consumption." Data centers, directly or indirectly propelled by AI, are starting to take up more space in the world's energy landscape, even as other energy usage appears to stay mostly the same. For some, that's reason to worry. "This is going to be a carbon problem very quickly if we're scaling up power generation," Gadepally warned. Others aim to put these numbers in context.
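The growth rates cited above compound quickly. A quick calculation makes the point (illustrative arithmetic on the quoted rates, not a forecast):

```python
# Compound growth implied by the data-center consumption growth rates
# cited above (7% per year circa 2018, 18% per year by 2023).
# Illustrative arithmetic only, not a forecast.

def compound(multiplier_start, annual_rate, years):
    """Total growth multiplier after compounding an annual rate."""
    return multiplier_start * (1 + annual_rate) ** years

growth_at_7pct = compound(1.0, 0.07, 5)    # ~1.40x over five years
growth_at_18pct = compound(1.0, 0.18, 5)   # ~2.29x over five years
# At 18% per year, consumption more than doubles every five years.
```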
While there's evidence AI is driving up energy costs, research also shows global energy consumption overall is on the rise. Newer data centers and GPUs are also more energy efficient than their predecessors, meaning they may create relatively less carbon. "These 100, 200-megawatt massive builds are using the most efficient technology -- they're not these old power guzzlers that the older ones are," Medina said. Even as data centers multiply, their predicted consumption curve may start to level out thanks to modern technology. Within AI energy use, not all types of AI share the same footprint. We don't have access to energy consumption data for proprietary models from companies like OpenAI and Anthropic (as opposed to open-source models). However, across all models, generative AI -- especially image generation -- appears to use more compute (and therefore create more emissions) than standard AI systems. Also: Excel's new Copilot function turns your prompts into formulas - how to try it An October 2024 Hugging Face study of 88 models found that generating and summarizing text uses more than 10 times the energy of simpler tasks like classifying images and text. It also found that multimodal tasks, in which models use image, audio, and video inputs, are "on the highest end of the spectrum" for energy use. When it comes to specific comparisons, research on AI resources is all over the map. One study determined that asking ChatGPT to write a 100-word email uses an entire bottle of water -- a claim that quickly circulated on social media. But is it true? "It's possible," said Gadepally. He pointed out that GPUs generate a lot of heat; even when being cooled by other methods, they still require water cooling as well. "You're using something like 16 to 24 GPUs for that model that may be running for 5 to 10 minutes, and the amount of heat that's generated, you can start to kind of do the math," he said. Also: How much energy does a single chatbot prompt use? This AI tool can show you In June, OpenAI CEO Sam Altman wrote in a blog that the average ChatGPT query uses "roughly one fifteenth of a teaspoon" of water, but did not provide methodology or data to support the figure. For comparison, Google estimates that the median Gemini query "consumes 0.26 milliliters (or about five drops) of water." These systems don't just use any kind of water, either -- they need clean, high-quality, potable water running through them. "These pipes, they don't want to clog them up with anything," Gadepally explained. "Many data centers are in areas with stressed watersheds, so that's something to keep in mind." New methods like immersion cooling, in which processors are immersed in a liquid such as mineral oil, show some promise for reducing water use and energy consumption compared to other cooling methods like fans. But the tech is still developing, and would need to be widely adopted to make an impact. With most proprietary data still murky, several other comparisons exist for how much energy chatbot queries use. Jesse Dodge, a researcher from the nonprofit institute Ai2, has compared one ChatGPT query to the electricity used to power one light bulb for 20 minutes. The Hugging Face study noted that "charging the average smartphone requires 0.022 kWh of energy, which means that the most efficient text generation model uses as much energy as 9% of a full smartphone charge for 1,000 inferences, whereas the least efficient image generation model uses as much energy as 522 smartphone charges (11.49 kWh), or around half a charge per image generation." Also: The best AI image generators: Gemini, ChatGPT, Midjourney, and more According to Gadepally, an AI model processing a million tokens -- roughly a dollar in compute costs -- emits about as much carbon as a gas-powered car does while driving five to 20 miles. But energy use also varies widely depending on the complexity of the prompt you're using.
"Saying 'I want a short story about a dog' will likely use less compute than 'I would like a story about a dog that's sitting on a unicorn written in Shakespearean verse,'" he said. If you're curious about how your individual chatbot queries use energy, Hugging Face designed a tool that estimates the energy consumption of queries to different open-source models. Green Coding, an organization that works with companies to track the environmental impact of their tech, designed a similar tool. While it's true that overall energy consumption appears to be increasing in part due to AI investment, researchers urge users to see energy consumption as relative. The claim that one ChatGPT query uses 10 times as much energy as a Google search has become standard, but it is based on the now-outdated 2009 Google estimate that one Google search consumes 0.3 watt-hours (Wh) of energy. It's hard to say whether that number has gone up or down today based on changes to the complexity of Google searches or increased chip efficiency. Also: How web scraping actually works - and why AI changes everything Either way, as data scientist and climate researcher Hannah Ritchie pointed out, that 0.3 Wh of energy needs to be put in perspective -- it's relatively small. She noted that in the US, average daily electricity usage is about 34,000 Wh per person. Using the outdated Google metric, a ChatGPT prompt is just 3 Wh; even with multiple queries a day, that's still not a huge percentage. Plus, tech that doesn't explicitly use AI already uses lots of data center bandwidth. "What are the hottest digital applications today? TikTok, Instagram Reels, YouTube searches, streaming, gaming -- all of these things are hosted from the cloud," said Raj Joshi, another analyst and SVP at Moody's.
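Ritchie's point can be checked directly against the figures quoted above; the 20-prompts-a-day usage level below is an assumption for illustration.

```python
# Putting per-prompt energy in the context of daily personal
# electricity use, with the figures quoted above: ~34,000 Wh/day per
# person in the US, and ~3 Wh per ChatGPT prompt under the older
# 10x-a-Google-search estimate. The 20-prompts-a-day usage level is
# an assumption for illustration.
DAILY_WH_PER_PERSON = 34_000
PROMPT_WH = 3.0
prompts_per_day = 20
share = prompts_per_day * PROMPT_WH / DAILY_WH_PER_PERSON
# Even 20 prompts a day is under 0.2% of average daily electricity use.
```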
Also: 5 ways automation can speed up your daily workflow - and implementation is easy He and Medina added that as AI features integrate with everything from gaming to enterprise tech, it's becoming increasingly hard to attribute specific energy demands to AI or non-AI applications. Within AI, however, model needs are evolving. "It's quite significant," Gadepally said of the energy increase compared to earlier in the technology's history. He noted that inference -- when a model makes predictions after it's been trained -- now accounts for much more of a model's lifetime cost. "That wasn't the case with some of the original models, where you might spend a lot of your effort training this model, but the inference is actually pretty easy -- there wasn't much compute that needed to happen." Because AI is now often inextricably built into existing technology, experts say it's difficult to determine its specific impact. Whether to use it or not may come down to individual judgment more than hard numbers. "From a sustainability perspective, you have to balance the output of the AI with the use of the AI," Medina said. "If that output is going to save you time that you would have your lights on, your computer on, and you're writing something that takes you an hour, but [AI] can do it in five minutes, what's the trade-off there? Did you use more energy taking 30 minutes to write something that they can write you in one minute?" Also: How AI hallucinations could help create life-saving antibiotics To Medina's point, AI can also be used to advance research and technology that helps track climate change in faster, more efficient ways. Ai2 has launched several AI tools that help collect planetary data, improve climate modeling, preserve endangered species, and restore oceans. 
Referencing data from the Sustainable Production Alliance, AI video company Synthesia argues that AI-generated video produces less carbon than traditional methods of video production, which rely on travel, lighting, and other resource-intensive infrastructure. Regardless, parts of the industry are responding to concerns. In February, Hugging Face released the AI Energy Score Project, which features standardized energy ratings and a public leaderboard that displays where each model stands in its estimated consumption. Meanwhile, the Trump administration and advocates for expanding AI are prepared to sacrifice environmental protections to stay ahead of China. Last month, the administration released its AI Action Plan, a set of policy guidelines that recommend the fastest, shortest path to US AI domination. In it, the administration says it plans to "reject radical climate dogma," reduce regulations, and "expedite environmental permitting" for new data centers and power plants. "The projected growth of AI data centers and their power needs already means that non-renewable and less clean sources of energy" -- like coal and natural gas -- "will likely be used in the short term," Yuvraj Agarwal, a software professor at Carnegie Mellon University, explained to ZDNET. "In the short term, that will cause higher emissions and increased burden on water infrastructure for cooling the data centers." Also: Is ChatGPT Plus still worth $20 when the free version offers so much - including GPT-5? He added that, in the long term, this could also mean fewer investments in renewable energy like solar, wind, and hydro, which he notes will impact climate change. Then there's the impact of the data centers -- and the new power plants nearby that will come with them -- on their surrounding environments. "Local communities may bear the unfair share of the environmental costs, depending on which environmental regulations are reduced in scope or sidestepped and for how long," Agarwal said.
These impacts include increased pollution from data centers, which can harm public health. Across the industry, organizations are exploring ways to improve AI sustainability over time. At MIT's Lincoln Lab, Gadepally's team is experimenting with "power-capping," or strategically limiting the power each processor uses to below 100% of its capacity, which reduces both consumption and GPU temperature. Chinese AI startup DeepSeek achieved a similar outcome by being more efficient with how it runs and trains its models, though they are still quite large. That approach can only go so far, though. "No one's figured out how to make a smaller model suddenly do better on high-quality image generation at scale," Gadepally said. Also: What is sparsity? DeepSeek AI's secret, revealed by Apple researchers Because he doesn't see demand for AI waning -- especially with on-device phone features multiplying -- Gadepally said efficiency and optimization are solutions for now. "Can I improve my accuracy by 1.5% instead of 1% for that same kilowatt hour of energy that I'm pumping into my system?" He added that switching data centers to just run on renewable energy, for example, isn't that easy, as these sources don't turn on and off as immediately as natural gas, a requirement for large-scale computing. But by slowing the growth curve of AI's consumption with tactics like power capping, it becomes easier to eventually replace those energy sources with renewable ones -- like replacing your home lightbulbs with LEDs. To move toward sustainability, he suggested companies consider being flexible about where they're doing compute, as some areas may be more energy efficient than others, or training models during colder seasons, when demands on a local energy grid are lower. An added benefit of this approach is that it helps lower processor temperatures without significantly impacting model performance, which can make their outputs more reliable. 
It also reduces the need for cooling using potable water. Benefits like this, as well as the resulting cost-effectiveness, are incentives for companies to make sustainability-forward changes. At the data center level, several companies are investing in resource-saving efficiencies. "If we can minimize the amount of losses inside the data center, then we can win a little bit in the long run," Casey said. "When the electricity comes into the data center, from the time it enters the building to the time it powers up an AI chip, you could potentially lose as much as 40% of that energy." His company, Amber, creates modern silicon infrastructure that reduces these losses, which are often due to dated semiconductors. Gadepally believes companies have the right intentions toward sustainability; he thinks the question is whether they can implement changes fast enough to slow environmental damage. If you're worried about how your AI use impacts your carbon footprint, it's not so simple to untangle. Avoiding AI tools might not help reduce your carbon footprint the way other lifestyle choices can. Andy Masley, director of advocacy group Effective Altruism DC, compared the impact of asking ChatGPT 50,000 fewer questions (10 questions every day for 14 years) to other climate-forward actions from philanthropic network Founders Pledge. The results are pretty minuscule. "If individual emissions are what you're worried about, ChatGPT is hopeless as a way of lowering them," Masley wrote. "It's like seeing people who are spending too much money, and saying they should buy one fewer gumball per month." "It saves less than even the 'small stuff' that we can do, like recycling, reusing plastic bags, and replacing our lightbulbs," Ritchie added in a Substack post referencing Masley.
"If we're fretting over a few queries a day while having a beef burger for dinner, heating our homes with a gas boiler, and driving a petrol car, we will get nowhere." In the big picture, Masley and Ritchie are concerned that focusing on AI energy consumption could distract well-intentioned users from larger, more pressing climate stressors. Gadepally agreed that abstaining from AI only gets you so far. "In this day and age, it's almost like saying, 'I'm not going to use a computer,'" he said. Still, he has a few suggestions for improving the future of AI energy use and creating more transparency around the subject. For one, Gadepally encourages users to accept getting imperfect results more often. Back-and-forth prompt refinement, for example, can be done with a lower-quality model; once you perfect your prompt, you can try it with a more expensive, higher-parameter model to get the best answer. In addition to these goals, Michelle Thorne, director of strategy at The Green Web Foundation -- a nonprofit "working towards a fossil-free internet" -- urged tech companies to phase out fossil fuels across their supply chains and take steps to reduce harms when mining for raw materials. The industry at large is responding to sustainability questions with initiatives like the Frugal AI Challenge, a hackathon at the 2025 AI Action Summit, which took place in Paris this past February. Google said in its sustainability goals that it intends to replenish 120% of the freshwater it consumes across its offices and data centers by 2030. Some argue that the bigger-is-better approach in AI may not actually yield more value or better performance, citing diminishing returns.
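The two-tier prompting tactic described above (refine on a cheap model, run once on an expensive one) can be put in rough numbers. The per-query energy figures here are hypothetical stand-ins, chosen only to show the shape of the saving:

```python
# Back-of-envelope for tiered prompt refinement. Per-query figures are
# hypothetical: a small model at 0.05 Wh/query, a large one at 0.5 Wh/query.

refinement_rounds = 10
small_wh, large_wh = 0.05, 0.5

all_on_large = refinement_rounds * large_wh                   # every draft on the big model
tiered = (refinement_rounds - 1) * small_wh + 1 * large_wh    # only the final run is expensive

print(f"all on the large model: {all_on_large:.2f} Wh")       # 5.00 Wh
print(f"tiered refinement:      {tiered:.2f} Wh")             # 0.95 Wh
```

Under these assumptions, the energy of a refinement session is dominated by the single large-model call, which is exactly the behavior Gadepally is recommending.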
Also: Why neglecting AI ethics is such risky business - and how to do AI right Ultimately, however, regulation will likely prove more effective in standardizing expectations and requirements for tech companies to manage their environmental impact, within and beyond their use of AI. Long-term, AI expansion (and the costs that come with it) shows no signs of stopping. "We have sort of an insatiable appetite for building more and more technology, and the only thing that keeps you limited has been cost," Gadepally said -- a nod to Jevons Paradox, or the idea that efficiency only begets more consumption, rather than satisfaction. For now, AI's energy future is unclear, but the tech industry at large is an increasingly significant player in a climate landscape marked by skyrocketing demand and very little time.
[5]
Google reveals how much energy a Gemini query uses - in industry first
Estimates are lower than public calculations, but industry-wide usage is still unclear. AI demand is rapidly accelerating, which means the infrastructure that makes it possible -- data centers and the power plants that supply them -- is expanding, too. The lack of concrete data around exactly how much energy AI uses has created concern and debate about how that demand is impacting the environment. Google hopes new data will change that. In an industry first, the company published estimates on its Gemini chatbot's energy usage and emissions. The average Gemini text prompt uses "0.24 watt-hours (Wh) of energy, emits 0.03 grams of carbon dioxide equivalent (gCO2e), and consumes 0.26 milliliters (or about five drops) of water," Google said Thursday, comparing the per-prompt impact to "watching TV for less than nine seconds." Of course, that's just one average prompt. Google estimated Gemini had 350 million monthly users in March (almost half of ChatGPT user estimates); depending on how many are querying Gemini at any given moment, what enterprise clients are using the chatbot for, and how many power users are sending more complex prompts, those seconds can add up. Google published a framework for tracking the emissions, energy, and water use of its Gemini apps, saying its findings are "substantially lower than many public estimates" of the resources AI consumes. Google started publishing information on its global data center electricity usage in 2020, and provides annual reports on the Power Usage Effectiveness (PUE) of its data centers going back to 2008. Though Google did not publish its raw AI energy data, it is the first tech company to release granular reporting on the subject.
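The PUE metric mentioned above is a simple ratio: total facility energy divided by the energy that actually reaches IT equipment. A minimal sketch, with illustrative numbers (not Google's own data):

```python
# Power Usage Effectiveness (PUE): total facility energy over IT energy.
# A PUE of 1.0 would mean zero overhead. Values below are illustrative.

it_energy_kwh = 1000      # servers, storage, network gear
overhead_kwh = 90         # cooling, lighting, power conversion losses

pue = (it_energy_kwh + overhead_kwh) / it_energy_kwh
print(f"PUE = {pue:.2f}")  # 1.09
```

A PUE near 1.1 is in the neighborhood of the fleet-wide figures Google has published; older or less optimized facilities can run considerably higher.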
In June, after alarming claims circulated on social media about how resource- and water-intensive ChatGPT use is, OpenAI CEO Sam Altman wrote in a blog that the average ChatGPT query uses "about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes." He added that a query uses "roughly one fifteenth of a teaspoon" of water, but did not provide methodology or data to support either statement. While reporting indicates Meta's data centers are using huge amounts of water, none of AI's major players themselves, including Anthropic, have shared specifics. According to Google, some AI resource calculations "only include active machine consumption" or focus solely on the inference cost of models, ignoring crucial factors that can make an AI system function more efficiently, and therefore with a smaller footprint. For example, larger reasoning models need more compute than smaller ones; to improve efficiency, approaches like speculative decoding (which Google uses) let fewer chips address more queries by having a smaller model make predictions that a larger model then verifies, as opposed to the larger model handling the entire process. In response, Google developed its own methodology for the report, taking into account several components that it said are often overlooked. In its testing, Google said it tracked not just the energy and water used by the model actively computing, but how chips are actually used at scale, which it said "can be much lower than theoretical maximums." The company monitored energy used beyond the TPUs and GPUs that AI runs on, factoring in host CPU and RAM as well, to ensure all components that contribute to an AI query were accounted for.
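The speculative decoding idea mentioned above can be illustrated with a toy: a cheap draft model proposes a batch of tokens, and the expensive model verifies the whole batch in a single pass, keeping the longest agreeing prefix. The "models" below are stand-in functions, not real networks, and the mismatch injection is contrived for the demo:

```python
# Toy illustration of speculative decoding (stand-in functions, not LLMs).

TARGET = list("speculative")  # the tokens the large model "wants" to emit

def draft_propose(pos, k=4):
    """Small model guesses the next k tokens; usually right, sometimes not."""
    guess = TARGET[pos:pos + k]
    if pos == 4:                       # inject one wrong guess for the demo
        guess[-1] = "?"
    return guess

def verify(pos, proposed):
    """Large model scores all proposed tokens in ONE pass, accepting the
    agreeing prefix and substituting its own token at the first mismatch."""
    accepted = []
    for i, tok in enumerate(proposed):
        if tok == TARGET[pos + i]:
            accepted.append(tok)
        else:
            accepted.append(TARGET[pos + i])   # large model's correction
            break
    return accepted

out, big_passes = [], 0
while len(out) < len(TARGET):
    accepted = verify(len(out), draft_propose(len(out)))
    big_passes += 1                    # one large-model pass per batch
    out.extend(accepted)

print("".join(out), "-", big_passes, "large-model passes for",
      len(TARGET), "tokens")          # 3 passes instead of 11
```

Because the large model validates several draft tokens per pass instead of generating them one at a time, the same hardware serves more queries -- the efficiency effect Google describes.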
It also included the energy used by "idle machines," or systems that must be on standby even when not actively computing to handle usage spikes, alongside infrastructure that's always in use, even for non-AI computation, like data center overhead, cooling systems, and water consumption. Google said it compared a "non-comprehensive" approach to its own: the former estimated that "the median Gemini text prompt uses 0.10 Wh of energy, emits 0.02 gCO2e, and consumes 0.12 mL of water" -- numbers Google said "substantially" underestimated Gemini's footprint and were "optimistic" at best. Its own methodology, on the other hand, showed higher estimates: 0.24 Wh, 0.03 gCO2e, and 0.26 mL of water. "We believe this is the most complete view of AI's overall footprint," Google said. Despite revealing higher numbers, Google still said AI energy usage has been overhyped. "The energy consumption, carbon emissions, and water consumption were actually a lot lower than what we've been seeing in some of the public estimates," said Savannah Goodman, head of Google's advanced energy labs, in a video shared with ZDNET. Goodman did not cite specific estimates for comparison. The company said that "over a recent 12-month period, the energy and total carbon footprint of the median Gemini Apps text prompt dropped by 33x and 44x, respectively, all while delivering higher quality responses." However, Google added that neither the data nor the claims had been vetted by a third party. Google cited several approaches it's implementing in data centers to improve efficiency overall, which it says will decrease its AI emissions footprint. These include maximizing hardware performance, using hybrid reasoning, and distillation, or having larger models teach smaller ones. Google also reiterated commitments to using clean energy sources and replenishing the freshwater it uses for cooling.
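The two published energy figures make it easy to quantify how much the "full stack" accounting adds, using only the numbers Google itself reported:

```python
# How much does full-stack accounting change the per-prompt estimate?
# Both figures are from Google's own report.

active_only_wh = 0.10   # TPU/GPU actively computing (the narrow estimate)
full_stack_wh = 0.24    # host CPU/RAM, idle standby machines, overhead included

overhead_factor = full_stack_wh / active_only_wh
print(f"full-stack accounting multiplies the estimate by {overhead_factor:.1f}x")
# 2.4x: the non-accelerator share more than doubles the naive number.
```

That 2.4x gap is the crux of Google's criticism of "active machine only" estimates: most of a query's footprint sits outside the chip doing the math.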
While the company's data center emissions might be down 12%, its latest sustainability report, released in June, showed Google's energy usage has more than doubled in just four years. Its data pertaining to Gemini appears less alarming than many other AI usage estimates out there, but that shouldn't be treated as evidence that Google is below energy usage norms for the tech industry, or is making larger-scale cuts -- especially given how popular Gemini is with users on a daily basis. As AI expands, energy efficiency has been top of mind for many -- but growth is happening too fast for environmental concerns to land. Reporting indicates AI demands are driving electricity and related resource use, which makes it an important component of our environmental future. A recent Reuters/Ipsos poll showed 61% of Americans are concerned about AI electricity use. Last month, President Trump pledged $92 billion toward AI infrastructure in Pennsylvania, an extension of the $500 billion Stargate initiative he announced shortly after taking office in January, alongside several companies, including OpenAI. The Trump administration's AI Action Plan, released last month, clarified intentions to "reject radical climate dogma," reduce regulations, and "expedite environmental permitting" for new data centers and power plants. That said, if applied correctly, AI could also help curb emissions and create sustainable energy futures that could mitigate the impact of climate change. The more data the public has on AI's impact, the better it can advocate for sustainable applications. More metric sharing -- especially when company data finally gets vetted by independent third parties -- could create industry standards and competitive incentives for users and businesses to take emissions and energy use into account when selecting a model.
Ideally, Google's report incentivizes other companies to share similar information on their own AI systems. While Google's numbers might give individual users some relief that their handful of queries isn't using an entire bottle of potable water, they can't be considered in a vacuum. As AI use goes up, these numbers will only continue to compound, unless data center infrastructure invests seriously in renewable energy sources -- a process experts say could be deprioritized given the rapid pace of the industry and the Trump administration's priorities.
[6]
Google games numbers to make AI look less thirsty
Datacenters' drinking habit exaggerated, claims report comparing apples to oranges AI's drinking habit has been grossly overstated, according to a newly published report from Google, which claims that software advancements have cut Gemini's water consumption per prompt to roughly five drops of water -- substantially less than prior estimates. Flaunting a comprehensive new test methodology, Google estimates [PDF] its Gemini apps consume 0.24 watt hours of electricity and 0.26 milliliters (ml) of water to generate the median-length text prompt. Google points out that's far less than the 45ml to 47.5ml of water that Mistral AI and researchers at UC Riverside have said are required to generate roughly a page's worth of text using a medium-sized model, like Mistral Large 2 or GPT-3. However, Google's claims are misleading because they draw a false equivalence between onsite and total water consumption, according to Shaolei Ren, associate professor of electrical and computer engineering at UC Riverside and one of the authors of the papers cited in the Google report. To understand why, it's important to know that datacenters consume water both on and off site. The power-hungry facilities often employ cooling towers, which evaporate water that's pumped into them. This process chills the air entering the facility, keeping the CPUs and GPUs within from overheating. Using water is more power-efficient than refrigerant. By Google's own estimates, about 80 percent of all water removed from watersheds near its datacenters is consumed by evaporative cooling for those datacenters. But water is also consumed in the process of generating the energy needed to keep all those servers humming along. Just like datacenters, gas, coal, and nuclear plants also employ cooling towers which - spoiler alert - also use a lot of water. Because of this, datacenters that don't consume water directly can still have a major impact on the local watershed. 
The issue, Ren emphasizes, isn't that Google failed to consider off-site water consumption entirely. It's that the search giant compared apples to oranges: Its new figure is onsite only, while the "discredited" figure included all water consumption. "If you want to focus on the on-site water consumption, that's fine, but if you do that, you also need to compare your on-site data with the prior works' on-site data." Google didn't do that. And it's not like UC Riverside's research didn't include on-site estimates either. Google could have made an apples-to-apples comparison but chose not to, Ren contends. "Their practice doesn't follow the minimum standard we expect for any paper, let alone one from Google," he said. UC Riverside's 2023 paper titled "Making AI Less 'Thirsty': Uncovering and Addressing the Secret Water Footprint of AI Models" estimated on-site water consumption of the average US datacenter at 2.2ml per request. The 47.5ml per request, which Google rounded up to 50ml in its paper, represented the highest total water consumption Ren's team had recorded, and 2.8x higher than the US average. "They not only picked the total, but they also picked our highest total among 18 locations," Ren said. Google didn't address our questions regarding the comparison, instead providing a statement discrediting the UC Riverside researcher's earlier findings as flawed. "We have looked at the claims in the UC Riverside study, and our team of water resource engineers and hydrologists has concluded the claims and methods are flawed," Ben Townsend, head of infrastructure strategy and sustainability at Google said in a statement. "The critical flaw in the generalized UC Riverside study is that it assumes a grid powered predominantly by traditional, water-cooled thermoelectric plants, which does not hold true for Google's data center operations." Why Google proceeded to incorporate data on both on- and off-site water use from the UC Riverside paper, the search giant didn't say. 
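Ren's complaint can be made concrete with the figures cited here. All values are from the sources above, in milliliters of water per text prompt or request:

```python
# The comparison Ren argues Google should have made, vs. the one it made.

google_onsite = 0.26        # Gemini, onsite cooling water only (2025)
ucr_onsite_us_avg = 2.2     # UC Riverside 2023, onsite, average US data center
ucr_total_worst = 47.5      # UC Riverside 2023, onsite + offsite, worst of 18 sites

# Apples to apples (onsite vs onsite):
print(f"{ucr_onsite_us_avg / google_onsite:.1f}x")   # ~8.5x

# Apples to oranges (onsite vs worst-case total, as in Google's paper):
print(f"{ucr_total_worst / google_onsite:.0f}x")     # ~183x
```

The onsite-to-onsite gap is real but modest, and plausibly explained by two years of efficiency gains; the headline-grabbing gap comes almost entirely from comparing different accounting boundaries.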
While 0.26ml per prompt claimed by Google is still substantially lower than UC Riverside's on-site average of 2.2ml for US datacenters, Ren emphasizes that those figures were first published in 2023. If Google really has managed to cut Gemini's energy footprint by 33x over the past year, that would imply the model was far less water-efficient at the time Ren's team released their results as well. The idea that AI workloads would become more efficient with time isn't surprising, Ren notes. In the original paper, his team had predicted improvements in water and energy consumption. ®
[7]
How much power and water does AI use? Google, Mistral weigh in
The answers Google and Mistral provided offer just a basic estimate of the resources AI consumes. Still, it's a start. How badly does AI harm the environment? We now have some answers to that question, as both Google and Mistral have published their own self-assessments of the environmental impact of an AI query. In July, Mistral, which publishes its own AI models, published a self-evaluation of the environmental impact of training and querying its model in terms of the amount of carbon dioxide (CO2) produced, the amount of water consumed, and the amount of material consumed. Google took a slightly different approach, publishing the amount of power and water a Gemini query consumes, as well as how much CO2 it produces. Of course, there are caveats: Each report was self-generated, and not performed by an outside auditor. Also, training a model consumes vastly more resources than inferencing, or the day-to-day tasks users assign a chatbot each time they query it. Still, the reports provide some context for how much AI taxes the environment, even though they exclude the effects of AI training and inferencing by OpenAI and other competitors. On Thursday, Google said its estimate for a "median" Gemini query is 0.24Wh of energy and 0.26 milliliters (five drops) of water, plus the equivalent of 0.03 grams of carbon dioxide -- about the same as 9 seconds of watching TV. Mistral's report slightly differed: For a "Le Chat" response generating a page of text (400 tokens), Mistral consumes 50 milliliters of water, produces the equivalent of 1.14 grams of carbon dioxide, and consumes the equivalent of 0.2 milligrams of non-renewable resources. Google said "comparative models" typically are a bit more lenient, and only look at the impacts of active TPU and GPU consumption. Measured that way, the median Gemini text prompt uses 0.10Wh of energy, consumes 0.12ml of water, and emits the equivalent of 0.02 grams of carbon dioxide.
Google did not release any assessments of the impact of training its Gemini models. Mistral did: In January 2025, training its Large 2 model produced the equivalent of 20.4 kilotons of carbon dioxide, consumed 281,000 cubic meters of water, and consumed 650 kilograms of resources. That's about 112 Olympic-sized swimming pools of water consumption. Using the EPA's estimate that an average car produces 4.6 metric tons of carbon dioxide annually, that works out to the annual CO2 production of 4,435 cars, too. The environmental impact assessments assume that energy is produced via means that actually produce carbon dioxide, such as coal. "Clean" energy, like solar, lowers that value. Likewise, the amount of water "consumed" typically assumes the use of evaporative cooling, where heat is transferred from the chip or server (possibly being cooled by water as well) to what's known as an evaporative cooler. The evaporative cooler transfers heat efficiently, in the same manner as your body cools itself after a workout. As you sweat, the moisture evaporates, an endothermic process that pulls heat from your body. An evaporative cooler performs the same function, wicking heat from a server farm but also evaporating that water back into the atmosphere. Google said that it uses a holistic approach toward managing energy, such as more efficient models, optimized inferencing through models like Flash-Lite, custom-built TPUs, efficient data centers, and efficient idling of CPUs that aren't being used. Clean energy generation -- such as a planned nuclear reactor -- can help lower the impact numbers, too. "Today, as AI becomes increasingly integrated into every layer of our economy, it is crucial for developers, policymakers, enterprises, governments, and citizens to better understand the environmental footprint of this transformative technology," Mistral's own report adds.
"At Mistral AI, we believe that we share a collective responsibility with each actor of the value chain to address and mitigate the environmental impacts of our innovations." The reports from Mistral and Google haven't been duplicated by other companies. EpochAI estimates that the average GPT-4o query on ChatGPT consumes about 0.3Wh of energy, based upon its estimates of the types of servers OpenAI uses. However, the amount of resources AI consumes can vary considerably, and even AI energy scores are rudimentary at best. "In reality, the type and size of the model, the type of output you're generating, and countless variables beyond your control -- like which energy grid is connected to the data center your request is sent to and what time of day it's processed -- can make one query thousands of times more energy-intensive and emissions-producing than another," an MIT Technology Review study found. It estimated that a daily habit of 15 queries, 10 images, and three 5-second videos would consume 2.9kWh of electricity. Still, Mistral's study authors note that its own estimates point the way toward a "scoring system" where buyers and users could use these studies to choose AI models with the least environmental impact. It also called upon other AI model makers to follow its lead. Whether AI is "bad" for the environment is still up for debate, but the reports from Google and Mistral provide a foundation for a more reasoned discussion.
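The article's earlier conversions for Mistral Large 2's training footprint (Olympic pools of water, car-years of CO2) check out; here is the arithmetic, using the figures quoted above:

```python
# Verifying the back-of-envelope conversions for Mistral Large 2 training.

co2_tonnes = 20.4e3           # 20.4 kilotons of CO2-equivalent, in metric tons
water_m3 = 281_000            # cubic meters of water consumed

OLYMPIC_POOL_M3 = 2_500       # nominal 50 m x 25 m x 2 m pool
CAR_TONNES_PER_YEAR = 4.6     # EPA estimate for an average passenger car

pools = water_m3 / OLYMPIC_POOL_M3
cars = co2_tonnes / CAR_TONNES_PER_YEAR

print(f"{pools:.0f} Olympic pools of water")   # 112
print(f"{cars:.0f} car-years of CO2")          # 4435
```

Both conversions land on the article's numbers, so the "112 pools" and "4,435 cars" framings are faithful to Mistral's disclosed totals.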
[8]
Experts are skeptical about Google's AI water consumption claims
Google says Gemini only uses "five drops" of water per query, but that figure conveniently leaves out other factors. Yesterday, we covered Google's report that a typical query to its Gemini AI consumes only "five drops of water." That figure is now facing criticism from several AI experts, according to The Verge... and that includes one of the authors of one of the reports referred to by Google. AI researcher Shaolei Ren -- a professor at University of California Riverside and one of the authors of the report Making AI Less "Thirsty": Uncovering and Addressing the Secret Water Footprint of AI Models -- previously estimated that Microsoft's data center consumed 700,000 liters of water to train OpenAI's GPT-3 model. He also calculated that a ChatGPT conversation of 20 to 50 messages can consume close to a pint of water, which is far more than Google's estimate. Ren and other AI researchers argue that Google is wrong to leave out the indirect water consumption of its AI models. Google's figures only mention the water used to cool its data centers while ignoring the water consumption of the power plants that supply the data centers with electricity. "Google's five drops per query is just the tip of the iceberg," says sustainability researcher Alex de Vries-Gao. Experts are also critical of Google's figures on the carbon emissions of AI models. Google is said to have used market-based emissions figures, which let it count purchased clean-electricity certificates and carbon credits against its total. The figure therefore doesn't show how much CO2 the AI models actually emit.
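Putting the dueling water figures side by side makes the disagreement tangible. A US pint is about 473 ml, and Ren's estimate was roughly a pint per 20 to 50 ChatGPT messages:

```python
# Ren's per-message water estimate vs. Google's per-prompt figure.

pint_ml = 473
per_message_ml = [pint_ml / n for n in (20, 50)]   # [23.65, 9.46] ml per message

google_per_prompt_ml = 0.26   # Google's onsite-only figure for Gemini

low, high = per_message_ml[1], per_message_ml[0]
print(f"Ren: roughly {low:.1f} to {high:.1f} ml per message")
print(f"That is about {low / google_per_prompt_ml:.0f}x to "
      f"{high / google_per_prompt_ml:.0f}x Google's per-prompt number")
```

A 36x to 91x gap is far too large to attribute to efficiency gains alone; much of it is the accounting-boundary dispute described above (onsite cooling water only, versus total water including power generation).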
[9]
Google Breaks Down the Environmental Cost of an AI Prompt -- Is It Really That Tiny?
Google reveals the water, energy and carbon footprint of every Gemini AI prompt For the first time, Google has publicly shared exactly how much energy, water and carbon go into a single AI prompt for Gemini. At first glance, the numbers sound shockingly...small? According to Google's newly released data, the median text prompt to Gemini consumes a mere 0.24 watt-hours of electricity. That's the same amount you'd use watching TV for less than nine seconds. Each prompt also emits about 0.03 grams of CO₂ and consumes 0.26 milliliters of water, roughly five drops. But that's not the whole story. In the same report, Google also showed dramatic efficiency gains: a 33x drop in energy consumption and a 44x reduction in carbon footprint per prompt over the past year, all while improving Gemini's response quality. But despite those impressive numbers, the bigger picture is that AI is straining the environment in ways that can't be ignored. A few prompts aren't the problem. While Google does not publicly state how many prompts are queried in a single day, based on usage limits and the number of models available for free to the public, it can be guesstimated at hundreds of millions of times daily. Although a single Gemini prompt is energy-efficient, multiplied, it starts adding up quickly. Efficiency doesn't eliminate growth. As AI becomes more deeply embedded in our lives, overall consumption and environmental impact will continue to grow. This is frequently referred to as the Jevons Paradox, an economic concept first described by British economist William Stanley Jevons in 1865. He observed that when coal-burning steam engines became more efficient, coal consumption didn't go down -- it actually went up. The same goes for AI: as efficiency improves, consumption may go up, so instead of leading to conservation, efficiency can ultimately mean more overall consumption.
Google says each Gemini prompt now uses significantly less energy and water than a year ago, which is a huge efficiency gain. But Jevons Paradox warns us: Lower energy per prompt leads to AI feeling "cheap" to use. Cheap use then leads to people and businesses relying on AI more, for everything. More reliance means total energy use -- and environmental impact -- could still grow, even faster than before. Infrastructure demands are soaring. Across the U.S., AI-related energy consumption could triple by 2028, potentially boosting electricity prices and putting pressure on the grid -- costs that may eventually trickle down to consumers. Google's transparency is both an eye-opener and a step toward managing AI usage. But while the per-prompt footprint seems quite small, the massive scale of global AI usage could still pose environmental consequences. AI is increasingly woven through everything, making its hidden carbon and energy price grow right alongside our usage. It's not one prompt we need to worry about -- it's all the prompts to come.
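The Jevons Paradox worry above reduces to one multiplication. Google's 33x efficiency figure is from its report; the usage volumes below are hypothetical, chosen only to show how a growth factor larger than the efficiency factor flips the outcome:

```python
# Per-prompt efficiency vs. total consumption under usage growth.
# The 33x and 0.24 Wh figures are Google's; usage volumes are hypothetical.

per_prompt_before_wh = 0.24 * 33     # implied per-prompt energy a year ago
per_prompt_now_wh = 0.24

daily_prompts_before = 100e6         # hypothetical baseline volume
daily_prompts_now = 100e6 * 50       # hypothetical 50x usage growth

total_before = per_prompt_before_wh * daily_prompts_before
total_now = per_prompt_now_wh * daily_prompts_now

print(f"total daily energy before: {total_before / 1e6:.0f} MWh")  # 792 MWh
print(f"total daily energy now:    {total_now / 1e6:.0f} MWh")     # 1200 MWh
```

With a 33x efficiency gain but a 50x jump in prompts, total consumption still rises -- which is exactly Jevons's observation about steam engines and coal.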
[10]
Google shares how much energy is used for new Gemini AI prompts
Why it matters: The artificial intelligence boom is bringing a surge in power-thirsty data centers, but the energy needs and climate footprint remain a moving and often hazy target.

* Google's overall findings are "substantially lower than many public estimates," it said.

Driving the news: The tech giant released a detailed methodology that encompasses real-world electricity and water use from deploying AI at scale.

* That includes, for instance, energy used by idle chips and data center "overhead" -- that is, equipment such as cooling that's not directly running AI workloads.
* Those are two of many factors covered in assessing the "full stack" of AI infrastructure, Google's new paper and blog posts explain.

What they found: The median energy use of a single text prompt on Gemini is equivalent to watching TV for less than nine seconds and consumes about five drops of water, the paper finds.

* It emits 0.03 grams of carbon dioxide equivalent, Google said.
* Better software efficiency and clean energy have lowered the median energy consumption of a Gemini Apps text prompt by 33x over the last year, and the CO2 by 44x, the company said.

"As the adage goes, with great power comes great responsibility," Partha Ranganathan, a Google engineer, told reporters this week.

* "With great computing power comes great environmental responsibility as well, and so we've been very thoughtful about the environmental impact of this increasing computing demand caused by AI," he said.

Yes, but: The new analysis of text prompts doesn't cover video or image generation queries.

* Savannah Goodman, Google's head of advanced energy labs, said it's continuously looking to improve transparency. But there's been little consensus on how to measure the impact of even text generation, she said.
* "That's really the most consistent request we've gotten. And so we're really starting there with this paper," she told reporters.
* The paper also doesn't apply the new methodology to the training of AI models -- a big part of the energy puzzle, though Google has done other research on this.

The big picture: Gains in per-query efficiency come as overall AI use is rapidly expanding, and data center energy demand along with it.

* Estimates vary. For instance, a late 2024 DOE report projects that data centers could account for 6.7% to 12% of U.S. electricity use by 2028.
* Google, in a recent report, said its data center energy emissions fell by 12% in 2024.
* But the company's overall emissions were up 11% amid increases in greenhouse gases from its supply chain, including manufacturing and assembling AI computing hardware, and building data centers.

What we're watching: Goodman hopes the analysis of "all of the critical factors" will help the industry overall.
[11]
Google reveals just how much energy each Gemini query uses - but is it being entirely truthful?
Google insists these figures represent the average user experience A new study from Google claims its Gemini AI model uses only minimal water and energy for each prompt - with the median usage sitting at around 5 drops (0.26 milliliters) - the equivalent electricity used for 9 seconds of TV watching (roughly 0.24 watt-hours), resulting in around 0.03 grams of CO2 emissions. Experts have been quick to dispute the claims, however, with The Verge claiming Google omitted key data points in its study, drastically under-reporting the environmental impacts of the model. Whilst models and data centers have become more efficient, it seems there's more to the story than Google is letting on. One of the authors of a paper cited in the study, Shaolei Ren, associate professor of electrical engineering at the University of California, Riverside, told the publication: "They're just hiding the critical information. This really spreads the wrong message to the world." AI models like Gemini are supported by data centers - huge warehouses full of servers which consume intense amounts of water and energy, straining local resources. Governments across the globe have been sanctioning the building of these data centers, despite the destruction they could bring to local countryside - and consumers are likely to be the ones paying for the extra energy used. One of the biggest concerns with Google's study is that it omits indirect water usage in the estimates, which form the majority of the use related to AI. Whilst the figures are technically correct, the missing context of the extreme energy use paints a misleading picture. The study only looks at the water used by data centers to cool their servers, but left out is the electricity these data centers demand, which in turn leads to new gas and nuclear plants - which also cool their systems with water, or use steam to turn turbines.
Water isn't the only metric Google misrepresented, though, with the paper outlining only a 'market-based' carbon emissions measure, which offsets the figure using Google's promises to use renewable energy to support power grids. Savannah Goodman, Head of Advanced Energy Labs, told TechRadar Pro: "We hope to share environmental metrics that are representative of a typical user's behavior, and reasonably comparable over time. However, with the rapidly evolving landscape of AI model architectures and AI assistant user behavior, there are outliers either from small subsets of prompts served by models with low utilization or with high token counts." "In order to share metrics that represent a typical user's experience and are robust to this rapidly evolving field, we chose to measure metrics for the median prompt -- which is robust to extreme values and provides a more accurate reflection of a typical prompt's energy impact."
[12]
What's the environmental cost of an AI text prompt? Google says it has an answer.
Aimee Picchi is the associate managing editor for CBS MoneyWatch, where she covers business and personal finance. She previously worked at Bloomberg News and has written for national news outlets including USA Today and Consumer Reports. Amid growing concerns about the environmental impact of artificial intelligence, Google says it has calculated the energy required for its Gemini AI service: Sending a single text prompt consumes as much energy as watching television for nine seconds. The technology giant on Thursday unveiled a new methodology to measure the environmental impact of its AI models, including energy and water consumption as well as carbon emissions. AI tools have the potential to drive economic growth by boosting productivity and unlocking other efficiencies, economists say. By one estimate from Goldman Sachs, the tech is poised to increase global GDP by 7%, or $7 trillion, over 10 years. At the same time, scientists are flagging the outsized environmental impact of AI, which is not yet fully understood even as data centers require enormous amounts of electricity. "In order to improve the energy efficiency of AI, a clear and comprehensive understanding of AI's environmental footprint is important. To date, comprehensive data on the energy and environmental impact of AI inference has been limited," Ben Gomes, Google's senior vice president of learning and sustainability, said in a blog post Thursday. Aside from their electricity needs, AI data centers also require "a great deal of water ... to cool the hardware used for training, deploying, and fine-tuning generative AI models, which can strain municipal water supplies and disrupt local ecosystems," research from MIT shows. "The increasing number of generative AI applications has also spurred demand for high-performance computing hardware, adding indirect environmental impacts from its manufacture and transport." 
Some new data centers require between 100 and 1,000 megawatts of power, roughly equivalent to the needs of 80,000 to 800,000 homes, according to an April GAO report. For now, however, there are no regulations that require corporations to disclose how much energy or water their AI tools consume. Google said in a technical paper released Thursday by its AI energy and emissions researchers that as adoption of AI tools rises, "so does the need to understand and mitigate the environmental impact of AI serving." Google's new paper on the environmental impact of its own AI tools aims to set a standard for measuring the energy and water consumption, as well as the carbon emissions, of various AI models, the company said. A typical Gemini text query uses 0.24 watt-hours (Wh) of energy, emits 0.03 grams of carbon dioxide equivalent (gCO2e), and consumes 0.26 milliliters -- or about five drops -- of water. By comparison, the average ChatGPT query uses 0.34 Wh and about one fifteenth of a teaspoon of water, Sam Altman, CEO of ChatGPT-maker OpenAI, has written. Google also outlined the progress it has made in reducing the environmental impact of its Gemini platform. Over a recent 12-month period, the energy consumption and carbon footprint of the median Gemini text prompt decreased by factors of 33 and 44, respectively, it said. The quality of Gemini's responses also improved over the same period, the company said.
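As a back-of-envelope check on the "nine seconds of TV" comparison, the per-prompt figure can be divided by an assumed television power draw. The 100 W figure below is an assumption for a typical modern flat-panel set, not a number stated in Google's paper:

```python
# Convert the median per-prompt energy into seconds of TV viewing.
# TV_WATTS is an assumed power draw, not a figure from the paper.

PROMPT_WH = 0.24   # median Gemini text prompt, per Google's paper
TV_WATTS = 100     # assumed flat-panel TV power draw

# Wh / W gives hours; multiply by 3600 to get seconds.
tv_seconds = PROMPT_WH / TV_WATTS * 3600

print(f"{tv_seconds:.1f} seconds of TV")  # prints 8.6 seconds of TV
```

At that assumed wattage the result lands just under nine seconds, consistent with the comparison the paper and its coverage repeat.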
[13]
'They're just hiding the critical information': Google says its Gemini AI sips a mere 'five drops' of water per text prompt, but experts disagree with its findings
Google has released a new study [PDF warning], detailing what it claims is the environmental impact of "delivering AI at Google scale". The study estimates that a median Gemini AI text prompt uses less energy than watching nine seconds of television, and consumes "0.26 milliliters (or about five drops) of water" -- but some experts have disagreed with its findings. Speaking to The Verge, Shaolei Ren, an associate professor of electrical and computer engineering at the University of California and one of the authors named in the study, said: "They're just hiding the critical information. This really sends the wrong message to the world." Chief among their concerns is Google's apparent omission of indirect water use in its data. While the claim that roughly five drops -- or 0.26 ml -- of water is consumed per median text prompt may be true regarding data center cooling systems, it doesn't take into account the vast amount of water used by power plants providing electricity to the facilities. The International Energy Agency (IEA) estimates that 60% of data center water consumption is from indirect sources, like the water needed to cool power facilities and used to generate steam to spin their turbines. Google claims that its water consumption figure is "orders of magnitude less than previous estimates", referencing a 50 ml figure in Ren's research, but Ren claims that their figure takes into account indirect water consumption as well. And while that 0.26 ml figure might seem like a relatively insignificant amount taken on its own merits, it's worth noting that many modern data centers are built in the world's driest areas, where even a few drops per query, multiplied by the many, many queries it receives, still makes a significant dent in the local supply. The study also doesn't factor in the cost, either in water usage or CO2 production, of training the AI models in the first place.
While Google has not released any assessments of these impacts, Mistral, a French AI startup, has previously reported that training its Large 2 model resulted in the production of an estimated 20.4 kilotons of carbon dioxide and consumed 281,000 cubic meters of water, equivalent to around 112 Olympic-sized swimming pools. Speaking of carbon emissions, both Ren and Alex de Vries-Gao, another researcher cited in the study, also disagree with the way renewable energy was factored into the estimates. The experts claim the paper uses a generalised, market-based measure of CO2 production, rather than looking specifically at the locations where Google's data centers draw their energy from. Both argue that Google should have taken into account the specific mix of renewable and "dirty" energy supplying its facilities, rather than factoring in commitments made by power companies to support renewable energy growth on the power grid overall with a market-based metric. Ren and de Vries-Gao also argue that their previous work referenced in the study was incorrectly represented, given that Google compared its own figures based on median prompts to their data, which was based on averages, and Google did not share figures for how it arrived at said median. The figures also don't take into account video or image generation, both of which Google's Gemini AI provides. It's not just Google claiming tiny amounts of water consumption in its data centers, either. OpenAI CEO Sam Altman claimed earlier this year that a typical ChatGPT query uses one 15th of a teaspoon of water, which does seem like a relatively insignificant amount, even when you multiply it at scale to account for billions of potential prompts. However, much like Google's claims, looking at individual metrics and using them as an indicator of AI's environmental impact overall seems to me to be an oversimplification of a very complicated issue.
While AI's water consumption and power usage do appear to be getting more efficient, as soon as you factor in the sheer scale of data center expansion, and the power gap it's expanding into, the picture starts to look much less rosy.
[14]
A Google Gemini prompt uses as much energy as 9 seconds of TV
Technology companies have been reserved about the environmental impact of their AI models. Running a single prompt on Google's artificial intelligence (AI) chatbot Gemini takes about the same amount of energy as watching 9 seconds of TV, according to a new technical paper. Google says a text prompt through Gemini uses about 0.24 watt-hours (Wh) of energy, according to a blog post from the company. Each prompt also emits about 0.03 grams of carbon dioxide. The AI chatbot also consumes 0.26 millilitres of water, or roughly five drops, per prompt. The company measured not only the energy and water use of the individual prompt, but also the IT equipment in data centres, the idle power of the chips that power the AI model, and the water used by data centres to cool the equipment being used. The company claimed that its energy and water usage for AI is "substantially lower than many public estimates". One study by the non-profit Electric Power Research Institute estimated that a prompt issued to OpenAI's ChatGPT uses 2.9 watt-hours of energy while a traditional search uses about 0.3 watt-hours. The International Energy Agency (IEA) predicts that data centre energy demand will double over the next five years to 945 terawatt-hours per year - or the entire electricity consumption of Japan. Google's own emissions are up too, by 51 per cent since 2019, according to its latest environmental report. This is largely due to emissions caused by the manufacturing and assembly of hardware needed for AI further down the supply chain. Other studies about the energy use of AI often measure only machine consumption, which "overlook[s]" some of the factors Google included in its metrics, meaning their numbers represent "theoretical efficiency instead of true operating efficiency at scale," the blog post said.
Google claims that its AI models are also becoming more efficient, with energy use and carbon footprint numbers per prompt dropping by 33 times and 44 times since August 2024. One of the study's shortcomings is that it does not reveal the total number of queries that Gemini gets every day, which would give a better indication of Gemini's total energy demand.
[15]
Google did the math on AI's energy footprint
Ever wonder how much energy it takes when you ask an AI to draft an email, plan a vacation, or role-play as a friend? Google just shed some light: The company calculated the energy, water, and carbon emissions tied to text prompts on its Gemini apps. The energy use for the median text prompt is far less than some previous estimates: Google found that it uses just 0.24 watt-hours, or the equivalent of watching TV for less than nine seconds. Other models might be wildly different: One estimate of OpenAI's GPT-5 model suggested that it might use as much as 40 watt-hours per prompt. (That estimate, unlike Google's calculations, wasn't based on internal data.) Google laid out a detailed methodology for measuring the environmental impact of getting answers from its AI. For energy, that includes the electricity burned as the system converts your text prompt into a form it can process, calculates all the probabilities for which word could come next, and decodes the result into readable text. It also includes the energy for cooling and other infrastructure needed at a data center, and for keeping idle machines running in case demand spikes. The carbon footprint of the median prompt was 0.03 grams of CO2, based on the energy mix for each grid and Google's own investments in renewable energy. Each prompt also uses 0.26 milliliters, or about five drops, of water.
[16]
Google details Gemini apps' environmental footprint in new paper - SiliconANGLE
Google details Gemini apps' environmental footprint in new paper Google LLC has released technical data about the environmental footprint of its Gemini artificial intelligence apps. The company shared the information in a paper published today. According to Google, the median text prompt to a Gemini app uses 0.24Wh, or 0.24 watt-hours, of energy. That's enough to power a highly efficient lightbulb for more than a minute. Prompts that ask Gemini to generate multimodal content or analyze a large amount of data use considerably more energy. Gemini is powered by Google's TPU series of custom artificial intelligence processors. The company detailed that those chips account for 58% of the 0.24Wh used by a median Gemini text query. It's unclear which specific TPU models were used in the evaluation. Google's newest TPU chip, which debuted earlier this year, offers more than 30 times better energy efficiency than the first processor in the series. It's deployed in water-cooled server clusters that can host up to 9,216 TPUs. Google says that each such cluster consumes nearly 10 megawatts of electricity, which is enough to power a few thousand homes. While measuring Gemini's power usage, the company accounted for fluctuations in the energy consumption of TPUs. "This includes not just the energy and water used by the primary AI model during active computation, but also the actual achieved chip utilization at production scale, which can be much lower than theoretical maximums," Google executives Amin Vahdat and Jeff Dean wrote in a blog post. A quarter of the power used by the median Gemini prompt goes to the servers in which TPUs are deployed. In particular, that power is consumed by the machines' central processing units and memory chips. Some of the TPU servers in Google's data centers are idle. Those servers are not offline, but rather operate in a low-power mode that allows the search giant to quickly activate them if an AI cluster requires additional capacity.
Google says the idle machines account for 10% of the median Gemini prompt's power consumption. Auxiliary data center infrastructure accounts for the remaining energy usage. Google says that this equipment includes, among other things, cooling and power distribution systems. The company's newly published paper also contains other data points about Gemini's environmental footprint. Google researchers estimate that the median text prompt causes its equipment to emit 0.03 grams of carbon dioxide. In the process, each such request consumes 0.26 milliliters of water, which is equivalent to about five drops. "Over a 12-month period, while delivering higher-quality responses, the median energy consumption and carbon footprint per Gemini Apps text prompt decreased by factors of 33x and 44x," Ben Gomes, the chief technologist of Google's Learning & Sustainability group, wrote in a blog post.
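The reported shares (58% TPUs, 25% host CPUs and memory, 10% idle machines) leave roughly 7% for data center overhead. A minimal sketch, assuming the remainder does go to overhead, converts those shares into per-component watt-hours:

```python
# Rough per-component energy for the median Gemini text prompt, using
# the share breakdown reported in the coverage above. The "overhead"
# share is inferred as the remainder of the reported shares, so treat
# all of these as approximations rather than figures from the paper.

TOTAL_WH = 0.24  # median Gemini text prompt

shares = {
    "TPU accelerators": 0.58,
    "host CPU and RAM": 0.25,
    "idle machines": 0.10,
    "cooling and power distribution": 0.07,  # inferred remainder
}

for component, share in shares.items():
    print(f"{component}: {share * TOTAL_WH:.4f} Wh")
```

The TPU share alone works out to roughly 0.14 Wh, still less than a watt-hour even for the dominant component.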
[18]
Google Claims Gemini's AI Prompt Uses 'Less Energy Than 9 Seconds of TV' | AIM
Each prompt also produces 0.03 grams of CO₂ equivalent and uses 0.26 millilitres of water, or "about five drops". In a landmark study released by Google on August 22, titled 'Measuring the Environmental Impact of Delivering AI at Google Scale', the company outlines a "first-of-its-kind full-stack methodology" to measure the footprint of AI inference. Rather than focusing only on the power drawn by chips, the paper accounts for "AI accelerators, CPUs, RAM, idle machines and data centre overheads such as cooling and power distribution." The results may surprise even AI's fiercest critics. According to the paper, the median Gemini text prompt consumes just 0.24 watt-hours of energy, which "represents less energy than watching TV for nine seconds". Each prompt also produces 0.03 grams of CO₂ equivalent (gCO₂e) and uses 0.26 millilitres of water, or "about five drops".
[19]
Google Says a Gemini AI Prompt Costs 9 Seconds of TV Energy and 5 Drops of Water
On Thursday, Google released a technical paper detailing the environmental impact of its Gemini AI chatbot. In the paper, Google says the median Gemini text prompt uses 0.24Wh of energy, which is equivalent to watching TV for less than 9 seconds. The paper also mentions that each AI prompt in Gemini consumes 0.26 milliliters of water (about five drops) and emits 0.03 grams of carbon dioxide. Google goes on to say that the environmental impact is substantially lower than many public estimates. Moreover, according to the paper, "over a recent 12 month period, the energy and total carbon footprint of the median Gemini Apps text prompt dropped by 33x and 44x, respectively." Google's chief scientist, Jeff Dean, shared a post on X, saying that AI efficiency was achieved "through a combination of model efficiency improvements, machine utilization improvements and additional clean energy procurement, all while delivering higher quality responses." Back in June, OpenAI CEO Sam Altman revealed ChatGPT's energy bill and said that an average ChatGPT query uses about 0.34 watt-hours, closer to what "an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes." Altman further mentioned that a ChatGPT query uses "about 0.000085 gallons of water; roughly one fifteenth of a teaspoon." He also noted that "the cost of intelligence should eventually converge to near the cost of electricity." While Google's Gemini AI stack appears more efficient than ChatGPT and offers an optimistic picture, critics argue that AI's environmental cost is far higher. Google has focused on the inference energy data, but training large-scale AI models on massive GPU clusters also consumes vastly more energy. Put simply, the debate over AI's impact on the environment is far from settled.
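Altman's two water figures can be cross-checked against each other. The conversion constants below are the standard US liquid gallon and US teaspoon, not numbers taken from any of the reports:

```python
# Sanity check of the ChatGPT water figures quoted above: 0.000085
# gallons should match "roughly one fifteenth of a teaspoon".
# Conversion constants are standard US units (an assumption about
# which units Altman meant).

ML_PER_GALLON = 3785.41    # US liquid gallon in milliliters
ML_PER_TEASPOON = 4.92892  # US teaspoon in milliliters

query_ml = 0.000085 * ML_PER_GALLON      # Altman's gallons figure in mL
fifteenth_tsp_ml = ML_PER_TEASPOON / 15  # "one fifteenth of a teaspoon"

print(f"{query_ml:.2f} mL vs {fifteenth_tsp_ml:.2f} mL")
```

Both come out around a third of a milliliter, so the two phrasings are consistent, and both sit slightly above Google's 0.26 mL figure for Gemini.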
[20]
Google's New Study Claims Gemini AI is Eco-Friendly, Only Uses 5 Drops of Water
Google says Gemini AI Uses Only 5 drops of Water and 9 Seconds of TV Power Per Prompt. Experts Call It Misleading! Efficiency or Greenwash? Google has released a new report about the environmental cost of Gemini AI. The company says a single Gemini text prompt requires only 0.26 mL of water, which is equivalent to around five drops, and the same electricity as running a television for nine seconds. It also claims the process produces just 0.03 grams of carbon dioxide. According to the tech giant, these numbers prove that Google Gemini has become much more efficient over the past year. The company says energy use per prompt has dropped 33 times, and the carbon footprint has fallen 44 times. It highlights this as progress in making artificial intelligence more sustainable.
[21]
ChatGPT to Gemini: How much energy do your AI queries cost?
Text prompts use little energy, multimodal AI queries demand far more. In the race to build smarter, faster, and more capable AI, another competition is playing out quietly in the background: energy efficiency. Each chatbot query may feel weightless when you type it out, but behind the screen, servers are buzzing, GPUs are firing, and data centers are drawing power. And as AI becomes a daily tool for millions, those watt-hours add up to something much bigger than a single conversation. A new wave of research and disclosures in 2025 has finally shed light on just how much electricity the world's leading chatbots consume per query, and the differences are stark. According to the latest estimates, most advanced chatbots are now on par with, or only slightly above, a search query - for context, a Google search consumes about 0.3 Wh, while streaming a minute of video on YouTube can use 1-3 Wh. That's a remarkable leap in efficiency compared to just two years ago, when estimates for ChatGPT queries were as high as 3 Wh each. But there's a catch: these numbers generally reflect short, text-only prompts. Once you shift into image generation, video synthesis, or other multimodal tasks, energy usage spikes dramatically, sometimes into the tens of watt-hours for a single query. The sharp differences reflect not just the size of models, but also infrastructure, hardware optimization, and energy strategy. A single half-watt-hour text query may not seem like much, but scale it up and the stakes become clear. Billions of daily prompts translate into megawatt-hours of electricity use. For instance, one million queries to Grok could consume as much energy as powering a U.S. household for a year.
And that's before factoring in the heavy-duty side of AI: image generation, code compilation, or video outputs can multiply energy use by 10, 20, or even 100 times compared to a simple text exchange. The "per query" figures most companies share are therefore only part of the story. The latest figures suggest a trend: the leading chatbots are converging around 0.3 Wh per text query, a massive improvement from just a couple of years ago. But the outliers show the risks: not every AI company is building with efficiency at the forefront. As users, we may not feel the watt-hours behind our questions. But collectively, the power draw of generative AI is shaping climate debates, infrastructure investments, and even regulatory conversations. The question now is simple: can AI's energy appetite keep shrinking, even as its capabilities keep expanding into image, video, and beyond? Because from ChatGPT to Gemini, Claude to Grok, every prompt we type or picture we generate is part of a much bigger power equation.
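The scaling argument above can be made concrete with a rough sketch. The one-billion-prompts-per-day volume is a hypothetical round number, since none of the companies in these reports discloses its actual query count:

```python
# Illustrative scaling of per-prompt energy to fleet level. The daily
# query volume is a hypothetical round number for illustration only;
# Google has not said how many prompts Gemini serves per day.

WH_PER_PROMPT = 0.24             # median Gemini text prompt
PROMPTS_PER_DAY = 1_000_000_000  # hypothetical: one billion prompts/day

daily_mwh = WH_PER_PROMPT * PROMPTS_PER_DAY / 1_000_000  # Wh -> MWh

print(f"{daily_mwh:.0f} MWh per day")  # prints 240 MWh per day
```

Even at a fraction of a watt-hour per prompt, a billion daily queries would draw hundreds of megawatt-hours per day, which is why the undisclosed total query volume matters so much to the overall picture.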
In an industry first, Google has released a comprehensive analysis of the energy consumption and environmental impact of its AI queries, specifically for its Gemini model. This unprecedented disclosure provides valuable insights into the real-world energy costs of AI operations, addressing growing concerns about the technology's environmental footprint. [1]
The most striking revelation from Google's report is the dramatic decrease in energy consumption for AI queries. Over a recent 12-month period, the energy footprint of the median Gemini Apps text prompt dropped by a factor of 33, while the total carbon footprint decreased by a factor of 44. [2] This substantial improvement demonstrates the rapid pace of efficiency gains in AI technology.

Google's approach to measuring AI's impact is notably more comprehensive than previous estimates. The company's methodology includes:

* Energy drawn by the AI accelerators (TPUs) during active computation
* Host machine CPU and memory power
* Idle machines kept ready for demand spikes
* Data center overhead such as cooling and power distribution
* Water consumed by data center cooling systems

This holistic approach provides a more accurate picture of AI's resource requirements compared to estimates that only consider active machine consumption. [3]
According to Google's analysis, the median Gemini text prompt uses:

* 0.24 watt-hours of energy
* 0.26 milliliters (about five drops) of water
* 0.03 grams of CO2-equivalent emissions

To put this into perspective, Google equates the energy use to about nine seconds of TV viewing. [4]

Several factors have contributed to the significant reduction in energy consumption:

* Model efficiency improvements
* Better machine utilization
* Additional clean energy procurement [5]
While Google's disclosure is a significant step towards transparency in AI energy consumption, it's important to note that these figures represent just one company's AI model. The broader industry impact remains unclear, as other major AI players like OpenAI, Anthropic, and Meta have yet to release similarly detailed data.
As AI adoption continues to grow rapidly, with an estimated 30% of Americans now actively using AI, the cumulative energy impact of millions of daily queries could still be substantial. This underscores the importance of ongoing efforts to improve AI efficiency and the use of renewable energy sources in data centers.
Google's revelation of its AI energy consumption data marks a crucial milestone in understanding and addressing the environmental impact of artificial intelligence. While the significant reduction in energy use per query is encouraging, the growing scale of AI adoption means that continued focus on efficiency and sustainability will be essential for the industry moving forward.
Summarized by Navi
[2]