7 Sources
[1]
In a first, Google has released data on how much energy an AI prompt uses
It's the most transparent estimate yet from a Big Tech company with a popular AI product, and the report includes detailed information about how the company calculated its final estimate. As AI has become more widely adopted, there's been a growing effort to understand its energy use. But public efforts attempting to directly measure the energy used by AI have been hampered by a lack of full access to the operations of a major tech company. Earlier this year, MIT Technology Review published a comprehensive series on AI and energy, at which time none of the major AI companies would reveal their per-prompt energy usage. Google's new publication, at last, allows for a peek behind the curtain that researchers and analysts have long hoped for.

The study focuses on a broad look at energy demand, including not only the power used by the AI chips that run models but also by all the other infrastructure needed to support that hardware. "We wanted to be quite comprehensive in all the things we included," said Jeff Dean, Google's chief scientist, in an exclusive interview with MIT Technology Review about the new report. That's significant, because in this measurement, the AI chips -- in this case, Google's custom TPUs, the company's proprietary equivalent of GPUs -- account for just 58% of the total electricity demand of 0.24 watt-hours. Another large portion of the energy is used by equipment needed to support AI-specific hardware: The host machine's CPU and memory account for another 25% of the total energy used. There's also backup equipment needed in case something fails -- these idle machines account for 10% of the total. The final 8% is from overhead associated with running a data center, including cooling and power conversion.

This sort of report shows the value of industry input to energy and AI research, says Mosharaf Chowdhury, a professor at the University of Michigan and one of the heads of the ML.Energy leaderboard, which tracks energy consumption of AI models.
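The per-prompt total can be split into its components using the shares quoted above -- a back-of-envelope sketch, noting that the quoted percentages sum to 101% because of rounding in the source:

```python
# Split Google's reported 0.24 Wh median prompt into the component
# shares quoted in the article: 58% TPUs, 25% host CPU/memory,
# 10% idle backup machines, 8% data-center overhead.
TOTAL_WH = 0.24

shares = {
    "tpus": 0.58,             # custom AI accelerator chips
    "host_cpu_memory": 0.25,  # host machine CPU and RAM
    "idle_backup": 0.10,      # standby machines for failover
    "overhead": 0.08,         # cooling and power conversion
}

# Watt-hour contribution of each component
breakdown_wh = {k: round(TOTAL_WH * v, 4) for k, v in shares.items()}

# The quoted shares sum to 1.01 (101%) -- an artifact of rounding
# each share independently in the source report.
```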
Estimates like Google's are generally something that only companies can produce, because they run at a larger scale than researchers are able to and have access to behind-the-scenes information. "I think this will be a keystone piece in the AI energy field," says Jae-Won Chung, a PhD candidate at the University of Michigan and another leader of the ML.Energy effort. "It's the most comprehensive analysis so far." Google's figure, however, is not representative of all queries submitted to Gemini: The company handles a huge variety of requests, and this estimate is calculated from a median energy demand, one that falls in the middle of the range of possible queries.
[2]
How Much Energy Do Your AI Prompts Consume? Google Just Shared Its Gemini Numbers
Use of AI tools is exploding worldwide, but the companies that make these tools rarely detail the environmental impact. Google, however, has just released a technical paper detailing measurements for energy, emissions and water use of its Gemini AI prompts -- and the impact of a single prompt is, it says, minuscule. According to its methodology for measuring AI's impact, a single prompt's energy consumption is about the equivalent of watching TV for less than 9 seconds. That is quite low, but consider the variety of chatbots in use, and that billions of prompts are easily sent every day. The good news is that the technology behind these prompts has become more efficient in the past 12 months. Google says that the energy of a single Gemini text prompt has been reduced by 33x and its total carbon footprint by 44x. That's not insubstantial, and that type of momentum will need to be maintained going forward, Google says. Google did not immediately respond to CNET's request for further comment. The search giant says the typical calculation for the energy cost of an AI prompt ends at the active machine it's been run on, which shows a much smaller per-prompt footprint. But Google's method for measuring the impact of a prompt spans a much wider range of factors that paint a clearer picture, including full system dynamic power, idle machines, data center overhead, water consumption and more. For comparison, counting only the active TPU and GPU consumption, a single Gemini prompt uses an estimated 0.10Wh of energy, 0.12mL of water and emits 0.02 gCO2e. That is a minute and promising number, but Google's wider methodology tells a different story. With more considerations in place, a Gemini text prompt uses 0.24Wh of energy, 0.26mL of water and emits 0.03 gCO2e -- more than double for energy and water, and 50% higher for emissions.
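The gap between the narrow "active chips only" accounting and Google's comprehensive methodology can be computed directly from the figures quoted above:

```python
# Compare the narrow "active TPU/GPU only" per-prompt figures with
# Google's comprehensive full-stack accounting, as quoted in the article.
narrow = {"energy_wh": 0.10, "water_ml": 0.12, "co2_g": 0.02}
full   = {"energy_wh": 0.24, "water_ml": 0.26, "co2_g": 0.03}

# How many times larger each comprehensive figure is
ratios = {k: round(full[k] / narrow[k], 2) for k in narrow}
# Energy and water more than double; emissions rise by half.
```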
Through a multi-layered series of efficiencies, Google is continually working to shrink AI's impact. From more efficient model architectures and data centers to custom hardware, Google's approach to addressing AI's impact on the world is a full-stack one. With smarter models, use cases and tools coming out by the day, those efficiencies will be much needed everywhere as we are steeped further into our new AI reality.
[3]
Google reveals how much energy a Gemini query uses - in industry first
Estimates are lower than public calculations, but industry-wide usage is still unclear. AI demand is rapidly accelerating, which means the infrastructure that makes it possible -- data centers and the power plants that supply them -- is expanding, too. The lack of concrete data around exactly how much energy AI uses has created concern and debate about how that demand is impacting the environment. Google hopes its new data will change that. In an industry first, the company published estimates on its Gemini chatbot's energy usage and emissions. The average Gemini text prompt uses "0.24 watt-hours (Wh) of energy, emits 0.03 grams of carbon dioxide equivalent (gCO2e), and consumes 0.26 milliliters (or about five drops) of water," Google said Thursday, comparing the per-prompt impact to "watching TV for less than nine seconds." Of course, that's just one average prompt. Google estimated Gemini had 350 million monthly users in March (almost half of ChatGPT user estimates); depending on how many are querying Gemini at any given moment, what enterprise clients are using the chatbot for, and power users sending more complex prompts, those seconds can add up. Google published a framework for tracking the emissions, energy, and water use of its Gemini apps, saying its findings are "substantially lower than many public estimates" of the resources AI consumes. Google started publishing information on its global data center electricity usage in 2020, and provides annual reports on the Power Usage Effectiveness (PUE) of its data centers going back to 2008. Though Google did not publish its raw AI energy data, it is the first tech company to release granular reporting on the subject.
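The "watching TV for less than nine seconds" comparison is easy to sanity-check. Assuming a roughly 100 W television (an assumed wattage; Google doesn't state the figure it used):

```python
# Sanity-check the TV comparison: how many seconds of TV viewing
# does 0.24 Wh buy, assuming a ~100 W set (wattage is an assumption)?
PROMPT_WH = 0.24
TV_WATTS = 100  # hypothetical modern TV power draw

# Convert watt-hours to watt-seconds, then divide by TV power
tv_seconds = PROMPT_WH * 3600 / TV_WATTS
# Comes out just under nine seconds, consistent with Google's framing.
```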
In June, after alarming claims about how resource- and water-intensive ChatGPT use is circulated on social media, OpenAI CEO Sam Altman wrote in a blog that the average ChatGPT query uses "about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes." He added that a query uses "roughly one fifteenth of a teaspoon" of water, but did not provide methodology or data to support either statement. While reporting indicates Meta's data centers are using huge amounts of water, none of AI's major players themselves, including Anthropic, have shared specifics. According to Google, some AI resource calculations "only include active machine consumption" or focus solely on the inference cost of models, ignoring crucial factors that can make an AI system function more efficiently, and therefore with a smaller footprint. For example, larger reasoning models need more compute than smaller ones; to improve efficiency, approaches like speculative decoding (which Google uses) let fewer chips address more queries by having a smaller model make predictions that a larger model then verifies, as opposed to the larger model handling the entire process. In response, Google developed its own methodology for the report, taking into account several components that it said are often overlooked. In its testing, Google said it tracked not just the energy and water used by the model actively computing, but how chips are actually used at scale, which it said "can be much lower than theoretical maximums." The company monitored energy used beyond the TPUs and GPUs that AI runs on, factoring in host CPU and RAM as well, to ensure all components that contribute to an AI query were accounted for.
It also included the energy used by "idle machines," or systems that must be on standby even when not actively computing to handle usage spikes, alongside infrastructure that's always in use, even for non-AI computation, like data center overhead, cooling systems, and water consumption. Google said it compared a "non-comprehensive" approach to its own: the former estimated that "the median Gemini text prompt uses 0.10 Wh of energy, emits 0.02 gCO2e, and consumes 0.12 mL of water" -- numbers Google said "substantially" underestimated Gemini's footprint and were "optimistic" at best. Its own methodology, on the other hand, showed higher estimates: 0.24 Wh, 0.03 gCO2e, and 0.26 mL of water comparatively. "We believe this is the most complete view of AI's overall footprint," Google said. Despite revealing higher numbers, Google still said AI energy usage has been overhyped. "The energy consumption, carbon emissions, and water consumption were actually a lot lower than what we've been seeing in some of the public estimates," said Savannah Goodman, head of Google's advanced energy labs, in a video shared with ZDNET. Goodman did not cite specific estimates for comparison. The company said that "over a recent 12-month period, the energy and total carbon footprint of the median Gemini Apps text prompt dropped by 33x and 44x, respectively, all while delivering higher quality responses." However, Google added that neither the data nor the claims had been vetted by a third party. Google cited several approaches it's implementing in data centers to improve efficiency overall, which it says will decrease its AI emissions footprint. These include maximizing hardware performance, using hybrid reasoning, and distillation, or having larger models teach smaller ones. Google also reiterated commitments to using clean energy sources and replenishing the freshwater it uses for cooling.
While the company's data center emissions might be down 12%, its latest sustainability report, released in June, showed Google's energy usage has more than doubled in just four years. Its data pertaining to Gemini appears less alarming than many other AI usage estimates out there, but that shouldn't be treated as evidence that Google is below energy usage norms for the tech industry, or is making larger-scale cuts -- especially given how popular Gemini is with users on a daily basis. As AI expands, energy efficiency has been top of mind for many -- but growth is happening too fast for environmental concerns to land. Reporting indicates AI demands are driving electricity and related resource use, which makes it an important component of our environmental future. A recent Reuters/Ipsos poll showed 61% of Americans are concerned about AI electricity use. Last month, President Trump pledged $92 billion toward AI infrastructure in Pennsylvania, an extension of the $500 billion Stargate initiative he announced shortly after taking office in January, alongside several companies, including OpenAI. The Trump administration's AI Action Plan, released last month, clarified intentions to "reject radical climate dogma," reduce regulations, and "expedite environmental permitting" for new data centers and power plants. That said, if applied correctly, AI could also help curb emissions and create sustainable energy futures that could mitigate the impact of climate change. The more data the public has on AI's impact, the better it can advocate for sustainable applications. More metric sharing -- especially when company data finally gets vetted by independent third parties -- could create industry standards and competitive incentives for users and businesses to take emissions and energy use into account when selecting a model.
Ideally, Google's report incentivizes other companies to share similar information on their own AI systems. While Google's numbers might give individual users some relief that their handful of queries isn't using an entire bottle of potable water, they can't be considered in a vacuum. As AI use goes up, these numbers will only continue to compound, unless data center infrastructure invests seriously in renewable energy sources -- a process experts say could be deprioritized given the rapid pace of the industry and the Trump administration's priorities.
[4]
How much power and water does AI use? Google, Mistral weigh in
The answers Google and Mistral provided offer just a basic estimate of the resources AI consumes. Still, it's a start. How badly does AI harm the environment? We now have some answers to that question, as both Google and Mistral have published their own self-assessments of the environmental impact of an AI query. In July, Mistral, which publishes its own AI models, published a self-evaluation of the environmental impact of training and querying its model in terms of the amount of carbon dioxide (CO2) produced, the amount of water consumed, and the amount of material consumed. Google took a slightly different approach, publishing the amount of power and water a Gemini query consumes, as well as how much CO2 it produces. Of course, there are caveats: Each report was self-generated, not audited by an outside party. Also, training a model consumes vastly more resources than inferencing, or the day-to-day tasks users assign a chatbot each time they query it. Still, the reports provide some context for how much AI taxes the environment, even though they exclude the effects of AI training and inferencing by OpenAI and other competitors. On Thursday, Google said a "median" Gemini query consumes 0.24Wh of energy and 0.26 milliliters (five drops) of water, and generates the equivalent of 0.03 grams of carbon dioxide -- the equivalent of 9 seconds of watching TV. Mistral's report slightly differed: For a "Le Chat" response generating a page of text (400 tokens), Mistral consumes 50 milliliters of water, produces the equivalent of 1.14 grams of carbon dioxide, and consumes the equivalent of 0.2 milligrams of non-renewable resources. Google said "comparative models" typically are a bit more lenient, and only look at the impacts of active TPU and GPU consumption. Put this way, the median Gemini text prompt uses 0.10Wh of energy, consumes 0.12ml of water, and emits the equivalent of 0.02 grams of carbon dioxide.
Google did not release any assessments of the impact of training its Gemini models. Mistral did: In January 2025, training its Large 2 model produced the equivalent of 20.4 kilotons of carbon dioxide, consumed 281,000 cubic meters of water, and consumed 650 kilograms of resources. That's about 112 Olympic-sized swimming pools of water consumption. Using the EPA's estimate that an average car produces 4.6 metric tons of carbon dioxide annually, that works out to the annual CO2 production of 4,435 cars, too. The environmental impact assessments assume that energy is produced via means that actually produce carbon dioxide, such as coal. "Clean" energy, like solar, lowers that value. Likewise, the amount of water "consumed" typically assumes the use of evaporative cooling, where heat is transferred from the chip or server (possibly being cooled by water as well) to what's known as an evaporative cooler. The evaporative cooler transfers heat efficiently, in the same manner as your body cools itself after a workout. As you sweat, the moisture evaporates, an endothermic process that pulls heat from your body. An evaporative cooler performs the same function, wicking heat from a server farm but also evaporating that water back into the atmosphere. Google said that it uses a holistic approach toward managing energy, such as more efficient models, optimized inferencing through models like Flash-Lite, custom-built TPUs, efficient data centers, and efficient idling of CPUs that aren't being used. Clean energy generation -- such as a planned nuclear reactor -- can help lower the impact numbers, too. "Today, as AI becomes increasingly integrated into every layer of our economy, it is crucial for developers, policymakers, enterprises, governments, and citizens to better understand the environmental footprint of this transformative technology," Mistral's own report adds.
"At Mistral AI, we believe that we share a collective responsibility with each actor of the value chain to address and mitigate the environmental impacts of our innovations." The reports from Mistral and Google haven't been replicated by other companies. EpochAI estimates that the average GPT-4o query on ChatGPT consumes about 0.3Wh of energy, based upon its estimates of the types of servers OpenAI uses. However, the amount of resources AI consumes can vary considerably, and even AI energy scores are rudimentary at best. "In reality, the type and size of the model, the type of output you're generating, and countless variables beyond your control -- like which energy grid is connected to the data center your request is sent to and what time of day it's processed -- can make one query thousands of times more energy-intensive and emissions-producing than another," an MIT Technology Review study found. It estimated that a daily habit of 15 queries, 10 generated images, and three 5-second videos would consume 2.9 kWh of electricity. Still, Mistral's study authors note that its own estimates point the way toward a "scoring system" where buyers and users could use these studies as a way to choose AI models with the least environmental impact. It also called upon other AI model makers to follow its lead. Whether AI is "bad" for the environment is still up for discussion, but the reports from Google and Mistral provide a foundation for a more reasoned discussion.
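The swimming-pool and car comparisons above can be re-derived from the reported Mistral Large 2 training figures:

```python
# Re-derive the article's comparisons for Mistral Large 2 training:
# water consumed vs. Olympic pools, and CO2e vs. average cars per year.
WATER_M3 = 281_000          # reported water consumption, cubic meters
POOL_M3 = 2_500             # nominal Olympic-sized pool volume
CO2_TONNES = 20_400         # 20.4 kilotons of CO2e, in metric tons
CAR_TONNES_PER_YEAR = 4.6   # EPA average-car annual CO2 estimate

pools = WATER_M3 / POOL_M3                       # ~112 pools
car_years = round(CO2_TONNES / CAR_TONNES_PER_YEAR)  # ~4,435 cars
```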
[5]
Google shares how much energy is used for new Gemini AI prompts
Why it matters: The artificial intelligence boom is bringing a surge in power-thirsty data centers, but the energy needs and climate footprint remain a moving and often hazy target.
* Google's overall findings are "substantially lower than many public estimates," it said.
Driving the news: The tech giant released a detailed methodology that encompasses real-world electricity and water use from deploying AI at scale.
* That includes, for instance, energy used by idle chips and data center "overhead" -- that is, equipment such as cooling that's not directly running AI workloads.
* Those are two of many factors covered in assessing the "full stack" of AI infrastructure, Google's new paper and blog posts explain.
What they found: The median energy use of a single text prompt on Gemini is equivalent to watching TV for less than nine seconds and consumes about five drops of water, the paper finds.
* It emits 0.03 grams of carbon dioxide equivalent, Google said.
* Better software efficiency and clean energy have lowered the median energy consumption of a Gemini Apps text prompt by 33x over the last year, and the CO2 by 44x, the company said.
"As the adage goes, with great power comes great responsibility," Partha Ranganathan, a Google engineer, told reporters this week.
* "With great computing power comes great environmental responsibility as well, and so we've been very thoughtful about the environmental impact of this increasing computing demand caused by AI," he said.
Yes, but: The new analysis of text prompts doesn't cover video or image generation queries.
* Savannah Goodman, Google's head of advanced energy labs, said it's continuously looking to improve transparency. But there's been little consensus on how to measure the impact of even text generation, she said.
* "That's really the most consistent request we've gotten. And so we're really starting there with this paper," she told reporters.
* The paper also doesn't apply the new methodology to the training of AI models -- a big part of the energy puzzle, though Google has done other research on this.
The big picture: Gains in per-query efficiency come as overall AI use is rapidly expanding, and data center energy demand along with it.
* Estimates vary. For instance, a late 2024 DOE report projects that data centers could account for 6.7% to 12% of U.S. electricity use by 2028.
* Google, in a recent report, said its data center energy emissions fell by 12% in 2024.
* But the company's overall emissions were up 11% amid increases in greenhouse gases from its supply chain, including manufacturing and assembling AI computing hardware, and building data centers.
What we're watching: Goodman hopes the analysis of "all of the critical factors" will help the industry overall.
[6]
What's the environmental cost of an AI text prompt? Google says it has an answer.
Aimee Picchi is the associate managing editor for CBS MoneyWatch, where she covers business and personal finance. She previously worked at Bloomberg News and has written for national news outlets including USA Today and Consumer Reports. Amid growing concerns about the environmental impact of artificial intelligence, Google says it has calculated the energy required for its Gemini AI service: Sending a single text prompt consumes as much energy as watching television for nine seconds. The technology giant on Thursday unveiled a new methodology to measure the environmental impact of its AI models, including energy and water consumption as well as carbon emissions. AI tools have the potential to drive economic growth by boosting productivity and unlocking other efficiencies, economists say. By one estimate from Goldman Sachs, the tech is poised to increase global GDP by 7%, or $7 trillion, over 10 years. At the same time, scientists are flagging the outsized environmental impact of AI, which is not yet fully understood even as data centers require enormous amounts of electricity. "In order to improve the energy efficiency of AI, a clear and comprehensive understanding of AI's environmental footprint is important. To date, comprehensive data on the energy and environmental impact of AI inference has been limited," Ben Gomes, Google's senior vice president of learning and sustainability, said in a blog post Thursday. Aside from their electricity needs, AI data centers also require "a great deal of water ... to cool the hardware used for training, deploying, and fine-tuning generative AI models, which can strain municipal water supplies and disrupt local ecosystems," research from MIT shows. "The increasing number of generative AI applications has also spurred demand for high-performance computing hardware, adding indirect environmental impacts from its manufacture and transport." 
Some new data centers require between 100 and 1,000 megawatts of power, roughly equivalent to powering 80,000 to 800,000 homes, according to an April GAO report. For now, however, there are no regulations that require corporations to disclose how much energy or water their AI tools consume. Google said in a technical paper released Thursday by its AI energy and emissions researchers that as adoption of AI tools rises, "so does the need to understand and mitigate the environmental impact of AI serving." Google's new paper on the environmental impact of its own AI tools aims to set a standard for measuring the energy and water consumption as well as carbon emissions of various AI models, the company said. A typical Gemini text query uses 0.24 watt-hours (Wh) of energy, emits 0.03 grams of carbon dioxide equivalent (gCO2e), and consumes 0.26 milliliters -- or about five drops -- of water. By comparison, the average ChatGPT query uses 0.34 Wh and about one fifteenth of a teaspoon of water, Sam Altman, CEO of ChatGPT-maker OpenAI, has written. Google also outlined the progress it has made in reducing the environmental impact of its Gemini platform. Over a recent 12-month period, the energy consumption and carbon footprint of the median Gemini text prompt decreased by factors of 33x and 44x, respectively, it said. The quality of Gemini's responses also improved over the same time period, the company said.
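The GAO's homes comparison implies a per-household power figure that is easy to back out, and both ends of the quoted range use the same one:

```python
# Back out the per-home power draw implied by the GAO comparison:
# 100 MW ~ 80,000 homes and 1,000 MW ~ 800,000 homes.
kw_per_home_low = 100_000 / 80_000        # 100 MW in kW / 80k homes
kw_per_home_high = 1_000_000 / 800_000    # 1,000 MW in kW / 800k homes
# Both ends imply ~1.25 kW of average household demand.
```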
[7]
Google did the math on AI's energy footprint
Ever wonder how much energy it takes when you ask an AI to draft an email, plan a vacation, or role-play as a friend? Google just shed some light: The company calculated the energy, water, and carbon emissions tied to text prompts on its Gemini apps. The energy use for the median text prompt is far less than some previous estimates: Google found that it uses just 0.24 watt-hours, or the equivalent of watching TV for less than nine seconds. Other models might be wildly different: One estimate of OpenAI's GPT-5 model suggested that it might use as much as 40 watt-hours per prompt. (That estimate, unlike Google's calculations, wasn't based on internal data.) Google laid out a detailed methodology for measuring the environmental impact of getting answers from its AI. For energy, that includes the electricity burned as the system converts your text prompt into a form it can process, calculates all the probabilities for which word could come next, and decodes the result into readable text. It also includes the energy for cooling and other infrastructure needed at a data center, and for keeping idle machines running in case demand spikes. The carbon footprint of the median prompt was 0.03 grams of CO2, based on the energy mix for each grid and Google's own investments in renewable energy. Each prompt also uses 0.26 milliliters, or about five drops, of water.
Google releases the first comprehensive report on the energy usage of its Gemini AI model, providing unprecedented transparency in the tech industry and sparking discussions about AI's environmental impact.
In an industry first, Google has released a comprehensive report detailing the energy consumption, carbon emissions, and water usage of its Gemini AI model 1. This unprecedented move provides researchers and analysts with the long-awaited insight into the environmental impact of large-scale AI operations.
Source: CNET
Google's approach to measuring AI's impact is notably thorough, encompassing factors often overlooked in previous estimates. The company's methodology includes:
* Full system dynamic power, reflecting how chips are actually utilized at scale rather than theoretical maximums
* Energy used by the host machine's CPU and RAM, not just the AI accelerators
* Idle machines kept on standby to handle usage spikes and failures
* Data center overhead such as cooling and power conversion
* Water consumed for cooling
This holistic view provides a more accurate representation of AI's true environmental footprint 2.
According to Google's report, the median Gemini text prompt:
* Uses 0.24 watt-hours (Wh) of energy
* Emits 0.03 grams of carbon dioxide equivalent (gCO2e)
* Consumes 0.26 milliliters (about five drops) of water
These figures are equivalent to watching television for less than nine seconds 3.
Google reports significant progress in reducing the environmental impact of its AI operations over a recent 12-month period:
* The energy consumption of the median Gemini text prompt dropped by 33x
* Its total carbon footprint dropped by 44x, while response quality improved
These improvements are attributed to more efficient model architectures, optimized data centers, and custom hardware 2.
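Taken at face value, the quoted 33x and 44x reductions imply rough year-ago per-prompt figures -- a back-of-envelope inference, not numbers Google published directly:

```python
# Infer approximate year-ago per-prompt figures from today's medians
# and the reported reduction factors (an inference, not reported data).
ENERGY_NOW_WH = 0.24
CO2_NOW_G = 0.03
ENERGY_FACTOR = 33
CO2_FACTOR = 44

energy_then_wh = round(ENERGY_NOW_WH * ENERGY_FACTOR, 2)  # ~7.92 Wh
co2_then_g = round(CO2_NOW_G * CO2_FACTOR, 2)             # ~1.32 g
```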
Source: PCWorld
The release of this data is expected to have far-reaching effects on the AI industry:
* Researchers and analysts gain the kind of scale-level data that only a major operator could produce
* Other AI companies may face pressure to publish comparable per-prompt figures
* Shared metrics, especially once independently vetted, could create industry standards and competitive incentives around efficiency
Jeff Dean, Google's chief scientist, emphasized the company's commitment to comprehensive reporting, stating, "We wanted to be quite comprehensive in all the things we included" 1.
While the per-prompt energy usage appears minimal, it's crucial to consider the scale of AI operations. With an estimated 350 million monthly Gemini users as of March, the cumulative impact remains significant 3.
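That cumulative scale can be illustrated with a hypothetical aggregate; the per-user prompt rate below is an assumption for illustration, not a reported figure:

```python
# Illustrative scale-up: if Gemini's ~350M monthly users each sent an
# assumed 10 prompts per day at 0.24 Wh per prompt, the daily total is:
USERS = 350_000_000
PROMPTS_PER_USER_PER_DAY = 10   # assumption, not a reported number
WH_PER_PROMPT = 0.24

daily_mwh = USERS * PROMPTS_PER_USER_PER_DAY * WH_PER_PROMPT / 1e6
# ~840 MWh per day under these assumptions -- small per prompt,
# substantial in aggregate.
```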
Source: Fast Company
Google's report has set a new benchmark for the industry, potentially prompting other major AI players to follow suit. The company plans to continue improving efficiency through various approaches, including:
* Maximizing hardware performance with custom-built TPUs
* Hybrid reasoning and distillation, in which larger models teach smaller ones
* Procuring clean energy and replenishing the freshwater used for cooling
As AI becomes increasingly integrated into various sectors of the economy, understanding and mitigating its environmental impact will remain a critical focus for developers, policymakers, and users alike 4.
Summarized by
Navi