Curated by THEOUTPOST
On Fri, 25 Apr, 12:02 AM UTC
3 Sources
[1]
This tool estimates how much electricity your chatbot messages consume | TechCrunch
Ever wonder how much electricity you're using when you prompt, or thank, an AI model? Hugging Face engineer Julien Delavande did, so he built a tool to help arrive at the answer.

AI models consume energy each time they're run. They're run on GPUs and specialized chips that need a lot of power to carry out the associated computational workloads at scale. It's not easy to pin down model power consumption, but it's widely expected that growing usage of AI technologies will drive electricity needs to new heights in the next couple of years. The demand for more power to fuel AI has led some companies to pursue environmentally unfriendly strategies. Tools like Delavande's aim to bring attention to this, and perhaps give some AI users pause.

"Even small energy savings can scale up across millions of queries -- model choice or output length can lead to major environmental impact," Delavande and the tool's other creators wrote in a statement.

"⚡️ Ever wondered how much energy is used every time you send a message to ChatGPT? We just built a version of Chat UI that shows how much energy your message consumes -- in real time. Should all chatbots display this? Details below" -- Delavande Julien (@juliendelavande), April 22, 2025

Delavande's tool is designed to work with Chat UI, an open-source front-end for models like Meta's Llama 3.3 70B and Google's Gemma 3. The tool estimates the energy consumption of messages sent to and from a model in real time, reporting consumption in watt-hours or joules. It also compares model energy usage to that of common household appliances, like microwaves and LEDs.

According to the tool, asking Llama 3.3 70B to write a typical email uses approximately 0.1841 watt-hours -- equivalent to running a microwave for 0.12 seconds or using a toaster for 0.02 seconds.

It's worth remembering that the tool's estimates are only that -- estimates. Delavande makes no claim that they're incredibly precise. Still, they serve as a reminder that everything -- chatbots included -- has a cost.

"With projects like the AI energy score and broader research on AI's energy footprint, we're pushing for transparency in the open source community. One day, energy usage could be as visible as nutrition labels on food!" Delavande and his co-creators wrote.
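To make the appliance comparison concrete, here is a minimal sketch in Python of the unit conversion behind "X seconds of microwave use": energy in watt-hours converted to joules, then divided by an appliance's power draw. This is not the tool's code, and the appliance wattages below are illustrative assumptions, so the output will not match the article's exact figures.

```python
# Rough sketch (not the Chat UI tool's actual code): convert a per-message
# energy estimate in watt-hours into appliance-equivalent run times, the
# kind of comparison the tool displays. Appliance wattages are assumptions.

WATT_HOURS_PER_EMAIL = 0.1841  # figure reported above for Llama 3.3 70B

APPLIANCE_WATTS = {
    "microwave": 1100,  # assumed typical microwave draw
    "toaster": 1000,    # assumed typical toaster draw
    "LED bulb": 10,     # assumed typical LED bulb draw
}

def equivalent_seconds(watt_hours: float, appliance_watts: float) -> float:
    """Seconds the appliance could run on the same amount of energy."""
    joules = watt_hours * 3600       # 1 Wh = 3600 J
    return joules / appliance_watts  # watts = joules/second

for name, watts in APPLIANCE_WATTS.items():
    secs = equivalent_seconds(WATT_HOURS_PER_EMAIL, watts)
    print(f"{WATT_HOURS_PER_EMAIL} Wh is roughly {secs:.2f} s of {name} use")
```

The exact seconds depend entirely on the wattage assumed for each appliance, which is why the tool's reported comparisons differ from this sketch's output.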
[2]
How much energy does a single chatbot prompt use? This AI tool can show you
AI systems require a lot of energy to function, but no one has exact numbers, especially not for individual chatbot queries. To address this, an engineer at Hugging Face built a tool to try to find out.

The language surrounding AI infrastructure, much of which emphasizes "the cloud" and other air-themed metaphors, can obscure the fact that it relies on energy-hungry computers. To run complex computations quickly, AI systems require powerful chips, multiple GPUs, and expansive data centers, all of which consume power when you ask ChatGPT a question. This is part of why free-tier access to many chatbots comes with usage limits: electricity costs make computing expensive for the hosting company.

To demystify some of this, Hugging Face engineer Julien Delavande built an AI chat interface that shows real-time energy use estimates for your conversations. It compares how much energy various models, tasks, and requests use -- for example, a prompt that requires reasoning is likely to use more energy than a simple fact-finding query. In addition to watt-hours and joules, the tool shows usage in more accessible metrics, such as the percentage of a phone charge or driving time, using data from the Environmental Protection Agency (EPA).

When I asked Chat UI Energy about the weather in New York City, the first comparison it showed me was for a phone charge (my query used about 9.5%). When I clicked on that estimate, the tool toggled through other equivalent comparisons, including 45 minutes of LED bulb use, 1.21 seconds of microwave use, and 0.15 seconds of toaster energy. As you continue chatting, the bot shows the total energy usage and time of the conversation at the bottom of the chat window.

Though my query was very simple, it relied on access to the internet, which the bot doesn't have. That may be why it took 90 seconds (and more energy than expected) to return a response. Still, even as an estimate, 45 minutes of LED bulb use seems anecdotally high, which puts the energy used by much more complex, multi-step prompts into perspective.

Only AI companies know how much energy their systems really use, but studies estimate that, based on demand trends, it's only increasing. A 2024 International Energy Agency report predicts global electricity demand will grow by around 3.4% annually through 2026 -- a faster rate than usual -- driven in part by "a notable expansion" of data centers. A Berkeley Lab report also found data center energy use to be accelerating, with an expected annual growth rate of "13% to 27% between 2023 and 2028."

The release emphasizes the distinction between open-source platforms like Hugging Face and more opaque AI companies.

"With projects like the AI Energy Score and broader research on AI's energy footprint, we're pushing for transparency in the open-source community," the chat's creators said in the announcement. "One day, energy usage could be as visible as nutrition labels on food!"

You can try the chatbot here and toggle through several open-source models, including Google Gemma 3, Meta's Llama 3.3, and Mistral Nemo Instruct.
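The article doesn't spell out how the per-message estimate is produced, but a common back-of-the-envelope approach is to multiply GPU power draw by generation time and then express the result in everyday terms, such as a share of a phone charge. The sketch below does exactly that; the GPU count, wattage, generation time, and phone-battery capacity are all illustrative assumptions, not figures from the tool.

```python
# Back-of-the-envelope sketch (an assumption, not the Chat UI tool's
# published method): estimate per-message energy as GPU power draw times
# generation time, then express it as a share of a phone charge.

ASSUMED_GPU_POWER_W = 400        # illustrative draw for one data-center GPU
ASSUMED_NUM_GPUS = 4             # illustrative: large models span several GPUs
ASSUMED_GENERATION_SECONDS = 10  # illustrative time to produce a reply

PHONE_CHARGE_WH = 19  # assumed ~5,000 mAh battery at 3.85 V

energy_wh = ASSUMED_GPU_POWER_W * ASSUMED_NUM_GPUS * ASSUMED_GENERATION_SECONDS / 3600
print(f"Estimated energy: {energy_wh:.3f} Wh")
print(f"About {100 * energy_wh / PHONE_CHARGE_WH:.1f}% of a phone charge")
```

Even rough numbers like these show why a long, slow response (such as the 90-second reply described above) registers as noticeably more energy than a quick one.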
[3]
This Tool Tells You How Much Energy Your AI Chatbot Uses
Thanking your AI chatbot when it provides a response to a query may not require much energy on its own, but the cost of your interactions will add up over time -- and a new tool from Hugging Face can tell you approximately how much.

The ChatUI energy interface estimates the energy consumption involved in messaging with an AI model in real time, with comparisons to common appliances like LED light bulbs and phone chargers. You can type in any query or use one of the suggested inputs to generate a response along with the corresponding energy requirement.

For example, a "professional email" took an AI just over 25 seconds to create and required 0.5 watt-hours, the equivalent of 2.67% of a phone charge. A 90-second script for testing transcription software required 1.4 watt-hours -- 7.37% of a phone charge, 22 minutes of an LED bulb, or 0.6 seconds of microwave use. (Responding to my "thank you" equaled 0.2% of a phone charge.) Note that ChatUI is approximating, not providing exact measurements. The tool can run on various models, including Meta's Llama 3.3 70B and Google's Gemma 3.

According to estimates from the International Energy Agency (IEA), a single ChatGPT request requires nearly 10 times the electricity of a typical Google search -- about 2.9 watt-hours versus 0.3 watt-hours. If ChatGPT were used for all 9 billion daily searches, that would require nearly 10 terawatt-hours of additional electricity per year, roughly the annual electricity consumption of 1.5 million European Union residents.

AI's environmental impact comes in large part from the power and water demands of running data centers. The IEA expects AI's global electricity consumption in 2026 to be ten times what it was in 2023, and by 2027 the sector's water requirements could exceed the entire annual usage of Denmark.
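To show how the "nearly 10 terawatt-hours per year" figure follows from the per-query numbers cited above, here is the arithmetic as a short Python sketch. The per-query values come from the article; everything else is straightforward unit conversion.

```python
# Reproducing the IEA comparison cited above: ChatGPT vs. Google search
# energy per query, scaled to 9 billion searches per day for a year.

CHATGPT_WH_PER_QUERY = 2.9
GOOGLE_WH_PER_QUERY = 0.3
SEARCHES_PER_DAY = 9e9
DAYS_PER_YEAR = 365

extra_wh_per_day = (CHATGPT_WH_PER_QUERY - GOOGLE_WH_PER_QUERY) * SEARCHES_PER_DAY
extra_twh_per_year = extra_wh_per_day * DAYS_PER_YEAR / 1e12  # Wh -> TWh

print(f"Additional electricity: ~{extra_twh_per_year:.1f} TWh per year")
```

The difference works out to roughly 8.5 TWh per year of additional electricity; counting the full 2.9 Wh per query gives about 9.5 TWh per year, which is where the "nearly 10 terawatt-hours" figure comes from.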
Hugging Face engineer Julien Delavande creates a tool to estimate the electricity consumption of AI chatbot interactions, raising awareness about the environmental impact of AI usage.
Hugging Face engineer Julien Delavande has developed a tool that estimates the electricity consumption of AI chatbot interactions in real time. The release comes as the AI industry faces increasing scrutiny over its environmental impact and energy demands 1.
The tool, designed to work with Chat UI, an open-source front-end for models like Meta's Llama 3.3 70B and Google's Gemma 3, provides users with immediate feedback on the energy costs of their AI interactions. It reports consumption in Watt-hours or Joules and compares model energy usage to common household appliances 1.
According to the tool's estimates, asking Llama 3.3 70B to write a typical email uses approximately 0.1841 watt-hours, equivalent to running a microwave for 0.12 seconds 1, while a longer request, such as a 90-second transcription-test script, can require around 1.4 watt-hours, or about 7% of a phone charge 3.
The tool's release comes amid growing concerns about AI's energy footprint. The International Energy Agency (IEA) estimates that a single ChatGPT request uses nearly 10 times the electricity of a typical Google search, and that running ChatGPT for all 9 billion daily searches would require nearly 10 terawatt-hours of additional electricity per year 3. The agency also expects data center expansion to help push global electricity demand up by around 3.4% annually through 2026 2.
This tool aims to bring transparency to AI energy consumption, potentially influencing both users and developers. Delavande and his co-creators envision a future where "energy usage could be as visible as nutrition labels on food" 1.
The initiative aligns with broader research on AI's energy footprint and could lead to more energy-efficient AI development practices. It also highlights the distinction between open-source platforms like Hugging Face and more opaque AI companies in terms of transparency 2.
As AI usage continues to grow, tools like this may become crucial in managing and understanding the technology's environmental impact, potentially influencing policy decisions and consumer behavior in the rapidly evolving AI landscape.