Curated by THEOUTPOST
On Thu, 13 Feb, 12:03 AM UTC
2 Sources
[1]
ChatGPT may not be as power-hungry as once assumed | TechCrunch
ChatGPT, OpenAI's chatbot platform, may not be as power-hungry as once assumed. But its appetite largely depends on how ChatGPT is being used, and the AI models that are answering the queries, according to a new study.

A recent analysis by Epoch AI, a nonprofit AI research institute, attempted to calculate how much energy a typical ChatGPT query consumes. A commonly cited stat is that ChatGPT requires around 3 watt-hours of power to answer a single question, or 10 times as much as a Google search. Epoch believes that's an overestimate. Using OpenAI's latest default model for ChatGPT, GPT-4o, as a reference, Epoch found the average ChatGPT query consumes around 0.3 watt-hours -- less than many household appliances.

"The energy use is really not a big deal compared to using normal appliances or heating or cooling your home, or driving a car," Joshua You, the data analyst at Epoch who conducted the analysis, told TechCrunch.

AI's energy usage -- and its environmental impact, broadly speaking -- is the subject of contentious debate as AI companies look to rapidly expand their infrastructure footprints. Just last week, a group of over 100 organizations published an open letter calling on the AI industry and regulators to ensure that new AI data centers don't deplete natural resources or force utilities to rely on non-renewable sources of energy.

You told TechCrunch his analysis was spurred by what he characterized as outdated previous research. You pointed out, for example, that the author of the report that arrived at the 3-watt-hour estimate assumed OpenAI used older, less efficient chips to run its models.

"I've seen a lot of public discourse that correctly recognized that AI was going to consume a lot of energy in the coming years, but didn't really accurately describe the energy that was going to AI today," You said. "Also, some of my colleagues noticed that the most widely-reported estimate of 3 watt-hours per query was based on fairly old research, and based on some napkin math seemed to be too high."

Granted, Epoch's 0.3 watt-hours figure is an approximation as well; OpenAI hasn't published the details needed to make a precise calculation. The analysis also doesn't consider the additional energy costs incurred by ChatGPT features like image generation or input processing. You acknowledged that "long input" ChatGPT queries -- queries with long files attached, for instance -- likely consume more electricity upfront than a typical question.

You said he does expect baseline ChatGPT power consumption to rise, however. "[The] AI will get more advanced, training this AI will probably require much more energy, and this future AI may be used much more intensely -- handling much more tasks, and more complex tasks, than how people use ChatGPT today," You said.

While there have been remarkable breakthroughs in AI efficiency in recent months, the scale at which AI is being deployed is expected to drive enormous, power-hungry infrastructure expansion. In the next two years, AI data centers may need close to all of California's 2022 power capacity (68 GW), according to a Rand report. By 2030, training a frontier model could demand power output equivalent to that of eight nuclear reactors (8 GW), the report predicted. ChatGPT alone reaches an enormous -- and expanding -- number of people, making its server demands similarly massive. OpenAI, along with several investment partners, plans to spend billions of dollars on new AI data center projects over the next few years.
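Scale matters in both directions, though: at the individual level, Epoch's 0.3 watt-hour figure stays small even under heavy use. Here is a minimal back-of-envelope sketch; the queries-per-day and household baseline are illustrative assumptions of ours, not figures from the study:

```python
# Rough scale check using the article's 0.3 Wh/query figure.
# QUERIES_PER_DAY and the household baseline are illustrative assumptions.
WH_PER_QUERY = 0.3      # Epoch AI's estimated energy per GPT-4o query
QUERIES_PER_DAY = 100   # assumed heavy user
DAYS_PER_YEAR = 365

yearly_kwh = WH_PER_QUERY * QUERIES_PER_DAY * DAYS_PER_YEAR / 1000
print(f"Heavy user: ~{yearly_kwh:.0f} kWh/year")  # ~11 kWh/year

# A typical US household uses on the order of 10,000 kWh/year, so even
# heavy chat use is a small fraction of overall residential consumption.
```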
OpenAI's attention -- along with the rest of the AI industry's -- is also shifting to so-called reasoning models, which are generally more capable in terms of the tasks they can accomplish but require more computing to run. As opposed to models like GPT-4o, which respond to queries nearly instantaneously, reasoning models "think" for seconds to minutes before answering, a process that sucks up more computing -- and thus power.

"Reasoning models will increasingly take on tasks that older models can't, and generate more [data] to do so, and both require more data centers," You said.

OpenAI has begun to release more power-efficient reasoning models like o3-mini. But it seems unlikely, at least at this juncture, that the efficiency gains will offset the increased power demands from reasoning models' "thinking" process and growing AI usage around the world.

You suggested that people worried about their AI energy footprint use apps such as ChatGPT infrequently, or select models that minimize the computing necessary -- to the extent that's realistic. "You could try using smaller AI models like [OpenAI's] GPT-4o-mini," You said, "and sparingly use them in a way that requires processing or generating a ton of data."
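To illustrate why reasoning models draw more power, here is a hedged sketch; the per-token energy and token counts below are assumptions chosen to be consistent with the figures above, not measured values:

```python
# Illustrative only: if per-token energy is roughly constant, a reasoning
# model that generates hidden "thinking" tokens multiplies per-query energy.
ENERGY_PER_TOKEN_WH = 0.0006  # assumed: ~0.3 Wh spread over ~500 output tokens
ANSWER_TOKENS = 500           # assumed visible response length
THINKING_TOKENS = 4000        # assumed hidden chain-of-thought length

standard_wh = ANSWER_TOKENS * ENERGY_PER_TOKEN_WH
reasoning_wh = (ANSWER_TOKENS + THINKING_TOKENS) * ENERGY_PER_TOKEN_WH
print(f"standard: {standard_wh:.2f} Wh, reasoning: {reasoning_wh:.2f} Wh")
# standard: 0.30 Wh, reasoning: 2.70 Wh
```

With these placeholder numbers, a single reasoning query lands an order of magnitude above a standard one, which is the dynamic efficiency gains would have to outrun.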
[2]
New research says ChatGPT likely consumes '10 times less' energy than we initially thought, making it about the same as Google search
It's easy to slate AI in all its manifestations -- trust me, I should know, I do so often enough -- but some recent research from Epoch AI (via TechCrunch) suggests that we might be a little hasty if we're trashing its energy use (yes, that's the same Epoch AI that recently dropped a new, difficult math benchmark for AI).

According to Epoch AI, ChatGPT likely consumes just 0.3 Wh of electricity, "10 times less" than the popular older estimate which claimed about 3 Wh. Given a Google search amounts to 0.0003 kWh (that is, 0.3 Wh) of energy consumption per search, and based on the older 3 Wh estimate, two years ago Alphabet Chairman John Hennessy said that an LLM exchange would probably cost 10 times more than a Google search in energy. If Epoch AI's new estimate is correct, a likely GPT-4o interaction actually consumes about the same amount of energy as a Google search.

Server energy use isn't something that tends to cross most people's minds while using a cloud service -- the 'cloud' is so far removed from our homes that it seems a little ethereal. I know I often forget there are any additional energy costs at all, other than what my own device consumes, when using ChatGPT.

Thankfully I'm not a mover or a shaker in the world of energy policy, because of course LLM interactions consume energy. Let's not forget how LLMs work: they undertake shedloads of data training (consuming shedloads of energy), then once they've been trained and are interacting, they still need to pull from gigantic models to process even simple instructions or queries. That's the nature of the beast. And that beast needs feeding energy to keep up and running.

It's just that apparently that's less energy than we might have originally thought on a per-interaction basis: "For context, 0.3 watt-hours is less than the amount of electricity that an LED lightbulb or a laptop consumes in a few minutes. And even for a heavy chat user, the energy cost of ChatGPT will be a small fraction of the overall electricity consumption of a developed-country resident."

Epoch AI explains that there are a few differences between how it's worked out this new estimate and how the original 3 Wh estimate was calculated. Essentially, the new estimate uses a "more realistic assumption for the number of output tokens in a typical chatbot usage", whereas the original estimate assumed output tokens equivalent to about 1,500 words on average (tokens are essentially units of text such as a word). The new one also assumes just 70% of peak server power and computation being performed on a newer chip (Nvidia's H100 rather than an A100); a rough sketch of how those ingredients combine appears after this article.

All these changes -- which seem reasonable to my eyes and ears -- paint a picture of a much less power-hungry ChatGPT. However, Epoch AI points out that "there is a lot of uncertainty here around both parameter count, utilization, and other factors". Longer queries, for instance, could increase energy consumption "substantially to 2.5 to 40 watt-hours", it says.

It's a complicated story, but should we expect any less? In fact, let me muddy the waters a little more for us. We also need to consider the benefits of AI for energy consumption. A productive technology doesn't exist in a vacuum, after all. For instance, use of AI such as ChatGPT could help bring about breakthroughs in energy production that decrease energy use across the board.
And use of AI could increase productivity in areas that reduce energy in other ways; for instance, a manual task that would have required you to keep your computer turned on and consuming power for 10 minutes might be done in one minute with the help of AI.

On the other hand, there's the cost of AI training to consider. But on the peculiar third hand -- where did that come from? -- the benefits of LLM training are starting to plateau, which means there might be less large-scale data training going forwards.

Plus, aren't there always additional variables? With Google search, for instance, there's the presumed cost of constant web indexing and so on, not just the search interaction and results page generation.

In other words, it's a complicated picture, and as with all technologies, AI probably shouldn't be looked at in a vacuum. Apart from its place on the mathematician's paper, energy consumption is never an isolated variable. Ultimately, what we care about is the health and productivity of the entire system: the economy, society, and so on. As always, such debates require consideration of multi-multi-variate equations in a cost-benefit analysis, and it's difficult to get the full picture, especially when much of that picture depends on an uncertain future.

Which somewhat defines the march of capitalism, does it not? The back and forth 'but actually' that characterises these discussions gets trampled under the boots of the technology, which marches ahead regardless.

And ultimately, while this new 0.3 Wh estimate is certainly a pleasant development, it's still just an estimate, and Epoch AI is very clear about this: "More transparency from OpenAI and other major AI companies would help produce a better estimate." More transparency would be nice, but I won't hold my breath.
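Epoch AI's stated ingredients (a realistic output-token count, roughly 70% of peak server power, H100-class hardware) can be combined into the quick back-of-envelope check promised above. In the Python sketch below, the H100 board power comes from Nvidia's published specs, while the throughput and token counts are illustrative assumptions of ours, not Epoch AI's exact inputs:

```python
# Epoch-style back-of-envelope: energy ~= power draw x generation time.
H100_PEAK_W = 700       # Nvidia H100 SXM board power (published spec)
POWER_FRACTION = 0.70   # article: ~70% of peak server power assumed
TOKENS_PER_SEC = 300    # assumed per-GPU decode throughput (illustrative)
OUTPUT_TOKENS = 500     # assumed typical response length (illustrative)

generation_s = OUTPUT_TOKENS / TOKENS_PER_SEC          # ~1.7 s
energy_wh = H100_PEAK_W * POWER_FRACTION * generation_s / 3600
print(f"~{energy_wh:.2f} Wh per query")                # ~0.23 Wh
```

With these placeholder numbers the result lands near Epoch AI's 0.3 Wh figure, and it also shows why the estimate is so sensitive: double the tokens or halve the throughput and the per-query figure scales accordingly, which is how longer queries climb toward that 2.5 to 40 watt-hour range.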
A recent study by Epoch AI suggests that ChatGPT's energy consumption may be significantly lower than previously thought, potentially on par with a Google search.
A new study by Epoch AI, a nonprofit AI research institute, has challenged the widely held belief about ChatGPT's energy consumption. The analysis suggests that OpenAI's chatbot may be significantly less power-hungry than previously assumed, potentially consuming as little energy as a Google search [1].
According to Joshua You, a data analyst at Epoch AI, the average ChatGPT query consumes approximately 0.3 watt-hours of electricity. This figure is dramatically lower than the commonly cited estimate of 3 watt-hours, which was based on older research and assumptions [1]. The new estimate puts ChatGPT's energy use on par with a Google search, which consumes about 0.0003 kWh (0.3 watt-hours) per query [2].
The revised estimate takes into account several factors:
- A more realistic assumption for the number of output tokens in a typical chatbot exchange, rather than the roughly 1,500 words assumed by the older estimate [2]
- Servers running at about 70% of peak power [2]
- Computation performed on newer hardware (Nvidia's H100 rather than the A100) [2]
However, You acknowledges that there is still uncertainty surrounding various factors, and longer queries could substantially increase energy consumption to between 2.5 and 40 watt-hours [2].
This reassessment of ChatGPT's energy consumption comes at a time of intense debate about AI's environmental impact. While the new estimate suggests a lower individual query footprint, the scale of AI deployment is expected to drive significant infrastructure expansion [1].
Despite the potentially lower energy consumption per query, experts anticipate that AI power demands will rise:
- In the next two years, AI data centers may need close to all of California's 2022 power capacity (68 GW), according to a Rand report [1]
- By 2030, training a frontier model could demand power output equivalent to eight nuclear reactors (8 GW) [1]
- Reasoning models, which "think" before answering, require more computing and therefore more power than near-instant models like GPT-4o [1]
The AI industry, including OpenAI, is planning substantial investments in new data center projects. However, there are calls for greater transparency from AI companies to produce more accurate energy consumption estimates [2].
While energy consumption is a crucial consideration, it's important to view AI's impact holistically:
- AI could help drive breakthroughs in energy production that decrease energy use across the board [2]
- Productivity gains may reduce energy use elsewhere, for example by shortening tasks that would otherwise keep a computer running [2]
- Training costs remain significant, though plateauing returns on large-scale training could temper future demand [2]
As AI technology continues to evolve, ongoing research and transparency will be crucial in understanding and managing its energy footprint and overall environmental impact.
As AI technology advances, concerns grow over its environmental impact. ChatGPT and other AI models are consuming enormous amounts of energy and water, raising questions about sustainability and resource management in the tech industry.
3 Sources
Chinese startup DeepSeek claims to have created an AI model that matches the performance of established rivals at a fraction of the cost and carbon footprint. However, experts warn that increased efficiency might lead to higher overall energy consumption due to the Jevons paradox.
5 Sources
The rapid advancement of artificial intelligence is driving unprecedented electricity demands, raising concerns about sustainability and the need for innovative solutions in the tech industry.
4 Sources
The rapid growth of AI technology has raised concerns about its environmental sustainability. This story explores the energy consumption of AI models, their carbon footprint, and potential solutions for a greener AI industry.
2 Sources
As AI's power consumption skyrockets, researchers and tech companies are exploring ways to make AI more energy-efficient while harnessing its potential to solve energy and climate challenges.
7 Sources