ChatGPT's Energy Consumption: New Study Challenges Previous Estimates


A recent study by Epoch AI suggests that ChatGPT's energy consumption may be significantly lower than previously thought, potentially on par with a Google search.

ChatGPT's Energy Footprint: A Reassessment

A new study by Epoch AI, a nonprofit AI research institute, challenges widely held beliefs about ChatGPT's energy consumption. The analysis suggests that OpenAI's chatbot may be significantly less power-hungry than previously assumed, potentially consuming about as much energy as a Google search.[1]

Revised Energy Estimates

According to Joshua You, a data analyst at Epoch AI, the average ChatGPT query consumes approximately 0.3 watt-hours of electricity. This figure is dramatically lower than the commonly cited estimate of 3 watt-hours, which was based on older research and assumptions.[1] The new estimate puts ChatGPT's energy use on par with a Google search, which consumes about 0.0003 kWh (0.3 watt-hours) per query.[2]
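The figures above can be checked with a quick unit conversion: a sketch in Python, using only the per-query numbers as reported in the article.

```python
import math

# Per-query energy figures as cited in the article.
chatgpt_new_wh = 0.3         # Epoch AI's revised estimate, watt-hours per query
chatgpt_old_wh = 3.0         # commonly cited older estimate, watt-hours per query
google_search_kwh = 0.0003   # cited figure for one Google search, kilowatt-hours

# Convert the Google figure to watt-hours: 0.0003 kWh x 1000 = 0.3 Wh,
# the same as the revised ChatGPT estimate.
google_search_wh = google_search_kwh * 1000
print(math.isclose(google_search_wh, chatgpt_new_wh))  # True

# The older estimate was an order of magnitude higher.
print(round(chatgpt_old_wh / chatgpt_new_wh))          # 10
```

This is why "on par with a Google search" and "ten times lower than previously thought" describe the same revision.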

Factors Influencing the New Estimate

The revised estimate takes into account several factors:

  1. More realistic assumptions about the number of output tokens in typical chatbot usage.
  2. An assumption that servers operate at about 70% of peak power.
  3. Use of newer, more efficient chips such as Nvidia's H100.[2]

However, You acknowledges that uncertainty remains around various factors, and that longer queries could increase energy consumption substantially, to between 2.5 and 40 watt-hours.[2]

Implications for AI's Environmental Impact

This reassessment of ChatGPT's energy consumption comes at a time of intense debate about AI's environmental impact. While the new estimate suggests a lower individual query footprint, the scale of AI deployment is expected to drive significant infrastructure expansion.[1]

Future Considerations

Despite the potentially lower energy consumption per query, experts anticipate that AI power demands will rise:

  1. Advanced AI models may require more energy for training and operation.
  2. The scale of AI deployment is expected to increase dramatically.
  3. Reasoning models, which "think" for longer periods, may consume more power than current models.[1]

Industry Response and Transparency

The AI industry, including OpenAI, is planning substantial investments in new data center projects. At the same time, there are calls for greater transparency from AI companies so that more accurate energy consumption estimates can be produced.[2]

Balancing Act: Energy Use vs. Potential Benefits

While energy consumption is a crucial consideration, it's important to view AI's impact holistically:

  1. AI could potentially lead to breakthroughs in energy production and efficiency.
  2. Increased productivity from AI use might offset some energy costs.
  3. The benefits of large-scale AI training may be plateauing, potentially reducing future energy demands for training.[2]

As AI technology continues to evolve, ongoing research and transparency will be crucial in understanding and managing its energy footprint and overall environmental impact.

TheOutpost.ai

© 2025 Triveous Technologies Private Limited