ChatGPT's Energy Consumption: New Study Challenges Previous Estimates

A recent study by Epoch AI suggests that ChatGPT's energy consumption may be significantly lower than previously thought, potentially on par with a Google search.

ChatGPT's Energy Footprint: A Reassessment

A new study by Epoch AI, a nonprofit AI research institute, has challenged the widely held belief about ChatGPT's energy consumption. The analysis suggests that OpenAI's chatbot may be significantly less power-hungry than previously assumed, potentially consuming about as much energy as a Google search [1].

Revised Energy Estimates

According to Joshua You, a data analyst at Epoch AI, the average ChatGPT query consumes approximately 0.3 watt-hours of electricity. This figure is dramatically lower than the commonly cited estimate of 3 watt-hours, which was based on older research and assumptions [1]. The new estimate puts ChatGPT's energy use roughly on par with a Google search, which is commonly estimated at about 0.0003 kWh (0.3 watt-hours) per query [2].
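To make the comparison concrete, the sketch below simply converts the quoted figures into common units and shows how a per-query estimate scales with volume. The per-query numbers are the ones reported above; the one-billion-queries-per-day figure is purely an illustrative assumption, not a statistic from either source.

```python
# Unit comparison of the per-query estimates quoted above.
# Only QUERIES_PER_DAY is a made-up illustrative number.

OLD_CHATGPT_WH = 3.0        # commonly cited older estimate, watt-hours per query
NEW_CHATGPT_WH = 0.3        # Epoch AI's revised estimate, watt-hours per query
GOOGLE_SEARCH_KWH = 0.0003  # widely quoted figure for one Google search

google_search_wh = GOOGLE_SEARCH_KWH * 1000  # 0.0003 kWh == 0.3 Wh

print(f"Old ChatGPT estimate : {OLD_CHATGPT_WH:.1f} Wh/query")
print(f"New ChatGPT estimate : {NEW_CHATGPT_WH:.1f} Wh/query")
print(f"Google search        : {google_search_wh:.1f} Wh/query")

# Hypothetical scale-up: at 1 billion queries per day, the revised estimate
# implies about 300 MWh/day, versus roughly 3,000 MWh/day under the old figure.
QUERIES_PER_DAY = 1_000_000_000  # assumption for illustration only
print(f"Daily energy (new)   : {NEW_CHATGPT_WH * QUERIES_PER_DAY / 1e6:,.0f} MWh")
print(f"Daily energy (old)   : {OLD_CHATGPT_WH * QUERIES_PER_DAY / 1e6:,.0f} MWh")
```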

Factors Influencing the New Estimate

The revised estimate takes into account several factors:

  1. More realistic assumptions about the number of output tokens in typical chatbot usage.
  2. An assumption that servers run at roughly 70% of peak power.
  3. Use of newer, more efficient chips, such as Nvidia's H100 [2].

However, You acknowledges that considerable uncertainty remains around these assumptions, and that queries with very long inputs or outputs could push consumption substantially higher, to between 2.5 and 40 watt-hours [2].
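The factors above hint at how such an estimate might be assembled. The sketch below combines GPU power, utilization, and generation time in the most naive way: the 70% utilization figure comes from the list above and 700 W is the published board power of the H100 SXM, but everything else (one GPU per request, the example generation times) is an illustrative assumption, not Epoch AI's actual methodology.

```python
# Naive per-query energy model: power draw x time, converted to watt-hours.
# 700 W is the published board power of the H100 SXM; 70% utilization mirrors
# the factor listed above. One GPU per request and the example generation
# times are illustrative assumptions, not figures from the Epoch AI study.

H100_PEAK_WATTS = 700.0
UTILIZATION = 0.70
GPUS_PER_QUERY = 1

def energy_per_query_wh(generation_seconds: float) -> float:
    """Watt-hours consumed by one query under the assumptions above."""
    watts = H100_PEAK_WATTS * UTILIZATION * GPUS_PER_QUERY
    return watts * generation_seconds / 3600.0

# A short reply generated in ~2 seconds lands near the revised 0.3 Wh figure;
# a long response taking a minute or more moves toward the 2.5-40 Wh range.
print(f"{energy_per_query_wh(2):.2f} Wh")   # ~0.27 Wh
print(f"{energy_per_query_wh(60):.2f} Wh")  # ~8.17 Wh
```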

Implications for AI's Environmental Impact

This reassessment of ChatGPT's energy consumption comes at a time of intense debate about AI's environmental impact. While the new estimate suggests a lower individual query footprint, the scale of AI deployment is expected to drive significant infrastructure expansion [1].

Future Considerations

Despite the potentially lower energy consumption per query, experts anticipate that AI power demands will rise:

  1. Advanced AI models may require more energy for training and operation.
  2. The scale of AI deployment is expected to increase dramatically.
  3. Reasoning models, which "think" for longer periods, may consume more power than current models [1].

Industry Response and Transparency

The AI industry, including OpenAI, is planning substantial investments in new data center projects. At the same time, there are calls for AI companies to be more transparent, so that more accurate energy consumption estimates can be produced [2].

Balancing Act: Energy Use vs. Potential Benefits

While energy consumption is a crucial consideration, it's important to view AI's impact holistically:

  1. AI could potentially lead to breakthroughs in energy production and efficiency.
  2. Increased productivity from AI use might offset some energy costs.
  3. The benefits of large-scale AI training may be plateauing, potentially reducing future energy demands for training [2].

As AI technology continues to evolve, ongoing research and transparency will be crucial in understanding and managing its energy footprint and overall environmental impact.
