8 Sources
[1]
Sam Altman claims an average ChatGPT query uses 'roughly one fifteenth of a teaspoon' of water
Jay Peters is a news editor covering technology, gaming, and more. He joined The Verge in 2019 after nearly two years at Techmeme. OpenAI CEO Sam Altman, in a blog post published Tuesday, says an average ChatGPT query uses about 0.000085 gallons of water, or "roughly one fifteenth of a teaspoon." He made the claim as part of a broader post on his predictions about how AI will change the world. "People are often curious about how much energy a ChatGPT query uses; the average query uses about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes," he says. He also argues that "the cost of intelligence should eventually converge to near the cost of electricity." OpenAI didn't immediately respond to a request for comment on how Altman arrived at those figures. AI companies have come under scrutiny over the energy costs of their technology. This year, for example, researchers forecast that AI could consume more power than Bitcoin mining by the end of the year. In an article last year, The Washington Post worked with researchers to determine that a 100-word email "generated by an AI chatbot using GPT-4" required "a little more than 1 bottle" of water. The publication also found that water usage can depend on where a datacenter is located.
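Altman's appliance analogies are easy to sanity-check with back-of-envelope arithmetic. The sketch below assumes typical appliance ratings (a ~1,200 W oven element and a ~10 W high-efficiency LED bulb); the blog post itself gives no wattages, so those two numbers are assumptions, not figures from the source:

```python
# Back-of-envelope check of Altman's appliance analogies.
# The appliance wattages below are assumptions, not figures from the post.
QUERY_WH = 0.34     # Altman's claimed energy per ChatGPT query, in watt-hours
OVEN_WATTS = 1200   # assumed oven element power
BULB_WATTS = 10     # assumed high-efficiency LED bulb power

oven_seconds = QUERY_WH / OVEN_WATTS * 3600  # watt-hours -> seconds of oven use
bulb_minutes = QUERY_WH / BULB_WATTS * 60    # watt-hours -> minutes of bulb use

print(f"Oven: {oven_seconds:.2f} s")    # ~1.0 s, "a little over one second"
print(f"Bulb: {bulb_minutes:.1f} min")  # ~2.0 min, "a couple of minutes"
```

At those assumed ratings the numbers do land where Altman says: about one second of oven time and about two minutes of bulb time per query.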
[2]
OpenAI's Sam Altman muses about superintelligence
OpenAI on Tuesday rolled out its o3-Pro model for ChatGPT Pro and Teams subscribers, slashed o3 pricing by 80 percent, and dropped a blog post from CEO Sam Altman teasing "intelligence too cheap to meter." "The average query uses about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency light bulb would use in a couple of minutes. It also uses about 0.000085 gallons of water; roughly one fifteenth of a teaspoon," Altman said in a post. This is in line with prior outside estimates. Epoch AI published a similar figure in February. The firm said, "a GPT-4o query consumes around 0.3 watt-hours for a typical text-based question, though this increases substantially to 2.5 to 40 watt-hours for queries with very long inputs." But looking at AI energy usage on an average query basis grossly oversimplifies concerns about the technology's environmental impact, given the massive number of queries users are entering - over a billion a day as of last December, according to the company. When the MIT Technology Review explored AI energy usage recently, the conclusion did not align with Altman's claim that "Intelligence too cheap to meter is well within grasp." Rather, the publication cited research from the Lawrence Berkeley National Laboratory estimating AI-specific purposes in data centers will consume between 165 and 326 terawatt-hours of energy in 2028 - enough to power 22% of all US households. OpenAI's o3 model isn't too cheap to meter, but due to an optimized inference stack, it's 80 percent less expensive than it used to be: Input: $2 per 1M tokens; Output: $8 per 1M tokens. But there are still many cheaper models. Overall, Altman's musings skew toward techno-optimism - surprise! He posits a flood of wondrous discoveries a decade hence arising from AI superintelligence, whatever that is. 
"Maybe we will go from solving high-energy physics one year to beginning space colonization the next year; or from a major materials science breakthrough one year to true high-bandwidth brain-computer interfaces the next year," he said. Maybe. Or maybe not. We note that fellow futurist Elon Musk, who once cautioned about releasing the AI demon, predicted in 2016 that humans would land on Mars by 2025. Tech leaders simply pay no cost for misprediction. But Altman has more thoughts to share. In the 2030s, intelligence and energy - ideas, and the ability to make ideas happen - are going to become wildly abundant "In the 2030s, intelligence and energy - ideas, and the ability to make ideas happen - are going to become wildly abundant," he opined. "These two have been the fundamental limiters on human progress for a long time; with abundant intelligence and energy (and good governance), we can theoretically have anything else." Altman's post also exemplifies the slap-and-kiss that has characterized recent AI evangelism, citing risks but insisting all will be well in the end. "There are serious challenges to confront along with the huge upsides," Altman wrote. "We do need to solve the safety issues, technically and societally, but then it's critically important to widely distribute access to superintelligence given the economic implications." Gary Marcus, an AI expert, author, and critic, took the post as an opportunity to compare Altman to discredited Theranos CEO Elizabeth Holmes, now serving time for fraud. Had Altman consulted his own company's ChatGPT on the fundamental limiters of human progress, he'd have been presented with a far more extensive list that includes: Cognitive and Psychological Constraints, Sociopolitical Systems, Economic and Resource Constraints, Technological and Scientific Limits, Ecological and Environmental Boundaries, Cultural and Ethical Constraints, and Temporal and Physical Limits. 
And each of these categories comes with multiple bullet points. Yet the biggest news to emerge amid Altman's pollyannaish prognostication may be that OpenAI, nurtured by billions from Microsoft, reportedly plans to expand model availability through a partnership with Google Cloud. And it was only a year ago that Microsoft described longtime partner OpenAI as a competitor. OpenAI did not immediately respond to a request for comment.
[3]
Sam Altman's Lies About ChatGPT Are Growing Bolder
Would you believe OpenAI if it told you AI is fine for the planet, actually? The AI brain rot in Silicon Valley manifests in many varieties. For OpenAI's figurehead Sam Altman, this often results in a lot of vague talk about artificial intelligence as the panacea to all of the world's woes. Altman's gaslighting reached new heights this week as he cited wildly deflated numbers for OpenAI's water and electricity usage compared to numerous past studies. In a Tuesday blog post, Altman cited internal figures for how much energy and water a single ChatGPT query uses. The OpenAI CEO claimed a single prompt requires around 0.34 Wh, equivalent to what "a high-efficiency lightbulb would use in a couple of minutes." As for cooling the data centers used to process AI queries, Altman suggested a student asking ChatGPT to do their essay for them requires "0.000085 gallons of water, roughly one-fifteenth of a teaspoon." Altman did not offer any evidence for these claims and failed to mention where his data comes from. Gizmodo reached out to OpenAI for comment, but we did not hear back. If we took the AI monger at his word, we would only need to do some simple math to check how much water that actually is. OpenAI has claimed that as of December 2024, ChatGPT had 300 million weekly active users generating 1 billion messages per day. Based on the company's and Altman's own metrics, that would mean the chatbot uses 85,000 gallons of water per day, or a little more than 31 million gallons per year. ChatGPT is hosted on Microsoft data centers, which already use quite a lot of water. The tech giant has plans for "closed-loop" centers that don't use extra water for cooling, but these projects won't be piloted for at least another year. These data centers were already water- and power-hungry before the advent of generative AI. For Microsoft, water use spiked from 2021 to 2022 after the tech giant formulated its deal with OpenAI.
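Gizmodo's back-of-envelope aggregation is straightforward to reproduce. The sketch below takes Altman's per-query figure and the company's reported query volume at face value; both inputs are claims from the article, not independently verified numbers:

```python
# Reproduce the article's aggregate water estimate from its stated inputs.
GALLONS_PER_QUERY = 0.000085      # Altman's claimed per-query water use
QUERIES_PER_DAY = 1_000_000_000   # OpenAI's reported daily message volume

daily_gallons = GALLONS_PER_QUERY * QUERIES_PER_DAY
yearly_gallons = daily_gallons * 365

print(f"{daily_gallons:,.0f} gallons per day")            # 85,000
print(f"{yearly_gallons / 1e6:.1f} million gallons/year") # ~31.0
```

That recovers the article's figures exactly: 85,000 gallons a day, a little over 31 million gallons a year.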
A study from University of California researchers published in late 2023 claimed the older GPT-3 version of ChatGPT drank about 0.5 liters for every 10 to 50 queries. If you take that data at its most optimistic, OpenAI's older model would be using 31 million liters of water per day, or 8.18 million gallons. And that's for an older model, not today's current, much more powerful (and far more demanding) GPT-4.1 plus its o3 reasoning model. The size of the model impacts how much energy it uses. There have been multiple studies about the environmental impact of training these models, and since they continuously have to be retrained as they grow more advanced, the electricity cost will continue to escalate. Altman's figures don't mention which queries are formulated through OpenAI's multiple different ChatGPT products, including the most advanced $200-a-month subscription that grants access to GPT-4o. They also ignore the fact that AI images require much more energy to process than text queries. Altman's entire post is full of big tech optimism shrouded in talking points that make little to no sense. He claims that datacenter production will be "automated," so the cost of AI "should eventually converge to near the cost of electricity." If we are charitable and assume Altman is suggesting that the expansion of AI will somehow offset the electricity necessary to run it, we're still left holding today's bag and dealing with rising global temperatures. Multiple companies have tried to solve the water and electricity issue with AI, with some landing on plans to sink data centers into the ocean or build nuclear power plants just to supply AI with the necessary electricity. Long before any nuclear plant can be built, these companies will continue to burn fossil fuels. The OpenAI CEO's entire blog is an encapsulation of bullheaded big tech oligarch thinking.
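The gap between Altman's figure and the UC study is easier to see per query. A quick sketch converting both to milliliters, reading the study's 0.5 L per 10-to-50 queries as a 10-50 mL-per-query range (the two figures describe different models and possibly different system boundaries, so this is an illustration, not a like-for-like comparison):

```python
# Per-query comparison: Altman's figure vs. the UC researchers' GPT-3 estimate.
LITERS_PER_GALLON = 3.78541

altman_ml = 0.000085 * LITERS_PER_GALLON * 1000   # ~0.32 mL per query

# Study: 0.5 L per 10-50 queries -> 10-50 mL per query
study_low_ml = 0.5 / 50 * 1000    # most optimistic reading
study_high_ml = 0.5 / 10 * 1000   # least optimistic reading

print(f"Altman: {altman_ml:.2f} mL/query")                          # 0.32
print(f"Study:  {study_low_ml:.0f}-{study_high_ml:.0f} mL/query")   # 10-50
print(f"Ratio:  {study_low_ml / altman_ml:.0f}x to "
      f"{study_high_ml / altman_ml:.0f}x higher")                   # ~31x-155x
```

Even on the study's most optimistic reading, the older model's per-query figure is roughly thirty times Altman's.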
He said that "entire classes of jobs" will go the way of the dodo, but it doesn't matter since "the world will be getting so much richer so quickly that we'll be able to seriously entertain new policy ideas we never could before." Altman and other tech oligarchs have suggested we finally encourage universal basic income as a way of offsetting the impact of AI. OpenAI knows it won't work. Altman has never been serious enough about the idea to stump for it; instead, he has cozied up to President Donald Trump to ensure there's no future regulation of the AI industry. "We do need to solve the safety issues," Altman said. But in his view, that doesn't mean we shouldn't all be expanding AI into every aspect of our lives. He suggests we ignore the warming planet because AI will solve that niggling issue in due course. But if temperatures rise, requiring even more water and electricity to cool these data centers, I doubt AI can work fast enough to fix anything before it's too late. But ignore that; just pay attention to that still-unrevealed Jony Ive doohickey that may or may not gaslight you as the world burns.
[4]
Sam Altman doesn't think you should be worried about ChatGPT's energy usage - reveals exactly how much power each prompt uses
While that's not a lot in isolation, ChatGPT has over 400 million weekly users, with multiple prompts per day. OpenAI CEO Sam Altman has revealed ChatGPT's energy usage for a single prompt, and while it's lower than you might expect, on a global scale it could have a significant impact on the planet. Writing on his blog, Altman said, "The average query uses about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes. It also uses about 0.000085 gallons of water; roughly one-fifteenth of a teaspoon." While that might not sound like a lot for an isolated prompt, ChatGPT has approximately 400 million active weekly users, and that number is growing at a rapid rate. Bear in mind there's a growing number of AI tools and chatbots on the market, including Google Gemini and Anthropic's Claude, so overall AI energy usage will be even higher. Last month, we reported on a study from MIT Technology Review which found that a five-second AI video uses as much energy as a microwave running for an hour or more. While Altman's ChatGPT prompt figure is nowhere near as high as that, there are still concerns considering how much people interact with AI. Concern about ChatGPT's energy consumption is constant, and it is becoming increasingly vocal as AI usage continues to rise. While Altman's blog post will put some minds at ease, considering the relatively low energy and water usage in isolation, it could also spark more uproar. Earlier this week, a mass ChatGPT outage left millions of people unable to interact with the chatbot. Over the 10-plus-hour outage, I received emails from thousands of readers who gave me a new perspective on AI. While I'd be lying if I said AI's energy consumption doesn't concern me, it would be unfair to overlook the positives of the technology and how it is improving the lives of millions.
The climate crisis is not limited to you and me, but unfortunately, it's the working class that ultimately pays the price. ChatGPT's energy consumption at mass scale may be a severe problem in the future, but then again, so are the private jets flying 10-minute flights. The AI climate concerns are not black and white, and those who criticise the impact of the technology on the planet are equally vocal about the impact of other technologies. That said, we're only at the beginning of the AI revolution, and energy consumption will continue to rise. At what point should we be worried?
[5]
OpenAI head Sam Altman claims a single ChatGPT query uses 'one 15th of a teaspoon' of water but that doesn't put AI's environmental impact in the clear
In a blog post on Tuesday (via The Verge), OpenAI supremo Sam Altman said that a typical ChatGPT query uses 'roughly one 15th of a teaspoon' of water. Altman doesn't go into any detail over that claim, including whether it covers only the cooling hardware used to manage server computers or also the broader environmental impact of AI, such as manufacturing and installing all the hardware. Whatever, the immediate question is just how much water is that? It doesn't sound like a lot. But then, OpenAI is processing an awful lot of ChatGPT queries. So could it add up to a lot? Taken at face value, the answer is a clear no. To the best of our knowledge, ChatGPT handles roughly one billion queries per day. One 15th of a teaspoon is 0.000085 gallons. Multiply that by a billion and you get 85,000 gallons of water daily. Now, according to the US Geological Survey, water usage in the USA is around 322 billion gallons daily. So, ChatGPT's water usage works out at about 0.000026% of US daily national usage, according to those figures. Taking a different route, the US EPA reckons the average American uses 82 gallons of water "at home" daily. Presumably, that excludes water usage outside of the home. That 15th of a teaspoon then works out to 0.0001% of an individual's at-home daily water usage per query. So, you'd have to ping ChatGPT an awful, awful lot to make much of a difference to your overall water usage. Of course, all that assumes that Altman's claim is accurate. And on that note, we'd be cautious, at the very least. Moreover, even if it is accurate, water usage is hardly the number one concern around AI. Sheer energy consumption is a major worry. Some estimates already put AI at one-fifth of all data centre power consumption, for instance. Meanwhile, Altman himself is already raising funds to build fusion reactors to solve what he perceives as the power problem posed by AI.
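The percentages come straight from dividing the aggregate figure into the USGS and EPA baselines. A quick sketch of the arithmetic, using only numbers quoted above:

```python
# Check ChatGPT's claimed water use against national and per-person baselines.
daily_chatgpt_gal = 0.000085 * 1e9   # 85,000 gallons/day at a billion queries
us_daily_gal = 322e9                 # USGS estimate of daily US water usage

national_share = daily_chatgpt_gal / us_daily_gal * 100
print(f"{national_share:.7f}% of US daily national usage")  # ~0.0000264%

per_query_share = 0.000085 / 82 * 100  # vs EPA's 82 gal/day at-home figure
print(f"{per_query_share:.5f}% of one person's at-home daily use")  # ~0.00010%
```

Either way you slice it, per-query water use is vanishingly small at face value; the open questions are whether the face value is right and what it leaves out.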
So, even if water consumption isn't much of a problem, AI hardly has a clean bill of health when it comes to environmental impacts. On the other hand, if you really start to think through all the issues involved, it quickly becomes very complicated indeed. At the extreme end of the philosophical scale, some would no doubt argue that the long-term benefits of AI will easily outweigh any transient impact as the technology scales up. If super-intelligent AI helps us work out how to produce entirely clean energy and somehow solves all kinds of other environmental problems, well, it will all have been worth it. But until that day comes, it probably makes sense to be cautious and minimise the environmental impact of AI.
[6]
Each ChatGPT Query Uses Merely a 15th of a Teaspoon of Water, Says Sam Altman
However, Altman's estimate is nearly one-tenth of what analysts projected. OpenAI CEO Sam Altman has disclosed for the first time the average energy and water consumption of a single ChatGPT query, which is much less than what analysts and environmentalists had projected all this time. "The average query uses about 0.34 watt-hours," Altman said in a blog post titled The Gentle Singularity. "It also uses about 0.000085 gallons of water; roughly one-fifteenth of a teaspoon." For comparison, 0.34 watt-hours is equivalent to running a high-efficiency lightbulb for a couple of minutes. According to a study on ChatGPT's water consumption by A. Shaji George, an expert in information and communications technology (ICT), the AI chatbot consumes 0.5 litres of water during each of its lengthy conversations with a user, and similar figures apply to other AI systems and LLMs. Half a litre of water is roughly the amount required to cook two packets of Maggi instant noodles. According to data provided by Semrush, ChatGPT was ranked the eighth most visited site in the world as of March 2025, with approximately 5.56 billion visits. Meanwhile, research shows that every question potentially uses around 10 times more electricity than a simple Google search, with an average of 2.9 watt-hours of energy; Altman's estimate is barely a tenth of that. Previously, he shared that polite courtesies with ChatGPT, such as "please" and "thank you," have cost the company tens of millions of dollars in electricity expenses. This revelation comes amid increasing scrutiny of AI's resource usage as models grow in size and adoption. Altman also laid out a longer-term forecast on AI cost trends, stating, "As datacenter production gets automated, the cost of intelligence should eventually converge to near the cost of electricity." Environmental analysts note that while per-query usage is low, aggregate consumption remains a concern.
With hundreds of millions of queries each day, energy and water use from AI data centres could grow substantially. In the same post, Altman acknowledged these broader implications, saying, "The economic value creation has started a flywheel of compounding infrastructure buildout to run these increasingly-powerful AI systems." The release of these statistics appears to be part of OpenAI's broader attempt to improve transparency about its AI infrastructure. As public and regulatory interest in AI's environmental impact grows, such disclosures are likely to play a larger role in shaping both company policy and public perception.
[7]
Sam Altman Claims ChatGPT Uses One-Fifteenth Of A Teaspoon Of Water Per Query, Predicts AI's Cost Will Soon Match Electricity, But Offers No Methodology
The AI frenzy is not going away any time soon, as companies are increasingly investing in the technology and integrating it into their products and services. OpenAI, since the inception of ChatGPT, has pushed the use of artificial intelligence even further, with others jumping in to expand access and capabilities as well. With the growing application of the technology come increased concerns about the ethical implications of this rapid expansion, as well as apprehensions about its environmental impact. Sam Altman has recently shed light on how many resources these AI models consume, and it is not what you would expect. In the blog post called "The Gentle Singularity," Altman emphasized how artificial intelligence is set to shape infrastructure and the world at large economically, socially, and, most importantly, environmentally. While Altman did not share the methodology used to reach the figure, nor which factors were included in the calculation, his claim is that an average ChatGPT query uses about 0.000085 gallons of water, or roughly one-fifteenth of a teaspoon. Water was not the only consumption figure he shared, as he laid down the numbers for energy as well. In Altman's words: "People are often curious about how much energy a ChatGPT query uses; the average query uses about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes."
Making a prediction about the future of AI and the resources it will use, Altman suggested that as AI becomes more advanced and efficient, the cost of generating intelligence will come down to the cost of the electricity used to run the hardware. While he believes massive scaling is what will bring that cost down, critics and environmentalists continue to highlight the resource costs already being incurred. They argue that the consumption figures Altman presented may be understated, since AI tools appear to consume more than one-fifteenth of a teaspoon of water per query, and that without any proper methodology laid out, the claim is hard to believe.
[8]
Every Time You Ask ChatGPT Something, It Uses As Much Electricity As An Oven Does In One Second, Says Sam Altman: The 'Cost Of Intelligence' Will Only Lower
OpenAI chief executive Sam Altman says that a ChatGPT query consumes roughly 0.34 watt-hours of electricity, which is "about what an oven would use in a little over one second." What Happened: Altman, in a blog post he published on Tuesday, also mentioned that each ChatGPT query consumes "about 0.000085 gallons of water," which is roughly one-fifteenth of a teaspoon. Altman argues that "the cost of intelligence should eventually converge to near the cost of electricity," positioning the chatbot as comparatively frugal even as environmental watchdogs scrutinize AI's hidden climate bill. OpenAI has not publicly shared details of its methodology, but the disclosure lands as researchers warn that data-center power demand could outstrip Bitcoin mining by next year. One study from Vrije Universiteit Amsterdam projects AI could soon swallow nearly half the electricity flowing into global server farms. Water use complicates the equation. A Washington Post investigation found that drafting a 100-word email with GPT-4 required "a little more than one bottle" of water, with consumption varying widely by data-center location. MIT researchers likewise warned in January that generative models are driving up both energy and water footprints as companies race to scale. Why It Matters: The Department of Energy projects that U.S. data centers may consume up to 12% of the nation's electricity by 2028, a share that could climb as AI adoption accelerates. Altman's efficiency pitch follows corporate pledges to curb resource demand, yet Yale's Environment 360 notes hyperscale facilities still guzzle millions of gallons annually. Planet Detroit adds that generative systems may use 33 times more energy than conventional software for the same task.
Whether or not the teaspoon math holds up, Altman insists smarter algorithms and automated chip fabrication will keep driving the price and resource cost of "intelligence" lower.
OpenAI CEO Sam Altman's recent claims about ChatGPT's energy and water usage have ignited discussions about the environmental impact of AI technology, with experts and critics questioning the accuracy and implications of these figures.
OpenAI CEO Sam Altman has sparked a debate about the environmental impact of AI with his recent claims about ChatGPT's energy and water usage. In a blog post, Altman stated that an average ChatGPT query uses about 0.34 watt-hours of energy and 0.000085 gallons of water, which he described as "roughly one-fifteenth of a teaspoon" [1].
These figures have been met with skepticism from experts and critics. While Altman's claims align with some previous estimates, such as those published by Epoch AI in February [2], they have been criticized for oversimplifying the environmental concerns associated with AI technology.
Gary Marcus, an AI expert and critic, went as far as comparing Altman to discredited Theranos CEO Elizabeth Holmes, highlighting the skepticism surrounding these claims [2].
Critics argue that focusing on individual query usage ignores the larger picture of AI's environmental impact. With ChatGPT processing approximately one billion queries per day, the cumulative effect is significant [4].
Research from the Lawrence Berkeley National Laboratory estimates that AI-specific uses in data centers could consume between 165 and 326 terawatt-hours of energy by 2028, enough to power 22% of all US households [2].
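It is worth seeing how little of that projection Altman's per-query figure accounts for. The sketch below annualizes the claimed 0.34 Wh per query at a billion queries per day (assuming flat query volume, text queries only, and excluding training and other providers) and compares it with the Lawrence Berkeley range:

```python
# Compare annualized per-query energy against LBNL's 2028 projection.
WH_PER_QUERY = 0.34   # Altman's claimed per-query energy
QUERIES_PER_DAY = 1e9 # reported daily query volume, assumed flat

chatgpt_twh = WH_PER_QUERY * QUERIES_PER_DAY * 365 / 1e12  # Wh -> TWh
print(f"ChatGPT queries: ~{chatgpt_twh:.2f} TWh/year")      # ~0.12

lbnl_low, lbnl_high = 165, 326  # LBNL's projected AI data-center range, TWh
print(f"Share of the low-end projection: {chatgpt_twh / lbnl_low:.3%}")
```

Taken at face value, query-serving as Altman describes it would be under a tenth of a percent of the projected system-wide total, which is part of the critics' point: the per-query metric leaves out training, multimodal workloads, and every other provider.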
The water usage figures provided by Altman have also been questioned. A study from University of California researchers claimed that the older GPT-3 version of ChatGPT used about 0.5 liters of water for every 10 to 50 queries, significantly more than Altman's estimate [3].
Microsoft, which hosts ChatGPT on its data centers, has seen a spike in water usage from 2021 to 2022 after partnering with OpenAI [3].
Altman's blog post also touched on future energy needs for AI, suggesting that data center production will be automated and that the cost of AI "should eventually converge to near the cost of electricity" [2]. However, this optimistic view has been challenged by current realities and immediate environmental concerns.
Some companies are exploring alternative solutions to address the energy and water demands of AI, including proposals to place data centers in the ocean or build dedicated nuclear power plants [3].
While the debate continues, experts emphasize the need to balance technological progress with environmental responsibility. The AI climate concerns are not black and white, and critics of AI's environmental impact are equally vocal about other technologies' effects on the planet [4].
As AI technology continues to advance and its usage grows, the industry faces increasing pressure to address and mitigate its environmental impact. The challenge lies in harnessing the potential benefits of AI while minimizing its ecological footprint [5].