17 Sources
[1]
Sam Altman would like to remind you that humans use a lot of energy, too | TechCrunch
OpenAI CEO Sam Altman addressed concerns about AI's environmental impact this week while speaking at an event hosted by The Indian Express. For one thing, Altman -- who was in India for a major AI summit -- said concerns about AI's water usage are "totally fake," though he acknowledged it was a real issue when "we used to do evaporative cooling in data centers." "Now that we don't do that, you see these things on the internet where, 'Don't use ChatGPT, it's 17 gallons of water for each query' or whatever," Altman said. "This is completely untrue, totally insane, no connection to reality." He added that it's "fair" to be concerned about "the energy consumption -- not per query, but in total, because the world is now using so much AI." In his view, this means the world needs to "move towards nuclear or wind and solar very quickly." There's no legal requirement for tech companies to disclose how much energy and water they use, so scientists have been trying to study it independently. Data centers have also been connected to rising electricity prices. Citing a previous conversation with Bill Gates, the interviewer asked whether it's accurate to say a single ChatGPT query currently uses the equivalent of 1.5 iPhone battery charges, to which Altman replied, "There's no way it's anything close to that much." Altman also complained that many discussions about ChatGPT's energy usage are "unfair," especially when they focus on "how much energy it takes to train an AI model, relative to how much it costs a human to do one inference query." "But it also takes a lot of energy to train a human," Altman said. "It takes like 20 years of life and all of the food you eat during that time before you get smart. And not only that, it took the very widespread evolution of the 100 billion people that have ever lived and learned not to get eaten by predators and learned how to figure out science and whatever, to produce you." 
So in his view, the fair comparison is, "If you ask ChatGPT a question, how much energy does it take once its model is trained to answer that question versus a human? And probably, AI has already caught up on an energy efficiency basis, measured that way." You can watch the full interview below. The conversation about water and energy usage begins at around 26:35.
[2]
OpenAI CEO: Concerns About AI Using Too Much Water Are 'Totally Fake'
Unsurprisingly, OpenAI CEO Sam Altman thinks concerns about AI's impact on our water supply are overblown. But his logic is raising eyebrows: Training a large language model uses far fewer resources than raising a human, and complaints about water are "totally fake," he says. "People talk about how much energy it takes to train an AI model, relative to how much it costs a human to do one inference query. But it also takes a lot of energy to train a human," he said during a Q&A session hosted by The Indian Express. "It takes like 20 years of life and all of the food you eat during that time before you get smart. And not only that, it took the very widespread evolution of the 100 billion people that have ever lived and learned not to get eaten by predators and learned how to figure out science and whatever, to produce you. And you took whatever you took." A fairer question, he says, is to ask, "How much energy does [AI] take once its model is trained to answer that question versus a human? And probably, AI has already caught up on an energy efficiency basis, measured that way." Although his comments elicited a few chuckles and head nods from those in attendance, the comparison hasn't gone down well. Sridhar Vembu, co-founder of Indian software company Zoho Corporation, was at the event and tweeted, "I do not want to see a world where we equate a piece of technology to a human being." Indeed, Altman seems to suggest that the human experience can be distilled down to mere consumption and productivity ratios. While that might be a fair way to judge an AI, applying it to humans has dangerous implications for morality and for how technology shapes our lives. And all this comes at a time of record investment in data centers worldwide to power AI. It's resulted in energy price increases, as well as concerns about water scarcity, noise, and pollution.
Although Altman conceded that OpenAI's previous use of evaporative cooling led to high water use, that is no longer the case, he says. Complaints that every ChatGPT query wastes gallons of water are "completely untrue, totally insane, [and have] no connection to reality." Altman acknowledged that it's "fair" to examine AI energy consumption -- "not per query, but in total" because people are using so much AI. But the answer is shifting to "nuclear or wind and solar very quickly," he says.
[3]
AI energy efficiency comparisons 'unfair' bleats Sam Altman, citing amount of energy needed to evolve, then train a human -- one 'takes like 20 years of life and all of the food you eat during that time before you get smart' he argues
'AI has already caught up... measured that way' asserts the AI mogul. OpenAI CEO Sam Altman took part in a wide-ranging Q&A on Friday, answering dozens of rapid-fire questions during a 60-minute session hosted by The Indian Express. Not for the first time, Altman stoked controversy. This time, he bemoaned "unfair" comparisons between the efficiency of AI inference queries and human thought. In Altman's view the comparison is skewed as humans have millennia of evolutionary smarts and technology teachings behind them, yet individuals require "like 20 years of life and all of the food you eat during that time before you get smart." Chief Nerd clipped the eyebrow-raising Q&A segment for convenient sharing. In the above video segment, the AI business torchbearer begins by stating "One of the things that is always unfair in this comparison is people talk about how much energy it takes to train an AI model relative to how much it costs a human to do one inference query." But, according to Altman, it also takes a lot of energy to train a human. "It takes like 20 years of life and all of the food you eat during that time before you get smart," the OpenAI CEO said to the assembled audience awaiting gems of wisdom. Moreover, Altman wants to roll in the "evolution of the hundred billion people," and humanity's progress to "not to get eaten by predators and learn how to like figure out science and whatever," into the equation. If we did that calculation, Altman appears to reason, "probably AI has already caught up on an energy efficiency basis... Measured that way."
OpenAI tech also evolved - from the minds and technological feats of humans
We see a few leaps in Altman's expanded-timeline human vs AI efficiency comparison logic that need to be addressed. For example, shouldn't the AI computing world also roll in the prior 'energy cost' of human evolution, the Renaissance, and so on? Aliens didn't provide the blueprints for ENIAC.
Some commentators have also argued that Altman is dehumanizing by reducing childhood, learning, and growth to their energy inputs. Others even wonder if Altman would prefer to see resources diverted from human to machine intelligence. However, beyond the confines of this Tweet clip, to give it more context and be fairer to the OpenAI boss, he also takes the time to push for more sustainable energy solutions. Tapping more into sustainable resources would take massive consumers like OpenAI a little more out of the firing line as scarce resource competitors, as folks' utility bills inch higher and higher. The above Q&A took place in the wake of Altman, and other AI high rollers, meeting with PM Narendra Modi during a highly publicized week that underscored India's importance as an AI growth engine.
[4]
Altman: You think AI is inefficient? Try raising a human
OpenAI CEO takes really, really long view on energy efficiency
AI is being unfairly targeted over its energy use, OpenAI CEO Sam Altman claims, as the naysayers ignore the vast amount of resources humans have consumed over millennia - not least to avoid being eaten by predators. Altman, speaking during an interview at the AI Summit in India, said it was reasonable to be concerned about AI's resource consumption: "We need to move toward nuclear or wind and solar very quickly." But he suggested some concerns were overdone. "Water is totally fake," he said, given that AI datacenters often rely on liquid cooling with closed systems, rather than traditional evaporative cooling. The interviewer referenced claims by Bill Gates that a single ChatGPT query uses the equivalent of 1.5 iPhone battery charges. Altman rubbished those figures, saying: "There's no way it's anything close to that." He claimed such complaints ignore the total amount of energy it takes to create and train an actual human. He said it was unreasonable to focus on "how much energy it takes to train an AI model, relative to how much it costs a human to do one inference query." "It takes like 20 years of life and all of the food you eat during that time before you get smart," he said. "And not only that, it took the very widespread evolution of the 100 billion people that have ever lived and learned not to get eaten by predators and learned how to figure out science and whatever, to produce you." AI's biggest energy burn comes in the training phase, Altman said. "If you ask ChatGPT a question, how much energy does it take once its model is trained to answer that question versus a human? And probably AI has already caught up on an energy efficiency basis, measured that way." Needless to say, working out these numbers is tricky. So we asked Gemini to tell us the total energy consumption needed to create all the humans today, and it came up with 10,800,000 TWh.
By comparison, according to Gemini, the total energy invested in the global AI ecosystem stands at 850 to 1,100 TWh. Which would be minimal in comparison if we ignore the fact this has all occurred since the Second World War, with the vast bulk consumed in just the last four years - and that new models are being trained all the time. Neither does it take into account the vast corpus of material those LLMs were trained on. Material such as "science and whatever" produced by... humans. Or at least the humans that had managed not to be eaten by major predators. Apart from declaring that humans were terribly inefficient when it comes to energy consumption compared to a datacenter stuffed with GPUs, with each chip consuming the equivalent of a domestic dwelling, Altman touched on multiple other topics. He played down the impact of AI on jobs, suggesting it would create many other things for people to do, and pointing out that previous waves of innovation have yet to deliver the "leisure" humans had hoped. Space-based datacenters were highly unlikely to be a thing, at least this decade, he added, thanks to both prohibitive launch costs and the difficulty of replacing defective GPUs once they're in orbit. "And they do break a lot, still." Not taking equity in OpenAI had been "a dumb thing" on his part, he admitted. Pushed to identify one thing he admired about Elon Musk, Altman said: "He's extremely good at physical engineering and also extremely good at getting people to perform incredibly well." However, he said it was more likely TSMC would lose its monopoly on chip manufacturing than Altman and Musk becoming friends again. ®
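The Register's Gemini-sourced figures can at least be sanity-checked as a ratio. A minimal sketch, taking both of Gemini's estimates at face value (neither figure is audited; the 1,100 TWh value is the upper bound quoted above):

```python
# Rough ratio of the two Gemini estimates quoted in the article.
# Both numbers are Gemini's own guesses, not measured data.
HUMAN_TOTAL_TWH = 10_800_000   # estimated energy to "create" all humans alive today
AI_ECOSYSTEM_TWH = 1_100       # upper-bound estimate for the global AI ecosystem

ratio = HUMAN_TOTAL_TWH / AI_ECOSYSTEM_TWH
print(f"Humanity's total: roughly {ratio:,.0f}x the AI ecosystem's energy")
```

The ratio comes out to roughly 10,000x — which, as the article notes, flatters AI only if you ignore that the human figure accrued over hundreds of thousands of years while the AI figure accrued mostly in the last few.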
[5]
Sam Altman defends AI resource usage: Water concerns 'fake,' and 'humans use energy too'
OpenAI CEO Sam Altman on Friday defended the resource demands of artificial intelligence, calling concerns about data centers' water use "fake" and comparing the energy used by AI systems to that of humans. Altman was speaking on the sidelines of the India AI Impact summit in an interview with The Indian Express when he was asked to address common criticisms of AI, such as its energy and water consumption. The CEO responded that claims circulating online that ChatGPT uses gallons of water per query were "completely untrue, totally insane," and have "no connection to reality." Data centers traditionally use large amounts of water to cool electrical components and prevent overheating. While data center cooling technologies have promised reduced consumption, some newer data centers no longer rely on water at all. Still, even with improving efficiency, a report last month from water technology company Xylem and Global Water Intelligence projected that the water drawn for cooling would more than triple over the next 25 years as computing demand rises, putting pressure on water systems. While dismissing fears about water use, Altman said energy consumption remains a fair AI concern. "Not per query, but in total - because the world is using so much AI ... and we need to move towards nuclear or wind and solar very quickly," he said. Asked about previous comments from Microsoft founder Bill Gates -- who has suggested that the efficiency of the human brain proves that AI can evolve to also become more energy efficient over time -- Altman pushed back. "One of the things that is always unfair in this comparison is people talk about how much energy it takes to train an AI model ... But it also takes a lot of energy to train a human," he said. "It takes like 20 years of life, and all the food you eat before that time, before you get smart." 
"The fair comparison is if you ask ChatGPT a question, how much energy does it take once a model is trained to answer that question, versus a human, and probably AI has already caught up on an energy efficiency basis, measured that way," he added. The process Altman is referencing is known as inference, which refers to the use of AI models that have already been trained to create new outputs. AI inference is typically much less power-intensive than the training itself. Altman's comments, particularly the AI-to-human comparison, have since sparked some debate online amid growing anxiety about AI's ability to replace human work. Sridhar Vembu, co-founder and chief scientist of Indian software company Zoho Corporation, who was present at the summit, criticized the human-AI equivalence. "I do not want to see a world where we equate a piece of technology to a human being," the billionaire said in an X post. The debate comes as governments and companies pour billions into new data centers to support the computing needs of AI systems. According to a May report by the International Monetary Fund, electricity consumption by the world's data centers in 2023 had already reached levels comparable to Germany or France, soon after the launch of OpenAI's groundbreaking ChatGPT AI model. In response, some governments have been working to speed up approval processes to bring new and cheap energy online, with some environmentalists warning such moves could clash with global net-zero goals. Some local communities in countries like the U.S. have also pushed back on development projects over fears they will strain electricity grids and raise overall electricity costs. Last week, the City Council in San Marcos, Texas, voted down a proposed $1.5 billion data center project after months of public opposition. 
Amid such pushback, many tech leaders, including OpenAI's Altman, have argued data centers will require more energy production from diverse sources, including renewable and nuclear energy.
[6]
Sam Altman Is Losing His Grip on Humanity
Last Friday, on stage at a major AI summit in India, Sam Altman wanted to address what he called an "unfair" criticism. The OpenAI CEO was asked by a reporter from The Indian Express about the natural resources required to train and run generative-AI models. Altman immediately pushed back. Chatbots do require a lot of power, yes, but have you thought about all of the resources demanded by human beings across our evolutionary history? "It also takes a lot of energy to train a human," Altman told a packed pavilion. "It takes like 20 years of life and all of the food you eat during that time before you get smart. And not only that, it took, like, the very widespread evolution of the hundred billion people that have ever lived and learned not to get eaten by predators and learned how to, like, figure out science and whatever, to produce you, and then you took whatever, you know, you took." He continued: "The fair comparison is, if you ask ChatGPT a question, how much energy does it take once its model is trained to answer that question, versus a human? And probably, AI has already caught up on an energy-efficiency basis, measured that way." Altman's comments are easy to pick apart. The energy used by the brain is significantly less than even efficient frontier models for simple queries, not to mention the laptops and smartphones people use to prompt AI models. It is true that people have to consume actual sustenance before they "get smart," though this is also a helpful bit of redirection on Altman's part -- the real concern with AI is not really the resources it demands, but the amount it contributes to climate change. Atmospheric carbon dioxide is at levels not seen in millions of years -- it has been driven not by the evolution of the 117 billion people and all the other critters to have ever existed in the course of evolution, but by contemporary human society and combustion turbines akin to those OpenAI is setting up at its Stargate data centers.
Other data centers, too, are building private, gas-fired power plants -- which collectively will likely be capable of generating enough electricity for, and emitting as much greenhouse-gas emissions as, dozens of major American cities -- or extending the life of coal plants. (OpenAI, which has a corporate partnership with the business side of this magazine, did not respond to a request for comment when I reached out to ask about Altman's remarks.) But what's really significant about Altman's words is that he thought to compare chatbots to humans at all. Doing so suggests that he views people and machines on equal terms. He didn't fumble his words; this is a common, calculated position within the AI industry. Altman made an almost identical statement to Forbes India at the same AI summit. And a week ago, Dario Amodei -- the CEO of Anthropic and Altman's chief rival -- made a similar analogy, likening the training of AI models to human evolution and day-to-day learning. The mindset trickles down to product development. Anthropic is studying whether its chatbot, Claude, is conscious or can feel "distress," and allows Claude to cut off "persistently harmful or abusive" conversations in which there are "risks to model welfare" -- explicitly anthropomorphizing a program that does not eat, drink, or have any will of its own. AI firms are convinced either that their products really are comparable to humans, or that this is good marketing. Both options are alarming. A genuine belief that they are building a higher power, perhaps even a god -- Altman, in the same appearance, said he thinks superintelligence is just a few years away -- might easily justify treating humans and the planet as collateral damage.
Altman also said, in his response to concerns about energy consumption, that the problem is real because "the world is now using so much AI" -- and so societies must "move towards nuclear, or wind and solar, very quickly." Another option would be for the AI industry to wait. If Altman's comparison of chatbots and people is purely a PR tactic, it is a deeply misanthropic one. He is speaking to investors. The notion that AI labs are building digital life has always been convenient to their myth, of course, and OpenAI is reportedly in the middle of a fundraising round that would value the company at more than $800 billion -- nearly as much as Walmart. Tech companies may genuinely want to develop AI tools for the benefit of all humanity, to echo OpenAI's founding mission, and genuinely believe they need to raise vast amounts of cash to do so. But to liken raising a child -- or, for that matter, the evolution of Homo sapiens -- to developing algorithmic products makes it very clear that the industry has lost touch, if it ever had any, with what it means to be human. To "train a human" -- that is, to live a life -- is to struggle, to accept the possibility of failure, and to sometimes meander simply in search of wonder and beauty. Generative AI is all about cutting out that process and making any pursuit as instant, efficient, and effortless as possible. These tools may serve us. But to put them on the same plane as organic life is sad.
[7]
Sam Altman: Know What Else Used a Lot of Energy? Human Civilization
At last week's India AI Impact Summit in New Delhi, industry leaders convened to discuss the future of artificial intelligence and how best to squeeze it into parts of your life you haven't even considered. Notably absent was Bill Gates, who dropped out hours before his scheduled keynote over the ongoing scrutiny about his presence in the Epstein Files (though he continues to deny any wrongdoing). While the convention was reportedly a bit chaotic, what with the protests and all, the luminaries from around the tech world present nonetheless kept things upbeat and optimistic, declaring "full steam ahead" on the technological hype train carrying our species and planet off a cliff. Also in attendance was OpenAI's Sam Altman, who earned numerous headlines over the course of the event for his words and antics. His buzz blitzkrieg started on Thursday at a seemingly easy photo-opp layup with Indian Prime Minister Narendra Modi and other AI executives all raising their joined hands in a celebratory display of industry-wide solidarity. Altman and the former colleague and present CEO of Anthropic to his left, Dario Amodei, notably refused to complete the chain and hold each other's hands, making for an all-too-poignant moment. Altman would continue to make news throughout the summit for his comments on the industry's "urgent" need for global regulation and his sneaking suspicion that companies might actually be using AI as a scapegoat to whitewash their layoffs. Ever the yapper, Altman has bagged yet another round of earned media for an interview with The Indian Express' Anant Goenka, during which he posited some controversial rebuttals to concerns about AI's environmental impact.
Altman started off by saying the claims about ChatGPT consuming "'17 gallons of water for each query' or whatever" are "completely untrue, totally insane, no connection to reality," before qualifying that, OK, maybe it was a valid concern when his company "used to do evaporative cooling in data centers." He went on to say that there is "fair" concern about the amount of energy data centers eat to crank out the most soulless slop you've ever seen, but suggested the onus of responsibility for dealing with AI's ravenous appetite falls to the energy sector itself, which Altman feels needs to "move towards nuclear or wind and solar very quickly." Altman then stunned the crowd and firmly re-entered the discourse with a mind-blowing truth bomb for those who still felt AI was consuming too much energy. "It also takes a lot of energy to train a human," Altman rejoined euphorically. "It takes like 20 years of life, and all the food you eat before that time, before you get smart. And not only that, it took like the very widespread evolution of the hundred billion people that have ever lived and learned not to get eaten by predators and learned how to figure out science and whatever to produce you, and then you took whatever you took." It is true that every person and the sum total of human civilization have consumed a sizable amount of energy (and water) to get to where we are today. While the value comparison of a nascent tech industry and its models to the entirety of civilization and human beings may have elicited adulation at the summit, Altman got an icier reception from the internet. Social media quickly took to roasting the remarks as "dystopian" and "deeply antisocial and antihuman." Perhaps further illuminating the backlash, Altman's energy comments butt up against the frustrating lack of transparency within the industry our collective futures now hinge upon. There are currently no regulations in place requiring data centers to disclose their water and energy consumption.
Furthermore, center employees and business partners are typically muzzled by nondisclosure agreements. This has made reporting and research on the true expenditure levels a tricky figure to pin down. At least we’ve got Sam to keep us informed while waiting for some clarity about what’s actually going on and being used in those centers.
[8]
Sam Altman defends AI's energy toll by saying it also takes a lot to 'train a human'
OpenAI CEO also downplayed concerns about how much water datacenters require at AI summit in India OpenAI boss Sam Altman has tried to ease concerns about how much power is used by artificial intelligence models by comparing it to the amount of energy required by human development. "People talk about how much energy it takes to train an AI model - but it also takes a lot of energy to train a human," Altman told The Indian Express recently while in India for the AI Impact summit. "It takes about 20 years of life - and all the food you consume during that time - before you become smart." Despite that defense, he said that the public assessment of AI's energy consumption is "fair," adding: "We need to move towards nuclear or wind and solar very quickly." Those remarks come amid growing discussion about the environmental impact of the datacenters required to power AI models - and, more generally, about how the technology could impact society. Datacenters accounted for about 1.5% of global electricity consumption in 2024, according to the International Energy Agency. The organization projects that such consumption will increase about 15% each year from 2024 to 2030, more than four times faster than the growth of electricity consumption from all other sectors. "The demand for new datacenters cannot be met in a sustainable way," Noman Bashir, a computing and climate impact fellow at the Massachusetts Institute of Technology's climate and sustainability consortium, told MIT's news outlet. "The pace at which companies are building new datacenters means the bulk of the electricity to power them must come from fossil fuel-based power plants." In December, more than 230 environmental groups called for a moratorium on building datacenters in the US. "The rapid, largely unregulated rise of datacenters to fuel the AI and crypto frenzy is disrupting communities across the country and threatening Americans' economic, environmental, climate and water security," their letter states. 
At the AI conference, Altman also downplayed concerns about the water datacenters require. "Water is totally fake. It used to be true. We used to do evaporative cooling in datacenters, but now ... we don't do that," Altman said. "You see things on the internet, [like]: 'Don't use ChatGPT. It's 17 gallons of water for each query or whatever.' This is completely untrue - totally insane." CNBC reported that "some newer data centers no longer rely on water at all". The director of the Southern New Hampshire University office of sustainability, Mike Weinstein, told the Guardian he is skeptical of the argument from Altman and other AI advocates that the power such infrastructure demands is justified because the technology will help alleviate global problems. A September 2025 report from OpenAI on how people use ChatGPT found that the most common work-related reason for using the chatbot is for writing assistance. "It didn't look like the majority of use of that was for figuring out how we solve challenges in food systems and energy systems, so that would be my skepticism of saying this technology is worth it because I have yet to see it demonstrated," Weinstein said. Altman's remarks generated a backlash online, with some people describing them as dystopian. "He's saying a really big spreadsheet and a baby are morally equivalent," Matt Stoller, research director at the American Economic Liberties Project, posted on X. "One reason to believe that life is divine is so that you don't allow sociopaths like this anywhere near anything important." Sports commentator and TV host Jeff Johnson compared Altman's comments to the Netflix series Black Mirror, which explores potential harms from new technology. "Notice the disturbing techy parlance that he uses to describe the general human experience," Johnson wrote on X. "'Training?!' Too many people are falling for [it]. Y'all really lettin these geeks destroy the Earth."
[9]
Sam Altman says ChatGPT water use claims are 'completely untrue' -- but admits AI energy use is a concern
Altman rejects viral water claims -- but admits AI's energy footprint is only getting bigger
* Sam Altman dismisses claims about ChatGPT's water usage as "totally fake"
* Experts warn that scaling AI infrastructure is driving huge costs and increasing pressure on power, cooling, and resources
* The real issue isn't efficiency -- it's whether AI can grow at this scale without serious environmental impact
Speaking at an event hosted by The Indian Express, OpenAI CEO Sam Altman dismissed claims that AI's water usage is high as "totally fake", but he did acknowledge that it had been an issue in the past when "we used to do evaporative cooling in data centers." "Now that we don't do that, you see these things on the internet like, 'Don't use ChatGPT, it's 17 gallons of water for each query' or whatever," Altman said. "This is completely untrue, totally insane, no connection to reality." You can find this segment at around 27 minutes in the video of the event. Altman did concede that concerns around AI's overall energy consumption are "fair", noting that "the world is now using so much AI" and that "we need to move towards nuclear or wind and solar very quickly". AI-specific data centers already leave a larger and more complex footprint than traditional facilities, and several groups have raised concerns about their environmental impact -- particularly around rising electricity demand, water usage, and the construction of new infrastructure. That build-out is also having knock-on effects, including increased demand for components like RAM, which is pushing up prices across the industry. IBM CEO Arvind Krishna has previously raised doubts about whether the current pace and scale of AI data center expansion is financially sustainable.
He estimates that equipping a single 1GW site with compute hardware now costs close to $80 billion -- and with plans for nearly 100GW of capacity dedicated to advanced AI training, the total potential spend could approach a staggering $8 trillion. Meanwhile, AI's new wave of ultra-powerful accelerators is pushing data centers to breaking point, forcing a rethink of power, cooling, and connectivity. Hardware that felt cutting-edge just a few years ago can't keep up, as modern AI workloads demand a complete overhaul of everything from rack design to thermal strategy.
Newsflash: humans require a lot of energy too
As well as dismissing claims about ChatGPT's water usage, Altman also offered a more unusual defense of OpenAI's overall energy use. He argued that discussions around AI's energy consumption were "unfair" because they don't account for how much energy it takes to train humans to perform similar tasks. "But it also takes a lot of energy to train a human," Altman said. "It takes like 20 years of life and all of the food you eat during that time before you get smart. And not only that, it took the very widespread evolution of the 100 billion people that have ever lived and learned not to get eaten by predators and learned how to figure out science and whatever, to produce you." He continued: "If you ask ChatGPT a question, how much energy does it take once its model is trained to answer that question versus a human? And probably, AI has already caught up on an energy efficiency basis, measured that way." I can see the argument Altman is making -- that human intelligence also comes with an energy cost -- but it feels reductive, and faintly cynical, to reduce the value of a human life to its energy consumption. More importantly, it sidesteps the real issue. The question isn't whether humans also use energy (of course they do!) but whether scaling AI to billions of daily queries introduces entirely new levels of demand that we haven't had to account for before.
Comparing the lifetime energy cost of a human to the marginal cost of an AI response might be provocative, but it's not especially useful.

What Altman's comments highlight is a growing tension at the heart of the AI boom. The technology may be getting smarter and more efficient, but the scale at which it's being deployed is growing even faster, raising fresh concerns about its long-term environmental impact, including pressure on global water supplies. The UN has already warned that the world has entered an "era of global water bankruptcy," underlining just how fragile those resources have become.

Those questions aren't going away. As AI adoption accelerates, the real challenge won't just be how efficient the technology becomes, but whether it can scale sustainably at all.
[10]
Sam Altman gets defensive about AI's massive electricity usage: 'It takes a lot of energy to train a human' | Fortune
OpenAI CEO Sam Altman isn't worried about AI's increasingly glaring resource consumption, and argued humans require a lot too. In an on-stage interview at the India AI Impact summit, he went on the defensive after he was asked about ChatGPT's water needs. He dismissed claims that the chatbot uses gallons of water per query as "completely untrue, totally insane," according to a clip posted by The Indian Express, explaining that data centers powering ChatGPT have largely moved away from water-heavy "evaporative cooling" to prevent overheating.

Altman was then asked about the electricity needed for AI. In contrast to the issue of water, he claimed it was "fair" to bring up the technology's energy requirements, saying "We need to move toward nuclear, or wind, or solar [energy] very quickly." But he pointed out that comparing AI's power needs to humans isn't exactly apples to apples. "It also takes a lot of energy to train a human," he said, prompting some in the crowd to laugh. "It takes, like, 20 years of life, and all of the food you eat during that time before you get smart."

Altman expanded even further by noting that today's humans wouldn't even be here were it not for their ancestors dating back hundreds of thousands of years to when modern humans first emerged. "Not only that, it took, like, the very widespread evolution of the 100 billion people that have ever lived and learned not to get eaten by predators and learned how to, like, figure out science or whatever to produce you," he added.

When comparing humans to ChatGPT's potential, you have to take this context into account, he argued. A fair comparison would be to pit the energy a human uses to answer a query against the energy an AI uses after it is trained. On that measure, "probably, AI has already caught up on an energy efficiency basis measured that way."

In a June 2025 blog post, Altman claimed each ChatGPT query takes about 0.34 watt-hours of electricity, or around what an oven uses in about a second. Still, he published this figure before OpenAI released its newest GPT-5 model and its subsequent upgrades, and energy consumption can vary based on the complexity of a query -- answering a question versus creating an image, for example.

Experts have warned that AI as a whole will greatly increase its cumulative power and water consumption over the next 20 years or so. Overall, AI's water usage is set to grow by about 130%, or by about 30 trillion liters (7.9 trillion gallons) of water through 2050, according to a January report by water technology company Xylem and market research firm Global Water Intelligence. Over that same period, rising electricity demands are expected to increase the water use for data centers' power generation by about 18%, reaching roughly 22.3 trillion liters (5.8 trillion gallons) per year. Meanwhile, the ever more complex chips data centers use will need more water during the manufacturing process, driving the amount they require up by 600% to 29.3 trillion liters (7.7 trillion gallons) annually, from about 4.1 trillion liters (1.8 trillion gallons) today.

While OpenAI has moved away from evaporative cooling, 56% of all data centers globally still use the method in some form, according to the Xylem and Global Water Intelligence report. OpenAI's own 800-acre data center complex in Abilene, Texas, will reportedly use water, albeit in a more efficient, closed-loop system that continuously recirculates water to cool the data center, the Texas Tribune reported. The data center will initially use 8 million gallons of water from the city of Abilene to fill its cooling system.
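The figures in this report lend themselves to a quick arithmetic sanity check. A minimal sketch in Python, where the oven's power draw (~1.2 kW) is my own assumption for illustration rather than a number from the report:

```python
# Quick sanity check of the figures quoted above.
# The oven's power draw (~1.2 kW) is an assumption for illustration.

L_PER_GAL = 3.785  # liters per US gallon (approximate)

# Altman's June 2025 claim: ~0.34 Wh per query, "what an oven uses in about a second"
oven_kw = 1.2
oven_one_second_wh = oven_kw * 1000 / 3600  # energy (Wh) an oven draws in one second

# Xylem / Global Water Intelligence: ~130% growth adds ~30 trillion liters by 2050,
# implying a present-day baseline of roughly 23 trillion liters
baseline_trillion_l = 30 / 1.30

# Chip-manufacturing water: 4.1 -> 29.3 trillion liters is roughly a 600% increase
chip_growth_pct = (29.3 / 4.1 - 1) * 100

# Gallon conversion: 30 trillion liters is about 7.9 trillion gallons
growth_trillion_gal = 30 / L_PER_GAL
```

Each result lands close to the reported figure: about 0.33 Wh for the oven-second, a baseline near 23 trillion liters (consistent with the 23.7 cubic kilometers for 2025 cited elsewhere in this collection), a roughly 615% chip-water increase, and about 7.9 trillion gallons.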
[11]
Sam Altman Fumes That It Takes Longer to Train a Human Than an AI, Plus They Eat All That Wasteful Food
AI leaders insist they've got humanity's best interests in mind. If we're to take them at their word, then we must say: they have a really unfortunate habit of sounding like they have nothing but contempt for the human race.

The latest case in point: OpenAI CEO Sam Altman's tone-deaf comments at an event hosted by The Indian Express -- made fresh off his skin-crawlingly awkward refusal to join hands with Anthropic's Dario Amodei on stage with other industry titans -- in which he attempted to downplay critiques of AI's environmental impact.

For starters, he called it "unfair" to compare the energy costs of training an AI model "to how much it costs a human to do one inference query." That's because, as Altman explains, "it also takes a lot of energy to train a human."

"It takes like 20 years of life and all of the food you eat during that time before you get smart," Altman continued. "And not only that, it took the very widespread evolution of the 100 billion people that have ever lived and learned not to get eaten by predators and learned how to figure out science and whatever, to produce you."

Measured that way, "probably AI has already caught up on an energy efficiency basis" to humans, Altman said.

Altman also fumed against claims about AI's water consumption. "Water is totally fake," he began, almost taunting quote-miners. "It used to be true, we used to do evaporative cooling in data centers."

"But now that we don't do that," Altman said, you still see claims like "'don't use ChatGPT, it's 17 gallons of water for each query,' or whatever." "This is completely untrue and totally insane," he asserted. "No connection to reality."

No one can deny that humans are costly to bring up in our industrialized age. We should be doing everything realistically possible to bring down our CO2 emissions and stop eating so much meat -- but we aren't, for a number of dispiriting systemic reasons we won't get into today.

Regardless, at least those costs are going towards keeping human civilization ticking. All the water in agriculture will keep someone fed, and the fossil fuels we burn will keep someone warm.

What is the power consumption of AI models going towards? Creating unreliable, hallucination-spouting oracles? Algorithms that churn out bastardized amalgamations of existing writing and works of art? The mass proliferation of fake images and misinformation? Cloying companions that will egg you down your suicidal spiral?

Maybe AI's usefulness beyond the spurious justification of mass layoffs will become clearer as the tech gets further along and the fog of hype dissipates. But right now, the tech isn't even close to living up to Silicon Valley's data-center-sized promises, while the industry remains frustratingly opaque about its environmental toll.

If AI is as energy efficient as Altman claims -- caught up to humans, in fact -- how come the likes of OpenAI, Microsoft, and Amazon don't disclose their energy bills, their CO2 emissions, and their water consumption related to AI?

These critiques are often swatted aside with the nebulous and breathless assertion that AI will help solve climate change and other challenges facing human civilization. Now, Altman's new playbook, it seems, is to make you feel bad for being alive.
[12]
Human Existence Is Just as Wasteful as AI Data Centers, Sam Altman Suggests - Decrypt
The OpenAI chief doubled down on encouraging a rapid shift to nuclear, wind, and solar.

Sam Altman would like you to know that your childhood was expensive. Terribly expensive. And frankly, it's unclear whether humans are worth all the time and resources.

Last week, at the AI Impact Summit in India, OpenAI's CEO offered a staggering take in the AI energy debate -- by pointing out that humans, not AI, are basically the problem. Asked about ChatGPT's environmental footprint, Altman didn't apologize or hedge. Instead, he compared the immense amount of energy needed to raise humans to the energy demands of an AI data center -- and suggested the data centers are getting more efficient.

"It also takes a lot of energy to train a human," he told The Indian Express. "It takes like 20 years of life and all of the food you eat during that time before you get smart. And not only that, it took the very widespread evolution of the hundred billion people that have ever lived and learned not to get eaten by predators... to produce you."

His conclusion: "Probably AI has already caught up on an energy-efficiency basis, measured that way."

The internet, predictably, did not respond with applause. Indian billionaire business magnate and Zoho co-founder Sridhar Vembu -- who was physically in the room -- immediately posted on X: "I do not want to see a world where we equate a piece of technology to a human being."

Reddit users weighed in, unsurprisingly, calling Altman "sickeningly evil" and "anti-human." One user wrote that Altman "literally doesn't seem to understand that human life has value beyond whatever cost/benefit analysis he applies to implementing lines of code." On social media, Altman faced a slew of slings and memes from various parties.

Tech analyst Max Weinback put it more diplomatically, saying that reducing people to "cost per output" while ignoring "the value of humanity itself" is, in his words, "a bad path." That's one way to say it.

This is not a new pattern. Sam Altman has previously said that "AI will probably most likely lead to the end of the world, but in the meantime, there'll be great companies." Altman also said he loses sleep over whether launching ChatGPT "was really bad." He testified before Congress about AI enabling bioweapons and mass disinformation. And he co-signed a statement declaring that mitigating AI extinction risk "should be a global priority alongside pandemics and nuclear war."

And yet, Altman and OpenAI want to achieve artificial general intelligence (AGI), and when someone asks him about the electricity bill, his move is: Have you considered how much energy it took to raise you?

Altman did say one thing that might yield broad agreement: that the world "needs to move towards nuclear, wind, and solar very quickly." Note that Altman chairs Oklo, a nuclear startup. Whether that makes the recommendation more credible or self-serving may depend on how much trust you have left in Altman after he compared your childhood to a training run.
[13]
Sam Altman is tired of 'unfair' critiques about AI's energy use. Climate experts say his defensive stance is misguided
OpenAI CEO Sam Altman has defended the resource-intensive use of AI by comparing it to all the energy -- and food -- that humans require, sparking a wave of backlash across social media. That comparison, experts in climate and tech spaces say, is misguided, downplays the climate risks associated with AI, and illustrates the disconnect between tech CEOs and the rest of society. Altman's comments came while speaking to the Indian Express at the India AI Impact summit. The outlet asked him to address some of the common criticisms of AI, including the amount of energy and water the technology requires. "One of the things that is always unfair in this comparison is people talk about how much energy it takes to train an AI model relative to how much it costs a human to do one inference query," Altman says.
[14]
OpenAI boss says AI is energy efficient because humans take '20 years' to get smart
TL;DR: OpenAI CEO Sam Altman defended AI's high energy use by comparing it to the extensive energy humans consume over a lifetime to learn and perform tasks. He argued that once trained, AI models like ChatGPT are more energy-efficient per query than humans, highlighting AI's long-term efficiency despite initial training costs.

Recently, OpenAI CEO Sam Altman sat down for a lengthy interview with The Indian Express, where he gave a rather strange and cold response to a question about the energy required to train complex AI models. This has become a significant concern in many markets, as AI energy consumption is now dwarfing that of most, if not all, other industries.

And with that, the underlying approach of Sam Altman's bizarre response is more or less the idea that if a human can complete a task in a few seconds or minutes versus an AI model that took vast amounts of energy -- more than, say, a small city -- to train, what's the point or benefit?

"One of the things that is always unfair in this comparison, where people talk about how much energy it takes to train an AI model relative to how much it costs for one human to do an inference query," Sam Altman says. "It also takes a lot of energy to train a human; it takes 20 years of life, and all of the food that you eat during that time, before you get smart."

He then goes on to infer that when you add in the entirety of human evolution, covering everything from being chased by predators in the prehistoric era to the development of science and an understanding of the universe, then yeah, AI is actually a lot more efficient at what it does than humanity as a whole. Or something along those lines.

On one hand, it's an impressive demonstration of verbal gymnastics to go from comparing the energy consumption of modern AI data centers drawing more power than the cities they're situated in to all of the energy it took humanity to get to this point. And from there, implying that a single "inference" or AI-like task requires decades of energy for a human to produce a result. Aside from making it sound like people only exist to complete tasks as part of a white-collar job, it is quite the leap.

"The fair comparison," Sam Altman adds, "if you ask ChatGPT a question, how much energy does it take once a model is trained to answer that question, versus a human? Probably, AI has already caught up on energy efficiency when measured that way."

"That way," in this case, seems to hypocritically account for the person's entire life of energy consumption and "training," but not the AI model's.
[15]
Sam Altman Defends A.I. Energy Use With Human Comparison, Sparking Debate
The OpenAI CEO dismissed water-footprint claims as "insane," conceded energy concerns and called for nuclear expansion to power data centers.

Sam Altman is pushing back on mounting criticism over the environmental toll of A.I. The OpenAI chief has dismissed claims about A.I.'s water consumption as "fake" and drawn comparisons between the electricity required to power A.I. systems and the energy it takes to develop human intelligence.

Figures suggesting that tools like ChatGPT consume multiple gallons of water per query are "totally insane" and have "no connection to reality," Altman said in a Feb. 20 interview with The Indian Express on the sidelines of the AI Impact Summit in New Delhi. Last year, Altman claimed that ChatGPT uses 0.000085 gallons of water per query -- roughly one-fifteenth of a teaspoon -- though he did not explain how he calculated that figure.

A.I.'s water footprint largely stems from the need for evaporative cooling systems used to keep data center hardware from overheating. But Altman argued that companies like OpenAI are no longer directly managing such cooling processes. Many A.I. developers, he noted, are shifting toward cooling systems that recirculate liquid rather than continually drawing fresh supplies. Meanwhile, tech giants like Microsoft, Meta, Google and Amazon have pledged to replenish more water than they withdraw by 2030.

Even so, data centers continue to drink up water at a rapid pace. Total A.I.-related water consumption for cooling reached 23.7 cubic kilometers in 2025, a 38 percent increase over 2020, and is expected to more than triple over the next 25 years, according to a January report from Xylem. Despite the industry's pivot to alternative methods, the report found that 56 percent of data center capacity still relies on some form of evaporative cooling.

Altman was more measured when it came to electricity usage. "What is fair, though, is the energy consumption," he said. "We need to move towards nuclear, wind, and solar very quickly." Last April, the International Energy Agency reported that data centers accounted for roughly 1.5 percent of global electricity consumption in 2024. Their power use is rising at a rate more than four times faster than overall electricity demand and is expected to more than double by 2030. In response, major tech companies are pursuing data center agreements tied to alternative energy sources, including nuclear power, to ease pressure on grids. Altman, who previously led Y Combinator, has personally invested in nuclear ventures such as Oklo, which is developing small-scale nuclear plants, and Helion, which aims to commercialize nuclear fusion.

The OpenAI CEO also argued that critics overlook the energy required to develop human intelligence. "People talk about how much energy it takes to train an A.I. model relative to how much it costs a human to do one inference query," he said. "But it also takes a lot of energy to train a human -- it takes, like, 20 years of life and all the food you eat during that time before you get smart." A more appropriate comparison, he suggested, would measure the energy used by a fully trained A.I. model to answer a question against that used by a human doing the same task. "Probably A.I. has already caught up on an energy efficiency basis measured that way."

The remarks quickly sparked debate online over whether such comparisons are appropriate.
"He's saying a really big spreadsheet and a baby are morally equivalent," wrote Matt Stoller, research director of the American Economic Liberties Project, in a post on X. Sridhar Vembu, founder and chief scientist of software firm Zoho Corporation, also took issue with the OpenAI chief's statements. A.I. should "quietly recede into the background" instead of dominating our lives, said the billionaire on X. "I do not want to see a world where we equate a piece of technology to a human being."
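Two of the disputed per-query numbers in this debate are easy to check with back-of-the-envelope arithmetic. A short Python sketch; the teaspoons-per-gallon conversion and the ~12.7 Wh iPhone battery capacity are my assumptions for illustration, not figures from any of the articles:

```python
# Check Altman's "one-fifteenth of a teaspoon" water figure, and compare his
# per-query energy claim against the reported Gates estimate of 1.5 iPhone charges.

TSP_PER_GAL = 768  # US teaspoons per US gallon

water_gal_per_query = 0.000085
tsp_per_query = water_gal_per_query * TSP_PER_GAL  # fraction of a teaspoon per query
one_over = 1 / tsp_per_query                       # "one-fifteenth" if this is ~15

battery_wh = 12.7                  # assumed iPhone battery capacity in watt-hours
gates_estimate_wh = 1.5 * battery_wh
altman_estimate_wh = 0.34
ratio = gates_estimate_wh / altman_estimate_wh     # gap between the two claims
```

The water figure works out to about 0.065 teaspoons, i.e. roughly one fifteenth of a teaspoon, so Altman's two numbers are at least internally consistent. The two energy claims, by contrast, differ by a factor of around 56 under these assumptions, which is the gap behind his "no way it's anything close to that much" response to the iPhone comparison.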
[16]
Sam Altman says AI may be energy-intensive, but humans consume a lot of energy, too - The Economic Times
OpenAI CEO Sam Altman doubled down on his argument that artificial intelligence (AI) should not be singled out for its energy and water footprint, since humans also consume a lot of energy and resources to become "smart". Speaking at an Indian Express event in New Delhi, Altman said public debate often fixates on the electricity used to train large AI models while ignoring the cost of human intelligence.

"People talk about how much energy it takes to train an AI model," he said. "But it also takes a lot of energy to train a human. It takes like 20 years of life and all of the food you eat during that time before you get smart." He later invoked human evolution itself, noting that "the 100 billion people that have ever lived" and learned "not to get eaten by predators" were part of the process that produced modern human capabilities.

During the session, Altman dismissed viral claims that each ChatGPT query requires the same energy as charging a smartphone, calling them "totally fake." He also deemed as misinformation the reports of each ChatGPT query using up about 17-18 gallons of water. Citing his own numbers from a previous blog post, he said an average ChatGPT request uses around 0.34 watt-hours of electricity and roughly 0.000085 gallons of water, or about one-fifteenth of a teaspoon. That water goes towards data-centre cooling, which environmental groups have flagged for its impact on local water supplies.

Altman said AI is a resource-intensive technology, and framed it as part of a broader transition to abundant clean energy. However, he maintained that the answers to these energy questions still lie on Earth, calling the idea of putting data centres in space "ridiculous" for now, in a veiled dig at xAI's Elon Musk. He said the economics and logistics just don't make sense yet. The comment comes just weeks after Musk announced a merger of SpaceX and xAI to build orbital computing facilities for his AI models.

"If you just do the rough math of launch costs relative to the cost of power we can do on Earth, to say nothing of how you're gonna fix a broken GPU in space, we are not there yet," said Altman.

Many said the OpenAI chief's comments underplay local impacts such as higher electricity prices, water stress around data-centre clusters, and the opportunity cost of diverting large chunks of grid capacity to AI instead of other uses. Zoho founder Sridhar Vembu took to X to share his opinion, dismissing a world where "we equate a piece of technology to a human being". "I work hard as a technologist to see a world where we don't allow technology to dominate our lives, instead it should quietly recede into the background," Vembu wrote.

Some also see Altman's "significant fraction of Earth's power should go to AI" stance and his push for multi-gigawatt AI data centres as symbolic of an aggressive pro-growth tech mindset, one that will run up against overarching climate concerns.
[17]
OpenAI's Sam Altman blasts AI concerns around water usage as 'fake:'...
OpenAI CEO Sam Altman dismissed concerns that artificial intelligence is using up lots of water as "fake" - arguing that "humans use energy, too." The billionaire tech founder responded to claims widely circulated online that OpenAI's ChatGPT uses gallons of water to spit out responses to simple queries.

Speaking on the sidelines at the India AI Impact summit on Friday, Altman called those claims about water usage "completely untrue, totally insane," adding they have "no connection to reality," according to the Indian Express.

Altman also pushed back on previous comments by Microsoft founder Bill Gates, who suggested that the efficiency of the human brain implies that AI can also become more energy efficient over time. "One of the things that is always unfair in this comparison is people talk about how much energy it takes to train an AI model," Altman said. "But it also takes a lot of energy to train a human." "It takes like 20 years of life, and all the food you eat before that time, before you get smart."

He continued: "The fair comparison is if you ask ChatGPT a question, how much energy does it take once a model is trained to answer that question, versus a human, and probably AI has already caught up on an energy efficiency basis, measured that way."

Though he dismissed concerns about water usage, Altman said fears around overall energy consumption are fair. "Not per query, but in total - because the world is using so much AI ... and we need to move towards nuclear or wind and solar very quickly," he said.

By 2023, soon after the release of ChatGPT, electricity consumption by global data centers had already reached levels comparable to the entirety of Germany or France, according to a May report by the International Monetary Fund.

Altman quickly faced online backlash over his comments, as fellow techies blasted the comparison of humans to AI bots. "I do not want to see a world where we equate a piece of technology to a human being," billionaire Sridhar Vembu, co-founder of Indian software firm Zoho Corporation, wrote in a post on X following the summit.

Tech companies have been pouring billions of dollars into artificial intelligence, and governments are following suit. President Trump recently unveiled a "tech corps" to spread American AI overseas and recruit and train engineers. Last year, the Trump administration unveiled the Stargate project with an initial investment of $100 billion to build massive data centers throughout the US, though progress has reportedly been slow.

Environmental groups have been pushing back against government attempts to speed up the approval processes for data center construction. Local communities have also been protesting new data centers in their neighborhoods over concerns the infrastructure will strain electricity grids, raise costs and pressure nearby water systems. A proposed $1.5 billion data center project was struck down by the City Council in San Marcos, Texas, last week following public outcry.

Power-hungry data centers require large amounts of water to cool electrical systems and prevent overheating and fires - using roughly one bottle of water to respond to each 100-word AI prompt, according to scientists at the University of California, Riverside. A mid-sized data center consumes about 300,000 gallons of water a day - roughly the same amount as 1,000 US households, according to a paper by research scientist Arman Shehabi at Lawrence Berkeley National Laboratory.

Some newer data centers don't rely on water usage at all - but despite the advancement, the amount of water used for cooling purposes is expected to more than triple over the next 25 years as computing demand soars, according to a report last month from Xylem and Global Water Intelligence.
OpenAI CEO Sam Altman sparked debate at an India AI summit by dismissing water usage concerns as "totally fake" and arguing that common AI energy efficiency comparisons are unfair. He contends that training a human takes 20 years of life and food, and that a trained AI model likely answers a query more efficiently than a human does, though critics warn against equating technology with human life.
OpenAI CEO Sam Altman addressed mounting criticism about the environmental impact of AI during a Q&A session at the India AI Impact summit hosted by The Indian Express [1]. His remarks, particularly about AI water consumption and comparisons to human energy use, have ignited debate across the tech industry.

Altman declared that concerns about AI's water usage are "totally fake," though he acknowledged it was a real issue when OpenAI "used to do evaporative cooling in data centers" [1]. He dismissed viral claims that ChatGPT uses 17 gallons of water per query as "completely untrue, totally insane, no connection to reality" [2]. However, a report from water technology company Xylem and Global Water Intelligence projected that water drawn for data center cooling would more than triple over the next 25 years as computing demand rises [5].

The OpenAI leader argued that discussions about AI energy consumption are "unfair," especially when focusing on "how much energy it takes to train an AI model, relative to how much it costs a human to do one inference query" [1]. Altman contended that "it also takes a lot of energy to train a human," requiring "like 20 years of life and all of the food you eat during that time before you get smart" [3]. He extended this comparison to human evolution, noting it took "the very widespread evolution of the 100 billion people that have ever lived and learned not to get eaten by predators and learned how to figure out science and whatever, to produce you" [4]. His proposed fair comparison focuses on inference (once a model is trained) versus a human answering the same question. "Probably, AI has already caught up on an energy efficiency basis, measured that way," he asserted [5].

While dismissing some criticisms, Altman conceded that AI energy consumption concerns are "fair" -- "not per query, but in total, because the world is now using so much AI" [1]. He emphasized the need to "move towards nuclear or wind and solar very quickly" [2]. According to a May report by the International Monetary Fund, electricity consumption by the world's data centers in 2023 had already reached levels comparable to Germany or France [5]. Data centers have been connected to rising electricity prices, and there is no legal requirement for tech companies to disclose how much energy and water they use [1]. Local communities have begun pushing back: the City Council in San Marcos, Texas, recently voted down a proposed $1.5 billion data center project after public outcry [5].

Altman's comparison to human energy use hasn't been well received by everyone. Sridhar Vembu, co-founder of Indian software company Zoho Corporation, who attended the event, tweeted: "I do not want to see a world where we equate a piece of technology to a human being" [2]. Some commentators argue that Altman is dehumanizing by reducing childhood, learning, and growth to their energy inputs [3]. Critics also note logical inconsistencies in Altman's expanded-timeline comparison, pointing out that AI computing should also account for the prior energy cost of human evolution, since "aliens didn't provide the blueprints for ENIAC" [3]. When asked about Bill Gates' claim that a single ChatGPT query uses the equivalent of 1.5 iPhone battery charges, Altman replied, "There's no way it's anything close to that much" [4]. The debate highlights tensions between rapid AI expansion and AI's environmental impact, with governments working to speed up approval processes for new energy sources while environmentalists warn such moves could clash with global net-zero goals [5]. As AI resource usage continues to grow, the industry faces mounting pressure to balance innovation with sustainability.

Summarized by Navi