5 Sources
[1]
You're Thinking About AI and Water All Wrong
Last month, journalist Karen Hao posted a Twitter thread in which she acknowledged that there was a substantial error in her blockbuster book Empire of AI. Hao had written that a proposed Google data center in a town near Santiago, Chile, would require "more than one thousand times the amount of water consumed by the entire population" -- a figure which, thanks to a unit misunderstanding, appears to have been off by a factor of 1,000. In the thread, Hao thanked Andy Masley, the head of an effective altruism organization in Washington, DC, for bringing the correction to her attention. Masley has spent the past several months questioning some of the numbers and rhetoric common in popular media about water use and AI on his Substack. Masley's main post, titled "The AI Water Issue Is Fake," has been linked in recent months by other writers with large followings, including Matt Yglesias and Noah Smith. (Hao said in her Twitter thread that she would be working with her publisher to fix the errors; her publicist told me she was taking time off and was unavailable to chat with me for this story.)

When I called him to talk more about AI and water, Masley emphasized that he's not an expert, but "just some guy" interested in how the media was handling this topic -- and how it was shaping the opinions of people around him. "I would sometimes bring up that I used ChatGPT at parties, and people would be, like, 'Oh, that uses so much energy and water. How can you use that?'" he says. "I was a little bit surprised when people would be talking so grimly about just a little bit of water."

As local and national opposition to data centers has grown, so, too, have concerns about their environmental impacts. Earlier this week, more than 230 green groups sent a letter to Congress, warning that AI and data centers are "threatening Americans' economic, environmental, climate and water security." The AI industry has started fighting back.
In November, the cochairs of the AI Infrastructure Coalition, a new industry group, authored an op-ed for Fox News that touched on environmental worries. "Water usage? Minimal and often recycled -- less than America's golf courses," they wrote. One of the authors of the op-ed, former Arizona senator Kyrsten Sinema, is currently advocating in favor of a data center project in the state that has prompted local pushback, including because of concerns about water use. The coalition also approvingly retweeted a post from Masley on the impact of AI on energy prices. (Masley maintains an exhaustive disclaimer on his Substack denying allegations that he's being paid by industry to share his opinions.)
[2]
AI's water and electricity use soars in 2025
AI created as much carbon pollution this year as New York City and guzzled up as much H2O as people consume globally in water bottles, according to new estimates. The study paints what's likely a pretty conservative picture of AI's environmental impact, since it's based on the relatively limited amount of data that's currently available to the public. A lack of transparency from tech companies makes it harder to see the potential environmental toll of AI becoming a part of everyday tasks, argues the author of the study, who's been tracking the electricity consumption of data centers used for AI and crypto mining over the years. "There's no way to put an extremely accurate number on this, but it's going to be really big regardless... In the end, everyone is paying the price for this," says Alex de Vries-Gao, a PhD candidate at the VU Amsterdam Institute for Environmental Studies who published his paper today in the journal Patterns. To crunch these numbers, de Vries-Gao built on earlier research that found that power demand for AI globally could reach 23GW this year -- surpassing the amount of electricity used for Bitcoin mining in 2024. While many tech companies divulge total numbers for their carbon emissions and direct water use in annual sustainability reports, they don't typically break those numbers down to show how many resources AI consumes. De Vries-Gao found a work-around by using analyst estimates, companies' earnings calls, and other publicly available information to gauge hardware production for AI and how much energy that hardware likely uses. Once he figured out how much electricity these AI systems would likely consume, he could use that to forecast the amount of planet-heating pollution they would likely create. That came out to between 32.6 and 79.7 million tons annually. For comparison, New York City emits around 50 million tons of carbon dioxide annually.
Data centers can also be big water guzzlers, an issue that's similarly tied to their electricity use. Water is used in cooling systems for data centers to keep servers from overheating. Power plants also demand significant amounts of water to cool equipment and turn turbines using steam, which makes up the majority of a data center's water footprint. The push to build new data centers for generative AI has also fueled plans to build more power plants, which in turn use more water (and create more greenhouse gas pollution if they burn fossil fuels). AI could use between 312.5 and 764.6 billion liters of water this year, according to de Vries-Gao. That reaches even higher than a previous study, conducted in 2023, which estimated that water use could be as much as 600 billion liters in 2027. "I think that's the biggest surprise," says Shaolei Ren, one of the authors of that 2023 study and an associate professor of electrical and computer engineering at the University of California, Riverside. "[de Vries-Gao's] paper is really timely... especially as we are seeing increasingly polarized views about AI and water," Ren adds. Across the US, which has more of these facilities than any other country in the world, there's been a surge in local opposition to new data center projects that is often driven by concerns about water and power usage. Even with the higher projection for water use, Ren says de Vries-Gao's analysis is "really conservative" because it only captures the environmental effects of operating AI equipment -- excluding the additional effects that accumulate along the supply chain and at the end of a device's life. There's a pretty wide range of outcomes because companies are failing to disclose more accurate data. De Vries-Gao gathered whatever information he could from sustainability reports, but found that they often exclude key details, like indirect water consumption from electricity demand and how much is used for AI specifically.
Emissions and water consumption can vary depending on where a data center is located and how dirty the local power grid is there, so being more forthcoming about where companies operate or plan to build new data centers would also shine a greater light on AI's growing environmental impact. "We can really ask ourselves, is this how we want it to be? Is this fair?" de Vries-Gao says. "We really need to have that transparency, so we can start having that discussion."
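The ranges reported here are easy to sanity-check. A rough back-of-envelope (my own, not from the paper), assuming the projected 23GW of AI power demand runs continuously for a full year:

```python
# Back-of-envelope check of de Vries-Gao's reported ranges (illustrative only).
HOURS_PER_YEAR = 8760

power_gw = 23                                    # projected global AI power demand
energy_twh = power_gw * HOURS_PER_YEAR / 1000    # GW * hours -> GWh -> TWh
# 23 GW running continuously is roughly 201.5 TWh per year

# Implied grid carbon intensity for the reported emissions range:
emissions_mt = (32.6, 79.7)                      # million tonnes CO2 per year
kg_per_kwh = tuple(mt * 1e9 / (energy_twh * 1e9) for mt in emissions_mt)
# -> roughly 0.16 to 0.40 kg CO2 per kWh, a plausible span of grid mixes

# At ~2 liters per kWh (the MIT figure quoted elsewhere in this piece),
# water use lands inside the reported 312.5-764.6 billion liter range:
water_billion_liters = energy_twh * 1e9 * 2 / 1e9

print(round(energy_twh), [round(k, 2) for k in kg_per_kwh], round(water_billion_liters))
```

The point is not precision -- it is that the paper's emissions and water ranges are mutually consistent with its own 23GW power figure under ordinary grid-intensity and cooling assumptions.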
[3]
AI surpasses 2024 Bitcoin mining in energy usage, uses more H2O than the bottles of water people drink globally, study claims -- says AI demand could hit 23GW and up to 764 billion liters of water in 2025
A study by Alex de Vries-Gao from the VU Amsterdam Institute for Environmental Studies indicates that global AI power demand could hit 23GW, while also consuming 312.5 to 764.6 billion liters of water this year. According to The Verge, this is greater than the amount of power Bitcoin mining used in all of 2024, while consuming about the same volume that the entire world consumes in bottled water in a year. Although these numbers might seem massive, they are still conservative estimates, especially as big tech companies do not break down the numbers in their annual sustainability reports to show the actual consumption of their AI operations. "There's no way to put an extremely accurate number on this, but it's going to be really big regardless," Alex de Vries-Gao told The Verge. "In the end, everyone is paying the price for this." So, to get these values, de Vries-Gao used estimates from analysts and matched them with data from earnings calls and other publicly available information to extrapolate how much AI hardware these companies have deployed, as well as their energy usage. With that value, he then estimated that these AI systems produced between 32.6 and 79.7 million tons of carbon dioxide annually, or about 56 million tons on average -- by comparison, Singapore's greenhouse gas emissions sat at 53 million tons in 2022. The amount of resources that AI data centers use has some U.S. lawmakers concerned about the impact on the average American, with Senator Elizabeth Warren, alongside two other Democratic senators, sending a letter to seven big tech companies asking them to explain their energy consumption. Senator Bernie Sanders is taking it even further, proposing a complete moratorium on all AI data center construction to ensure that the technology benefits "all of us, not just the 1%". Nevertheless, President Donald Trump is still pushing for AI development to ensure that the U.S.
remains at the forefront of this burgeoning technology, even going as far as comparing his "Genesis Mission" to the Manhattan Project of World War 2. Professor Shaolei Ren of the University of California, Riverside, even told the publication that de Vries-Gao's numbers could be understated, especially as they only consider actual operations. The estimates could increase substantially once we consider the environmental impact of the entire supply chain -- from mining to fabrication, deployment, and the eventual disposal of the billions of AI chips being used in today's race.
[4]
2025's AI boom caused huge CO2 emissions and use of water, research finds
Study's author says society, not tech companies, is paying for environmental impact of AI and asks if this is fair. The AI boom has caused as much carbon dioxide to be released into the atmosphere in 2025 as emitted by the whole of New York City, it has been claimed. The global environmental impact of the rapidly spreading technology has been estimated in research published on Wednesday, which also found that AI-related water use now exceeds the entirety of global bottled-water demand. The figures have been compiled by the Dutch academic Alex de Vries-Gao, the founder of Digiconomist, a company that researches the unintended consequences of digital trends. He claimed they are the first attempt to measure the specific effect of artificial intelligence rather than datacentres in general as the use of chatbots such as OpenAI's ChatGPT and Google's Gemini soared in 2025. The figures show the estimated greenhouse gas emissions from AI use are also now equivalent to more than 8% of global aviation emissions. His study used technology companies' own reporting and he called for stricter requirements for them to be more transparent about their climate impact. "The environmental cost of this is pretty huge in absolute terms," he said. "At the moment society is paying for these costs, not the tech companies. The question is: is that fair? If they are reaping the benefits of this technology, why should they not be paying some of the costs?" De Vries-Gao found that the 2025 carbon footprint of AI systems could be as high as 80m tonnes, while the water used could reach 765bn litres. He said it was the first time AI's water impact had been estimated and showed that AI water use alone was more than a third higher than previous estimates of all datacentre water use. The figures are published in the academic journal Patterns.
The International Energy Agency (IEA) said earlier this year that AI-focused datacentres draw as much electricity as power-thirsty aluminium smelters and datacentre electricity consumption is expected to more than double by 2030. "This is yet more evidence that the public is footing the environmental bill for some of the richest companies on Earth," said Donald Campbell, the director of advocacy at Foxglove, a UK non-profit that campaigns for fairness in tech. "Worse, it is likely just the tip of the iceberg. The datacentre construction frenzy, driven by generative AI, is only getting started. "Just one of these new 'hyperscale' facilities can generate climate emissions equivalent to several international airports. And in the UK alone, there are an estimated 100-200 of them in the planning system," said Campbell. The IEA has reported that the largest AI-focused datacentres being built today will each consume as much electricity as 2m households with the US accounting for the largest share of datacentre electricity consumption (45%) followed by China (25%) and Europe (15%). The largest datacentre being planned in the UK, at a former coal power station site in Blyth, Northumberland, is expected to emit more than 180,000 tonnes of CO2 a year when at full operation - the equivalent to the amount produced by more than 24,000 homes. In India, where $30bn (£22.5bn) is being invested in datacentres, there are growing concerns that a lack of reliability from the National Grid will mean the construction of huge diesel generator farms for backup power, which the consultancy KPMG this week called "a massive ... carbon liability". Technology companies' environmental disclosures are often insufficient to assess even the total datacentre impact, never mind isolating AI use, said De Vries-Gao. He noted that when Google recently reported on the impact of its Gemini AI, it did not account for the water used in generating the electricity needed to power it. 
Google reported that in 2024 it managed to reduce energy emissions from its datacentres by 12% due to new clean energy sources, but it said this summer that achieving its climate goals was "now more complex and challenging across every level - from local to global" and "a key challenge is the slower-than-needed deployment of carbon-free energy technologies at scale".
[5]
How Much Water Does AI Actually Use? Depends Who You Ask.
Yes, every question you ask AI uses up water -- and many are worried. A recent University of Chicago survey revealed that 4 in 10 U.S. adults are "extremely" worried about artificial intelligence's (AI) environmental impact -- and the Associated Press reports that many U.S. citizens oppose data centers entering their community. Much of the environmental concern centers on AI's water usage. But scoping out the scale of its use can be difficult. According to a Morgan Stanley report, AI data centers' global annual water consumption is set to reach 1,068 billion liters by 2028, an estimate 11 times higher than last year's projection. For context, each individual American uses roughly 243,174 liters (64,240 gallons) a year. Yet, earlier this year OpenAI CEO Sam Altman claimed each ChatGPT query only uses "about 0.000085 gallons of water; roughly one fifteenth of a teaspoon." And while each claim paints a vastly different picture, neither of them is technically wrong, as science YouTuber Hank Green explained in a recent video. The answer behind the differences boils down to when we start running the meter on AI's water use. "The actual lie is that it's only counting the water used during the part of the life cycle when you're actually querying the chatbot, not all of the other parts of the process that are necessary for this system to exist," Green explains. AI queries are processed inside data centers, massive buildings holding hundreds or thousands of racks of high-powered processing chips -- GPUs. It is here that AI models run their calculations and return responses to users via the cloud. All that electricity carrying data generates a lot of heat. Without a cooling element, the GPUs will overheat and fail. Cooling a data center is often done through either an evaporative or a closed-loop cooling system, both involving water.
"The GPUs and the AI servers, they generate orders of magnitude more heat than the traditional [data centers], so you cannot use traditional data center air-based cooling, like air conditioning," says Pankaj Sachdeva, senior partner at McKinsey, adding that liquid is simply more effective than air at heat exchange. "Getting heat off the chip and getting heat out of the building are two different engineering problems with two different supply chains that happen to meet in the middle," says Brandon Daniels, CEO of Exiger, one of the largest AI supply chain technology providers in the US. "The problem today is that cooling these new servers requires intricate systems that need expensive fluids, tons of good quality water, heavy filtration, and a massive amount of power." Traditional evaporative cooling towers are extremely power-efficient, but they achieve that by evaporating very large volumes of water, Daniels explains. On the other hand, he says, dry coolers and air-cooled chillers use almost no water, but they have to work harder -- so the electrical footprint increases. For server cooling, closed-loop, direct-to-chip cooling solutions are vital to keeping the GPUs at the optimal temperature. "We're moving toward rear-door heat exchangers, Direct-to-Chip (D2C) cold plates that bring coolant right to the processor package, and immersion systems where entire servers live in dielectric fluids," says Daniels. "That's not a luxury upgrade anymore. For leading-edge AI clusters, liquid is the precondition for the densities everyone is trying to build." But as industry giants like Nvidia promise to deliver higher-performance processing units, their deployment will require more advanced cooling solutions. "Some of the largest, most technologically advanced companies in the world are involved in trying to solve this problem. So there's a lot of horsepower behind it," says Don Schuett, SVP of Data Center Business Development at TierPoint.
TierPoint manages over 30 data centers across the country. "That gives me some optimism that they continue to evolve solutions that allow us to keep up with Nvidia." Chris Stokel-Walker wrote about many of the companies trying to solve this problem in an article for Inc. Some estimates say that two liters of water might be needed for every kilowatt-hour of energy, MIT Computing and Climate fellow Noman Bashir told MIT News. Though Altman says a fraction of a teaspoon of water is used per AI query, Green suggests that a more honest accounting of water consumption would include the water used in the process of creating the AI itself -- which is how Morgan Stanley arrived at its estimate of 1,068 billion liters consumed annually by 2028. The report outlines the technology's water footprint as coming not only from on-site data center cooling, but also from the electricity generation and semiconductor manufacturing needed to create the GPUs. By tracing the water consumption to every part necessary to answer a query, the numbers quickly add up. While the numbers are contradictory and confusing at times, even the most liberal water-use estimates look minor in comparison to other industries. "Even under the maximalist goals of AI companies the projected increase of water use is small compared to what cities and industries already use," Green said. The Morgan Stanley report agrees, noting that "while the increase in AI's water consumption is expected to be substantial, the absolute amount remains modest compared with traditional global water withdrawals across major sectors." For instance, Green highlights the water footprint of existing resource-intensive industries in the US, particularly agriculture. In one instance, the local outlet Red Canary Magazine estimated that in Maricopa County, Arizona, upwards of 177 million gallons of water are used to cool data centers daily.
Yet that would only account for 30 percent of what is used by the agricultural industry. Still, it is pertinent to monitor the water consumption and energy use of the growing technology, although AI companies like OpenAI are making it harder for others to develop solutions. "OpenAI doesn't share this information, which is part of why it is so easy to get numbers that are both fairly correct and very different from each other," Green said. "And part of why it's so easy to lie about this from either direction."
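Altman's per-query figure is at least internally consistent. A quick unit check (standard US customary conversions; the arithmetic is mine, not from the article):

```python
# Check Altman's claim that 0.000085 gallons is roughly one fifteenth of a teaspoon.
GALLON_L = 3.785411784       # liters per US gallon
TEASPOON_ML = 4.92892159375  # milliliters per US teaspoon

per_query_ml = 0.000085 * GALLON_L * 1000   # gallons -> liters -> mL, ~0.32 mL
fraction_of_tsp = per_query_ml / TEASPOON_ML

# 1 / fraction_of_tsp comes out just over 15, matching "one fifteenth of a teaspoon"
print(round(per_query_ml, 3), round(1 / fraction_of_tsp, 1))
```

So the dispute is not about the conversion but about scope: the per-query number and the hundreds of billions of liters both survive scrutiny once you note which parts of the life cycle each one counts.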
A new study estimates AI could consume up to 765 billion liters of water in 2025, matching global bottled water consumption. But a lack of transparency from tech companies makes accurate measurement difficult, sparking debate between researchers and industry advocates over AI's true environmental impact.
AI water consumption has become a flashpoint in debates over the technology's environmental footprint. New research by Alex de Vries-Gao from the VU Amsterdam Institute for Environmental Studies estimates that AI systems could use between 312.5 and 764.6 billion liters of water in 2025 [2][3]. This figure exceeds the volume of water people consume globally in bottles annually, marking a significant milestone in AI's resource usage. The study, published in the journal Patterns, represents the first comprehensive attempt to isolate AI's environmental impact from general data center operations [4].
Source: Inc.
Data centers require massive amounts of water for cooling systems to prevent servers from overheating. Water is also consumed by power plants that generate electricity for these facilities, accounting for the majority of a data center's water footprint [2]. AI's electricity and water consumption extends beyond direct operational needs to include semiconductor manufacturing and electricity generation throughout the supply chain [5].
Source: Tom's Hardware
AI's environmental impact extends to CO2 emissions and water use on a scale comparable to major cities. De Vries-Gao's research indicates that AI could generate between 32.6 and 79.7 million tons of carbon pollution annually, roughly equivalent to New York City's entire carbon footprint of 50 million tons [2][4]. These greenhouse gas emissions now represent more than 8% of global aviation emissions [4].

Power demand for AI globally could reach 23GW this year, surpassing the energy consumed by Bitcoin mining in 2024 [2][3]. Across the US, which hosts more data centers than any other country, local opposition to new projects has surged, driven largely by concerns about water and power usage [2]. More than 230 environmental groups sent a letter to Congress warning that AI threatens Americans' water security [1].

A critical challenge in assessing AI's true environmental costs stems from the lack of transparency from tech companies. While major tech firms publish annual sustainability reports, they typically don't break down figures to show how much AI specifically consumes [2]. Google, OpenAI, and other companies often exclude key details like indirect water consumption from electricity demand [2][4].
Source: Wired
This opacity has sparked competing narratives. OpenAI CEO Sam Altman claimed each ChatGPT query uses only about 0.000085 gallons of water [5], while Morgan Stanley projects data centers' global annual water consumption will reach 1,068 billion liters by 2028 [5]. The discrepancy depends on whether calculations include only direct operational water use or account for the entire supply chain, including semiconductor manufacturing and electricity generation [5].
Data center cooling represents a major driver of AI energy usage. GPUs and AI servers generate orders of magnitude more heat than traditional data centers, making air-based cooling insufficient [5]. Traditional evaporative cooling towers consume large volumes of water, while dry coolers use less water but require more electrical power [5]. MIT estimates that two liters of water might be needed for every kilowatt-hour of energy [5]. As Nvidia promises higher-performance processing units, deployment will require increasingly advanced cooling solutions. Companies are developing direct-to-chip cold plates and immersion systems where servers operate in dielectric fluids [5].

De Vries-Gao's figures are considered conservative because they capture only operational impacts, excluding environmental costs across the supply chain and at end-of-life disposal [2][3]. Professor Shaolei Ren noted the projections exceed his 2023 study that estimated 600 billion liters by 2027 [2].

The environmental costs are currently borne by society rather than tech companies. "At the moment society is paying for these costs, not the tech companies," de Vries-Gao stated [4]. A University of Chicago survey found 4 in 10 U.S. adults are extremely worried about AI's environmental impact [5]. Meanwhile, some advocates argue concerns are overstated, with one Substack post titled "The AI Water Issue Is Fake" gaining traction among tech commentators [1]. Greater transparency in sustainability reports will prove essential for informed public discourse about AI's true environmental footprint.