3 Sources
[1]
AI is gobbling up water it cannot replace - I'm working on a solution
Datacentres are the invisible engines of our digital world. Every Google search, Netflix stream, cloud-stored photo or ChatGPT response passes through banks of high-powered computers housed in giant facilities scattered across the globe. These datacentres consume a staggering amount of electricity and, increasingly, a surprising amount of water. But unlike the water you use at home, much of the water used in datacentres never returns to the water reuse cycle. This silent drain is drawing concern from environmental scientists. One preprint study (not yet reviewed by other scientists) from 2023 predicted that by 2027, global AI use could consume more water in a year than half of the UK's annual water use.

Datacentres typically contain thousands of servers, stacked and running 24/7. These machines generate immense heat and, if not properly cooled, can overheat and fail. This happened in 2022, when a UK heatwave saw temperatures reach a record-breaking 40°C in some areas and knocked Google and Oracle datacentres in London offline. To prevent this, datacentres rely heavily on cooling systems, and that's where water comes in.

One of the most common methods for cooling datacentres involves mechanical chillers, which work like large fridges. These machines use a fluid called a refrigerant to carry heat away from the servers and release it through a condenser. A lot of water is lost as it turns into vapour during the cooling process, and it cannot be reused. A 1 megawatt (MW) datacentre (one that uses enough electricity to power 1,000 houses) can use up to 25.5 million litres of water annually. The total datacentre capacity in the UK is estimated at approximately 1.6 gigawatts (GW); global datacentre capacity stands at around 59 GW.

Unlike water used in a dishwasher or a toilet, which often returns to a treatment facility to be recycled, the water in cooling systems literally vanishes into the air: it becomes water vapour and escapes into the atmosphere. This fundamental difference is why datacentre water use is not comparable to typical household use, where water cycles back through municipal systems. As moisture in the atmosphere that can return to the land as rain, the water datacentres use remains part of Earth's water cycle - but not all rainwater can be recovered. The water is effectively lost to the local water balance, which is especially critical in drought-prone or water-scarce regions - where two-thirds of the datacentres built since 2022 are located. The slow return of this water makes its use for cooling datacentres effectively non-renewable in the short term.

The rise of AI tools like ChatGPT, image generators and voice assistants has made datacentres work much harder. These systems need a lot more computing power, which creates more heat. To stay cool, datacentres use more water than ever. This growing demand is leading to a greater reliance on water-intensive cooling systems, driving up total water consumption even further. The International Energy Agency reported in April 2025 that datacentres now consume more than 560 billion litres of water annually, possibly rising to 1,200 billion litres a year by 2030.
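As a rough sanity check (an editorial back-of-envelope sketch rather than a figure from the article), the per-megawatt number can be scaled up to the quoted UK and global capacities, on the assumption that every megawatt of capacity hits the 25.5 million litre upper bound:

```python
# Back-of-envelope scaling of the article's figures.
# Assumption: every megawatt uses the full 25.5 million litres a year,
# which is an upper bound rather than a typical value.

LITRES_PER_MW_YEAR = 25.5e6   # upper-bound annual water use of a 1 MW datacentre
UK_CAPACITY_MW = 1_600        # ~1.6 GW of UK datacentre capacity
GLOBAL_CAPACITY_MW = 59_000   # ~59 GW of global datacentre capacity

uk_upper_bound = LITRES_PER_MW_YEAR * UK_CAPACITY_MW
global_upper_bound = LITRES_PER_MW_YEAR * GLOBAL_CAPACITY_MW

print(f"UK upper bound:     {uk_upper_bound / 1e9:,.0f} billion litres a year")      # ≈ 41
print(f"Global upper bound: {global_upper_bound / 1e9:,.0f} billion litres a year")  # ≈ 1,500
```

On that assumption the global total comes to roughly 1,500 billion litres a year, above both the IEA's reported 560 billion litres and its 1,200 billion litre projection for 2030, which is consistent with 25.5 million litres per megawatt being a ceiling rather than a typical figure.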
What's the alternative?

Another method, direct evaporative cooling, pulls hot air from datacentres and passes it through water-soaked pads. As the water evaporates, it cools the air, which is then sent back into the server rooms. While this method is energy-efficient, especially in warmer climates, the added moisture in the air can damage sensitive server equipment. It therefore requires additional systems to manage and control humidity, which makes datacentre design more complex.

My research team and I have developed another method, which separates the moist and dry air streams in a datacentre with a thin aluminium foil, similar to kitchen foil. The hot, dry air passes close to the wet air stream, and heat is transferred through the foil without allowing any moisture to mix. This cools the server rooms without adding humidity that could interfere with the equipment. Trials of this method at Northumbria University's datacentre have shown it can be more energy-efficient than conventional chillers and use less water. Powered entirely by solar energy, the system operates without compressors or chemical refrigerants.

As AI continues to expand, the demand on datacentres is expected to skyrocket, along with their water use. We need a global shift in how we design, regulate and power digital infrastructure.
[2]
AI boom means regulator cannot predict future water shortages in England
Datacentres, which do not have to report the amount of water they use to cool their servers, leave the Environment Agency with no idea of the scale of future shortfalls.

The artificial intelligence boom means the Environment Agency (EA) has no idea how much water England will be short of in future decades, because datacentres do not have to report how much they are using to cool their servers. England's public water supply could be short by 5bn litres a day by 2055 without urgent action to future-proof resources, the government environment regulator has warned, with a shortfall of a further 1bn litres a day for farming, energy generation and powering emerging technologies. However, EA sources told the Guardian that this figure of 1bn litres for industry did not include the amounts of water to be used by datacentres, because that figure was unknown. This means the shortage could be much higher, as datacentres often use vast amounts of water.

Every five years, the EA publishes its water deficit projections, but the sources said this was difficult to do this year because of the growth in AI, which is one of the most significant changes to projected usage in recent years. The regulator added that the majority of datacentres were using the public water supply rather than alternative sources, and that they did not want this to stop, nor to provide transparency over their figures. At the moment, the EA does not have sufficient data to understand either these centres' current use or their future needs.

Datacentres for AI are a pivotal part of the government's growth strategy, and Keir Starmer announced this year that he would hugely increase AI capacity and reduce planning restrictions on companies that wanted to build datacentres by setting up "growth zones" with fewer constraints. AI datacentres use a large amount of water to prevent their servers overheating and shutting down. The centres have cooling towers and outside air systems, both of which need clean, fresh water. AI consumes between 1.8 and 12 litres of water for each kilowatt hour of energy used across Microsoft's global datacentres. One study estimates that global AI could account for up to 6.6bn cubic metres of water use by 2027 - the equivalent of nearly two-thirds of England's annual consumption.

The EA chair, Alan Lovell, said: "The nation's water resources are under huge and steadily increasing pressure. This deficit threatens not only the water from your tap but also economic growth and food production.

"Taking water unsustainably from the environment will have a disastrous impact on our rivers and wildlife. We need to tackle these challenges head-on and strengthen work on coordinated action to preserve this precious resource and our current way of life."

Plans to increase supply, submitted by water companies, include nine desalination schemes, 10 reservoirs and seven water recycling schemes by 2050. Water bills for customers have risen, and will continue to rise, to pay for this infrastructure. The government also plans to monitor individual household water use by rolling out smart meters, which charge based on the amount used and allow water companies and other agencies to track usage.

Climate breakdown will further squeeze water supplies, the EA said, as hotter, drier summers become more probable. Areas that rely mostly on surface water will therefore be more susceptible to drought, and it may not rain consistently enough for groundwater to recharge.
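To put these figures on a common scale, here is a short illustrative conversion (an editorial sketch using only the numbers quoted in this article; the 1 TWh of AI energy use at the end is purely hypothetical, chosen to make the litres-per-kilowatt-hour range concrete):

```python
# Converting the quoted figures to litres per day for comparison.
LITRES_PER_M3 = 1_000
DAYS_PER_YEAR = 365

# Study estimate: global AI water use of up to 6.6bn cubic metres a year by 2027.
global_ai_2027_per_day = 6.6e9 * LITRES_PER_M3 / DAYS_PER_YEAR
print(f"Global AI by 2027: ~{global_ai_2027_per_day / 1e9:.0f}bn litres a day")  # ≈ 18

# EA projection for 2055: 5bn litres/day public-supply shortfall plus a further
# 1bn litres/day for farming, energy and emerging technologies (datacentres excluded).
england_shortfall_per_day = 5e9 + 1e9
print(f"Projected England shortfall: {england_shortfall_per_day / 1e9:.0f}bn litres a day")  # 6

# Microsoft's reported range of 1.8-12 litres per kWh, applied to a purely
# hypothetical 1 TWh (1e9 kWh) of annual AI energy use.
low, high = 1.8 * 1e9, 12 * 1e9
print(f"Water for a hypothetical 1 TWh of AI compute: {low / 1e9:.1f}-{high / 1e9:.0f}bn litres")
```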
Thames Water on Tuesday opened a statutory public consultation on a controversial drought scheme to pump millions more litres of treated sewage into the River Thames every day. Under the £300m project, Thames would extract 75m litres of water a day from the river in south-west London during drought and replace it with treated sewage from Mogden, one of Europe's largest treatment plants. The plan has drawn objections from thousands of people, including the Liberal Democrat MP Munira Wilson, over concerns about the effect on river water quality, the impact of forever chemicals and the adverse effect on the ecology. The EA has said Thames Water has failed to show that the scheme is "feasible or environmentally acceptable". Thames Water loses about 570m litres a day from leaks on its network, the most of all the privatised water companies.

Ofwat's chief executive, David Black, said: "Boosting supply through building critical water infrastructure is essential to safeguard supplies of drinking water. The way is now clear for the water industry to build on the success of the recently opened £5bn Thames Tideway project by stepping forward to deliver an expanded pipeline of 30 major projects we need in England and Wales."
[3]
AI is gobbling up water it cannot replace. I'm working on a solution
The rapid growth of AI and data centers is causing an unprecedented strain on global water resources, with cooling systems consuming vast amounts of water that cannot be easily replaced. This trend is raising concerns about future water shortages and environmental impact.
The rapid expansion of artificial intelligence (AI) and data centers is creating an unprecedented demand for water resources globally. Data centers, which power our digital world, are consuming staggering amounts of water for cooling purposes, raising concerns among environmental scientists and regulators [1].
Source: The Conversation
A 2023 preprint study predicted that by 2027, global AI use could consume more water in a year than half of the UK's annual water use [1]. The International Energy Agency reported that data centers now consume more than 560 billion liters of water annually, with projections suggesting this could rise to 1,200 billion liters a year by 2030 [1].
The water used in data center cooling systems differs significantly from household water use. Unlike water used in homes, which often returns to treatment facilities for recycling, the water in data center cooling systems evaporates into the atmosphere [1]. This process removes water from the local water balance, making its use effectively non-renewable in the short term.
In England, the Environment Agency has warned that the public water supply could be short by 5 billion liters a day by 2055 without urgent action [2]. However, this projection does not include the water usage of data centers, as they are not required to report their consumption. This lack of data makes it challenging for regulators to accurately predict future water shortages [2].
Source: Tech Xplore
Data centers typically use mechanical chillers for cooling, which work like large refrigerators. These systems use a refrigerant to carry heat away from servers and release it through a condenser. During this process, a significant amount of water is lost as vapor [1].
A 1 megawatt (MW) data center can use up to 25.5 million liters of water annually. With the global data center capacity estimated at around 59 gigawatts (GW), the scale of water consumption becomes apparent [3].
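As an illustrative restatement (an editorial sketch, not a figure from the sources), those numbers can be expressed per day and per household equivalent, using source [1]'s statement that a 1 MW data center uses enough electricity to power about 1,000 houses:

```python
# Restating the per-MW figure per day and per home-equivalent.
# Assumption: the upper-bound 25.5 million liters/MW/year applies uniformly.
LITERS_PER_MW_YEAR = 25.5e6
HOMES_PER_MW = 1_000          # source [1]: a 1 MW data center powers ~1,000 houses
GLOBAL_CAPACITY_MW = 59_000   # ~59 GW of global capacity
DAYS_PER_YEAR = 365

per_home_per_day = LITERS_PER_MW_YEAR / HOMES_PER_MW / DAYS_PER_YEAR
global_per_day = LITERS_PER_MW_YEAR * GLOBAL_CAPACITY_MW / DAYS_PER_YEAR

print(f"Cooling water per home-equivalent served: ~{per_home_per_day:.0f} liters a day")  # ≈ 70
print(f"Global upper bound: ~{global_per_day / 1e9:.1f} billion liters a day")            # ≈ 4.1
```

At the upper bound, that works out to roughly 70 liters of evaporated cooling water a day for every home-equivalent of electricity a data center consumes, and on the order of 4 billion liters a day worldwide.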
Researchers are working on alternative cooling methods to address this growing concern. One such method, developed by a team at Northumbria University, uses thin aluminum foil to separate moist and dry air streams in data centers. This approach allows for efficient cooling without adding humidity that could damage sensitive equipment [1].
As AI continues to expand, the demand on data centers is expected to increase dramatically. This trend necessitates a global shift in how we design, regulate, and power digital infrastructure to ensure sustainable water use [3].
The AI boom presents a significant challenge for water resource management. Without proper regulation and innovative solutions, the growing water consumption of data centers could exacerbate water scarcity issues in many regions. As we move forward, balancing technological advancement with environmental sustainability will be crucial for the future of AI and global water resources.
Summarized by Navi