On Fri, 20 Sept, 12:04 AM UTC
3 Sources
[1]
ChatGPT energy emergency -- here's how much electricity OpenAI and others are sucking up per week
A pair of reports this week highlight the growing environmental costs of using generative AI and training LLMs. And the data is shocking -- showing that generating 100 words with GPT-4 can use up to three bottles of water, that a year of queries uses enough electricity to power over nine houses, and that this sustainability problem could get 10x worse by 2030.

Much of the cost comes from the growing demand for the data centers that power AI models. Once the centers are built or already exist, a new set of issues crops up surrounding their internal and external effects.

This week, data from the University of California, Riverside, shared with The Washington Post and spotted by our friends at Tom's Hardware, showed that AI data centers require very heavy water consumption to cool their servers. The paper does note that water usage depends on the state and proximity to water; lower water usage corresponds to cheaper electricity and higher electricity use. Texas, for example, uses an estimated 235 milliliters, about one cup, to generate a 100-word email, while the state of Washington requires 1,408 milliliters for the same email, which equates to about three 16 oz water bottles.

That may not sound like a lot, but ChatGPT isn't being used once for one 100-word email. Plus, the OpenAI-built chatbot takes a lot of power to run, and again, this is just for plain text. The report does not appear to go into detail about image or video generation.

As an example from an EPRI report, a regular Google search uses around 0.3 Wh of electricity per query, while ChatGPT requires about 2.9 Wh. That means if one out of 10 working Americans used GPT-4 once a week for a year (so, 52 queries each by 17 million people), the corresponding power demand of 121,517 MWh would equal the electricity consumed by every single household in Washington D.C. (home to an estimated 671,803 people) for twenty days. On top of that, 105 million gallons of water would be used just to keep the servers cool.

The second major report to come out earlier this week reinforces the issue. Bloomberg reported that the drive to insert AI into every conceivable facet of the tech world has driven a boom in utility companies across the United States building new gas-fired power plants to meet surging demand for electricity. Much of the demand appears to come from three sources: AI data centers, manufacturing facilities, and electric vehicles. The article doesn't provide a breakdown of how much demand each of those three vectors is actually putting on our systems, but data center demand is expected to grow by up to 10x current levels by 2030, which would be 9% of the United States' total electricity generation, according to the Electric Power Research Institute. Utilities across the country have announced plans for nearly double the number of new gas plants to meet that demand, with a majority coming to Texas.

"We were poised to shift away from the energy system of the past, from costly and polluting infrastructure like coal and gas plants. But now we're going in the opposite direction," Kendl Kobbervig, advocacy director at Clean Virginia, told Bloomberg. "A lot of people are feeling whiplash."

As Bloomberg notes, with the recent fracking resurgence, natural gas has reduced the environmental impact of coal as we move away from that fuel source. However, gas plants have a tendency to leak methane, which has "80 times the planet-warming impact of carbon dioxide in its first 20 years in the atmosphere."
Additionally, once a gas plant is built, it doesn't go away; it is likely to run for a minimum of 40 years. It also means that utilities that had committed to reducing carbon emissions have had to scale back those plans. In one example from Bloomberg, PacifiCorp had been expected to reduce emissions by 78% by 2030; with the announcements of new plants, it has revised that estimate to 63%.

Tom's Guide AI expert Ryan Morrison said of the new reports: "While it is true that there is a significant environmental impact from AI, largely driven by the massive compute requirements, it will become less of a problem as models get more efficient and custom chips take over from power-hungry GPUs.

"AI may well also be the solution to its own energy problem, as well as the wider energy problems facing society. As the technology improves it will be used to design more efficient compute and cooling systems that minimise the impact on the environment. Designing the path to net zero may require AI."

AI is power-hungry, and these new data centers aren't just impacting the environment and climate change. The Los Angeles Times reported in August that data centers consume 60% of Santa Clara's electricity. This appetite runs the risk of increasing blackouts due to lack of power, and it's also raising the water and power bills of people who live in those communities. Companies like PG&E say that customers' bills won't rise, but it's clear that they're passing infrastructure costs on to customers, and that data centers are not paying their fair share.

This seems to fly in the face of claims by major AI companies like OpenAI, Google, Meta, and Microsoft that they are committed to reducing their environmental impact. A Microsoft rep told the Post that the company is "working toward data center cooling methods that will eliminate water consumption completely." AI has become the pie in the sky for many of these companies, and it's taking precedence over all else, including their impact on their communities and the environment.
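To make the 121,517 MWh scenario concrete, here is a quick back-of-the-envelope check in Python. The per-email energy figure is not stated outright in the reports; it falls out of dividing the quoted total by the number of emails in the scenario, and it lands close to The Washington Post's "more than a dozen LED lightbulbs for an hour" comparison (a dozen-plus 10 W bulbs for an hour is roughly 0.13 to 0.14 kWh).

```python
# Back-of-the-envelope check on the reported scenario: 1 in 10
# working Americans (~17 million people) sends one 100-word GPT-4
# email per week for a year (52 emails each).
people = 17_000_000
emails_per_person = 52
total_emails = people * emails_per_person  # 884 million emails

reported_total_mwh = 121_517  # total quoted in the reports above

# Implied energy per email (1 MWh = 1e6 Wh).
wh_per_email = reported_total_mwh * 1e6 / total_emails
print(f"Implied energy per 100-word email: {wh_per_email:.0f} Wh")
# ~137 Wh, i.e. ~0.14 kWh -- consistent with "more than a dozen
# LED lightbulbs for an hour" (14 bulbs x 10 W x 1 h = 140 Wh).

# Note this is far above EPRI's ~2.9 Wh per chat query: generating
# a full 100-word email costs much more than one short response.
print(f"Ratio to a 2.9 Wh chat query: {wh_per_email / 2.9:.0f}x")
```

In other words, the EPRI per-query figure and the Post's per-email figure describe different workloads, which is worth keeping in mind when the two numbers appear side by side.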
[2]
ChatGPT's resource demands are getting out of control | Digital Trends
It's no secret that the growth of generative AI has demanded ever-increasing amounts of water and electricity, but a new study from The Washington Post and researchers from the University of California, Riverside shows just how many resources OpenAI's chatbot needs to perform even its most basic functions.

In terms of water usage, the amount needed for ChatGPT to write a 100-word email depends on the state and the user's proximity to OpenAI's nearest data center. The less prevalent water is in a given region, and the less expensive electricity is, the more likely the data center is to rely on electrically powered air conditioning units instead. In Texas, for example, the chatbot consumes an estimated 235 milliliters to generate one 100-word email. That same email drafted in Washington, on the other hand, would require 1,408 milliliters (nearly a liter and a half).

Data centers have grown larger and more densely packed with the rise of generative AI technology, to the point that air-based cooling systems struggle to keep up. This is why many AI data centers have switched over to liquid-cooling schemes that pump huge amounts of water past the server stacks to draw off thermal energy, then out to a cooling tower where the collected heat dissipates.

ChatGPT's electrical requirements are nothing to sneeze at either. According to The Washington Post, using ChatGPT to write that 100-word email draws enough current to operate more than a dozen LED lightbulbs for an hour. If even one-tenth of working Americans used ChatGPT to write that email once a week for a year, the process would use the same amount of power that every single Washington, D.C., household does in 20 days. D.C. is home to roughly 670,000 people.

This is not an issue that will be resolved any time soon, and it will likely get much worse before it gets better. Meta, for example, needed 22 million liters of water to train its latest Llama 3.1 models. Google's data centers in The Dalles, Oregon, were found to consume nearly a quarter of all the water available in the town, according to court records, while xAI's new Memphis supercluster is already demanding 150 MW of electricity -- enough to power as many as 30,000 homes -- from the local utility, Memphis Light, Gas and Water.
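The "enough to power as many as 30,000 homes" comparison for xAI's 150 MW draw is worth unpacking, since different homes-per-megawatt conventions give very different numbers. A minimal sketch, assuming the EIA's average US household consumption of roughly 10,500 kWh per year (an outside figure, not stated in the article):

```python
# What does "150 MW -- enough to power as many as 30,000 homes" imply?
supercluster_mw = 150
quoted_homes = 30_000

implied_kw_per_home = supercluster_mw * 1000 / quoted_homes
print(f"Implied draw per home: {implied_kw_per_home:.1f} kW")  # 5.0 kW

# Compare against *average* household draw (EIA: ~10,500 kWh/year,
# an assumption not taken from the article).
avg_home_kw = 10_500 / (365 * 24)  # ~1.2 kW continuous average
homes_at_average_use = supercluster_mw * 1000 / avg_home_kw
print(f"Homes at average use: {homes_at_average_use:,.0f}")  # ~125,000
```

The quoted figure works out to about 5 kW per home, which reads like a peak-demand sizing convention; measured against average household use, 150 MW of continuous draw would cover several times as many homes.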
[3]
Using GPT-4 to generate 100 words consumes up to 3 bottles of water -- AI data centers also raise power and water bills for nearby residents
Research conducted by the University of California, Riverside and shared by The Washington Post on Wednesday highlighted the high costs of using generative AI. It turns out AI requires pretty heavy water consumption -- used to cool the servers that generate the output -- even when it's just generating text. This is, of course, in addition to the severe toll on the electric grid.

The research noted that exact water usage varies depending on the state and proximity to the data center, with lower water use corresponding to cheaper electricity and higher electricity use. Texas had the lowest water usage, at an estimated 235 milliliters needed to generate one 100-word email, while Washington demanded a whopping 1,408 milliliters per email -- about three 16.9 oz water bottles. This may not sound like a lot, but remember that these figures add up quickly, especially when users run GPT-4 multiple times a week (or multiple times a day) -- and this is just for plain text.

Data centers are shown to be heavy consumers of water and electricity, which also drives up the power and water bills of residents in the towns where they are built. For example, Meta needed 22 million liters of water to train its LLaMA-3 model -- about how much water is needed to grow 4,439 pounds of rice, or, as researchers noted, "about what 164 Americans consume in a year."

The electric cost of GPT-4 is also quite high. If one out of 10 working Americans used GPT-4 once a week for a year (so, 52 queries each by 17 million people), the corresponding power demand of 121,517 MWh would equal the electricity consumed by every single household in Washington D.C. (an estimated 671,803 people) for twenty days. That's nothing to scoff at, especially since it's an unrealistically light use case for GPT-4's target audience.

The Washington Post included quotes from OpenAI, Meta, Google, and Microsoft representatives, most of which reaffirmed a commitment to reducing environmental demand rather than giving actual courses of action. Microsoft rep Craig Cincotta stated that the company is "working toward data center cooling methods that will eliminate water consumption completely," which sounds nice but is vague on how. The AI profit motive has clearly taken priority over the environmental goals these companies set in the past, so even this quote should be taken with a grain of salt until we see actual improvements.
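A quick consistency check on the water figures, sketched in Python; the only number introduced here is the bottle size, since a 16.9 oz bottle is the standard 500 mL:

```python
# Sanity-check the per-email water figures against the headline claim.
texas_ml = 235         # lowest per-email water use (cheap power, more AC)
washington_ml = 1_408  # highest per-email water use

bottle_ml = 500  # a 16.9 oz water bottle holds 500 mL
print(f"Washington email = {washington_ml / bottle_ml:.1f} bottles")
# ~2.8 bottles -- hence "about three 16.9 oz water bottles".

# Meta's LLaMA-3 training run: 22 million liters of water, which the
# researchers equate to what 164 Americans consume in a year.
training_liters = 22_000_000
americans = 164
per_american_per_day = training_liters / americans / 365
print(f"Implied water use per American: {per_american_per_day:.0f} L/day")
# ~368 L/day -- evidently total water use, not just drinking water.
```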
As AI technology advances, concerns grow over its environmental impact. ChatGPT and other AI models are consuming enormous amounts of energy and water, raising questions about sustainability and resource management in the tech industry.
As artificial intelligence (AI) continues to evolve and integrate into our daily lives, concerns are growing about the environmental impact of these powerful technologies. Recent reports have shed light on the staggering energy consumption of AI models like ChatGPT, prompting discussions about sustainability in the tech industry.
OpenAI's ChatGPT, one of the most popular AI language models, is reported to consume a massive amount of electricity. According to estimates, ChatGPT uses approximately 564 MWh of electricity per day, which translates to about 3,948 MWh per week [1]. To put this into perspective, the daily figure alone matches the combined daily electricity consumption of roughly 19,000 average American homes.
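That homes equivalence is a straightforward unit conversion; a minimal sketch, assuming the EIA's average household consumption of about 10,500 kWh per year (an outside figure, not from the cited reports):

```python
# Convert ChatGPT's estimated daily electricity use into
# average-US-household equivalents.
chatgpt_mwh_per_day = 564
avg_household_kwh_per_year = 10_500  # EIA average (assumption)

household_kwh_per_day = avg_household_kwh_per_year / 365  # ~28.8 kWh
homes = chatgpt_mwh_per_day * 1_000 / household_kwh_per_day
print(f"Daily ChatGPT use ~= {homes:,.0f} homes' daily consumption")
# ~19,600 homes -- every single day.
```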
The energy consumption issue extends beyond ChatGPT to other AI models and data centers. These facilities not only demand significant amounts of electricity but also require vast quantities of water for cooling purposes. In fact, generating just 100 words using GPT-4 is estimated to consume up to three bottles of water [3].
The resource-intensive nature of AI operations is having tangible effects on local communities. Residents living near AI data centers are reportedly experiencing increases in their power and water bills [3]. This raises concerns about the equitable distribution of resources and the potential for AI development to exacerbate existing inequalities.
To further illustrate the scale of AI's resource demands, consider that training a single AI model can consume more electricity than 100 US homes use in an entire year [2]. This level of energy consumption has significant implications for carbon emissions and climate change, especially if the electricity is sourced from non-renewable energy.
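For reference, the most widely cited estimate for training GPT-3 puts the run at about 1,287 MWh; that number comes from outside these reports, but under the same household average it shows how a single training run clears the 100-home-years mark:

```python
# One large training run vs. annual household electricity use.
gpt3_training_mwh = 1_287          # widely cited GPT-3 estimate (assumption)
avg_household_mwh_per_year = 10.5  # EIA average (assumption)

home_years = gpt3_training_mwh / avg_household_mwh_per_year
print(f"GPT-3 training ~= {home_years:.0f} home-years of electricity")
# ~123 home-years, consistent with "more than 100 US homes in a year".
```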
As awareness of AI's environmental impact grows, there is increasing pressure on tech companies to address these concerns. Some companies are exploring more energy-efficient AI training methods and investing in renewable energy sources for their data centers. However, as AI technology continues to advance and become more widespread, finding sustainable solutions to manage its resource consumption remains a critical challenge for the industry.