5 Sources
[1]
AI Is Eating Data Center Power Demand -- and It's Only Getting Worse
A new analysis of the AI hardware being produced and how it is being used attempts to estimate the vast amount of electricity being consumed by AI. AI's energy use already represents as much as 20 percent of global data-center power demand, research published Thursday in the journal Joule shows. That demand from AI, the research states, could double by the end of this year, comprising nearly half of all data-center electricity consumption worldwide, excluding the electricity used for bitcoin mining.

The new research is published as a commentary by Alex de Vries-Gao, the founder of Digiconomist, a research company that evaluates the environmental impact of technology. De Vries-Gao started Digiconomist in the late 2010s to explore the impact that bitcoin mining, another extremely energy-intensive activity, would have on the environment. Looking at AI, he says, has grown more urgent over the past few years because of the widespread adoption of ChatGPT and other large language models that use massive amounts of energy. According to his research, worldwide AI energy demand is now set to surpass demand from bitcoin mining by the end of this year.

"The money that bitcoin miners had to get to where they are today is peanuts compared to the money that Google and Microsoft and all these big tech companies are pouring in [to AI]," he says. "This is just escalating a lot faster, and it's a much bigger threat."

The development of AI is already having an impact on Big Tech's climate goals. Tech giants have acknowledged in recent sustainability reports that AI is largely responsible for driving up their energy use. Google's greenhouse gas emissions, for instance, have increased 48 percent since 2019, complicating the company's goal of reaching net zero by 2030. "As we further integrate AI into our products, reducing emissions may be challenging due to increasing energy demands from the greater intensity of AI compute," Google's 2024 sustainability report reads.

Last month, the International Energy Agency released a report finding that data centers made up 1.5 percent of global energy use in 2024 -- around 415 terawatt-hours, a little less than the yearly energy demand of Saudi Arabia. This number is only set to get bigger: data centers' electricity consumption has grown four times faster than overall consumption in recent years, while investment in data centers has nearly doubled since 2022, driven largely by massive expansions to account for new AI capacity. Overall, the IEA predicted that data center electricity consumption will grow to more than 900 TWh by the end of the decade.

But there are still a lot of unknowns about the share that AI, specifically, takes up in data centers' current electricity use. Data centers power a variety of services -- such as cloud hosting and online infrastructure -- that aren't necessarily linked to the energy-intensive activities of AI. Tech companies, meanwhile, largely keep the energy expenditure of their software and hardware private. Some attempts to quantify AI's energy consumption have started from the user side: calculating the amount of electricity that goes into a single ChatGPT search, for instance. De Vries-Gao decided instead to look at the supply chain, starting from the production side to get a more global picture.
[2]
AI energy usage hard to measure, but MIT Tech Review tried
A single person with a serious AI habit may chew through enough electricity each day to keep a microwave running for more than three hours. And the actual toll may be even worse, as so many companies keep details about their AI models secret. Speaking to two dozen researchers tracking AI energy consumption, and conducting experiments of its own, the MIT Technology Review concluded that AI's total energy and climate toll is incredibly difficult to get a handle on. But that didn't stop them from trying.

Working with researchers from Hugging Face, the authors of the report determined that a single query to the open-source Llama 3.1 8B model used around 57 joules of energy to generate a response. (The 8B means the model has 8 billion parameters.) When accounting for cooling and other energy demands, the report said that number should be doubled, bringing a single query on that model to around 114 joules - equivalent to running a microwave for around a tenth of a second. A larger model, like Llama 3.1 405B, needs around 6,706 joules per response - eight seconds of microwave usage. In other words, the size of a particular model plays a huge role in how much energy it uses. Although its true size is a mystery, OpenAI's GPT-4 is estimated to have well over a trillion parameters, meaning its per-query energy footprint is likely far higher than that of the Llama queries tested.

It's also worth pointing out that those figures are for text-based responses. AI-generated photos actually use considerably less energy than text responses, thanks to their smaller model size and the fact that diffusion is more energy efficient than inference, MIT TR noted. AI video generation, on the other hand, is an energy sinkhole. To generate a five-second video at 16 frames per second, the CogVideoX AI video generation model consumes a whopping 3.4 million joules of energy - equivalent to running a microwave for an hour or riding 38 miles on an e-bike, Hugging Face researchers told the Tech Review. "It's fair to say that the leading AI video generators, creating dazzling and hyperrealistic videos up to 30 seconds long, will use significantly more energy," the report noted.

Using that data, the authors compiled an estimate of the daily AI energy consumption of someone with a habit of leaning on generative models for various tasks. Fifteen questions, ten attempts at generating an image, and three tries at making an Instagram-ready five-second video would eat up an estimated 2.9 kWh of electricity - three and a half hours of microwave usage. Hundreds of millions of people around the world use ChatGPT each week, OpenAI estimates.

The researchers focused on open-source LLMs that we know a lot about. But companies like OpenAI and Google keep the size and reach of their models hidden from the public, and that seriously hampers accurate energy usage estimates. When it comes to measuring CO2 emissions, the AI picture gets even more complicated, the Tech Review article notes. The mixture of renewable and non-renewable energy sources varies widely by location and time of day (solar isn't generated at night, for instance). The report also didn't touch on prompt caching, a technique commonly used by generative models to store responses and feed them back to users asking the same or similar questions, which can reduce energy consumption for AI models. Regardless of those caveats, one thing is for sure: a lot of energy is being consumed to power the world's growing AI habit.
Not only that, but a good portion of it is spewing carbon into the atmosphere for what is arguably questionable usefulness. As the Tech Review report pointed out, the current spike in datacenter energy usage follows years of relatively flat consumption thanks to steady workloads and ever-increasing efficiency. Datacenters ate up more than a fifth of the electricity in Ireland in 2023. The global energy consumption of datacenters is predicted to more than double from its current rate by 2030, surpassing the energy consumption of the entire nation of Japan by the start of the next decade. AI, naturally, is the largest driver of that increase.

Tech companies have paid a lot of lip service to going green over the years, long assuring the public that their bit barns aren't an environmental threat. But now that AI's in the picture, the professed net-zero goals of tech giants like Microsoft and Google are receding into the distance. We've covered this a lot at The Register of late, and our reporting largely aligns with what the MIT Technology Review report concluded: the energy behind AI is far dirtier than tech companies would like us to believe.

Overall, datacenters are predicted to emit 2.5 billion tons of greenhouse gases by the end of the decade. That's three times more than they would have if generative AI hadn't become the latest craze. To add insult to apocalypse, those numbers rest on a shaky data foundation, as the Tech Review report noted. "This leaves even those whose job it is to predict energy demands forced to assemble a puzzle with countless missing pieces, making it nearly impossible to plan for AI's future impact on energy grids and emissions," they said.
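The report's headline "daily habit" figure can be reproduced with simple arithmetic. Below is a minimal sketch using the per-task joule figures quoted across these articles (the 2,282-joule image figure comes from the MIT data discussed in source [5]):

```python
# Reproducing the report's "daily AI habit" tally (editorial sketch;
# per-task joule figures are the ones quoted in these articles).
J_PER_KWH = 3_600_000  # joules in one kilowatt-hour

tasks = {
    "15 questions to a large text model": 15 * 6_706,
    "10 image generations":               10 * 2_282,
    "3 five-second videos":                3 * 3_400_000,
}

total_j = sum(tasks.values())
print(f"Total: {total_j:,} J = {total_j / J_PER_KWH:.2f} kWh")  # ~2.9 kWh
```

The videos dominate the total: three five-second clips account for roughly 99 percent of the 2.9 kWh.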
[3]
AI could account for nearly half of data centre power usage 'by end of year'
Artificial intelligence systems forecast to require almost as much energy by end of decade as Japan uses today

Artificial intelligence systems could account for nearly half of data centre power consumption by the end of this year, according to new analysis. The estimates by Alex de Vries-Gao, the founder of the Digiconomist tech sustainability website, came as the International Energy Agency (IEA) forecast that AI would require almost as much energy by the end of this decade as Japan uses today.

De Vries-Gao's calculations, to be published in the sustainable energy journal Joule, are based on the power consumed by chips made by Nvidia and Advanced Micro Devices that are used to train and operate AI models. The paper also takes into account the energy consumption of chips used by other companies, such as Broadcom.

The IEA estimates that all data centres - excluding cryptocurrency mining - consumed 415 terawatt hours (TWh) of electricity last year. De Vries-Gao argues in his research that AI could already account for 20% of that total. He said a number of variables fed into his calculations, such as the energy efficiency of a data centre and the electricity consumed by cooling systems for servers handling an AI system's busy workloads.

Data centres are the central nervous system of AI technology, and their high energy demands make sustainability a key concern in the development and use of artificial intelligence systems. By the end of 2025, De Vries-Gao estimates, energy consumption by AI systems could reach up to 49% of total data centre power consumption, again excluding crypto mining. AI's demand could reach 23 gigawatts (GW), the research estimates, twice the total power consumption of the Netherlands.

However, De Vries-Gao said a number of factors could lead to a slowdown in hardware demand, such as waning demand for applications like ChatGPT. Another issue could be geopolitical tensions resulting in constraints on producing AI hardware, such as export controls. De Vries-Gao cites the example of barriers on Chinese access to chips, which contributed to the release of the DeepSeek R1 AI model that reportedly used fewer chips. De Vries-Gao said: "These innovations can reduce the computational and energy costs of AI." But he said any efficiency gains could encourage even more AI use. Multiple countries attempting to build their own AI systems - a trend known as "sovereign AI" - could also increase hardware demand.

De Vries-Gao also pointed to a US data centre startup, Crusoe Energy, securing 4.5GW of gas-powered energy capacity for its infrastructure, with the ChatGPT developer OpenAI among the potential customers through its Stargate joint venture. "There are early indications that these [Stargate] data centres could exacerbate dependence on fossil fuels," writes De Vries-Gao. On Thursday OpenAI announced the launch of a Stargate project in the United Arab Emirates, the first outside the US.

Microsoft and Google admitted last year that their AI drives were endangering their ability to meet internal environmental targets. De Vries-Gao said information on AI's power demands had become increasingly scarce, with the analyst describing it as an "opaque industry". The EU AI Act requires AI companies to disclose the energy consumption behind training a model, but not for its day-to-day use.
Prof Adam Sobey, the mission director for sustainability at the UK's Alan Turing Institute, an AI research body, said more transparency was needed on how much energy is consumed by artificial intelligence systems - and how much they could potentially save by helping make carbon-emitting industries such as transport and energy more efficient. Sobey said: "I suspect that we don't need many very good use cases [of AI] to offset the energy being used on the front end."
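The Guardian's headline numbers can be sanity-checked with simple arithmetic. A minimal sketch (editorial arithmetic, not from De Vries-Gao's paper): annualising a continuous 23 GW draw and comparing it against the IEA's 415 TWh data-centre total lands very close to the 49% share cited above.

```python
# Sanity check (editorial arithmetic, not from the paper):
# does a 23 GW continuous AI draw square with "up to 49%" of the
# IEA's 415 TWh annual data-centre total?

HOURS_PER_YEAR = 8760

ai_power_gw = 23        # estimated AI power demand, end of 2025
datacentre_twh = 415    # IEA figure for all data centres, 2024 (ex-crypto)

ai_twh = ai_power_gw * HOURS_PER_YEAR / 1000  # GW * h -> GWh; /1000 -> TWh
share = ai_twh / datacentre_twh

print(f"AI annualised: {ai_twh:.0f} TWh")   # ~201 TWh
print(f"Share of 2024 total: {share:.0%}")  # ~49%
```

The two figures are mutually consistent, though the real end-of-2025 share would be computed against a data-centre total that is itself growing.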
[4]
Report: Creating a 5-second AI video is like running a microwave for an hour
You've probably heard the statistic that every search on ChatGPT uses the equivalent of a bottle of water. And while that's technically true, it misses some of the nuance. The MIT Technology Review dropped a massive report that reveals how the artificial intelligence industry uses energy -- and exactly how much it costs to use a service like ChatGPT.

The report determined that large language models like ChatGPT cost anywhere from 114 joules per response to 6,706 joules per response -- the difference between running a microwave for one-tenth of a second and running it for eight seconds. The lower-energy models, according to the report, use less energy because they use fewer parameters, which also means their answers tend to be less accurate.

It makes sense, then, that AI-produced video takes a whole lot more energy. According to the MIT Technology Review's investigation, to create a five-second video, a newer AI model uses "about 3.4 million joules, more than 700 times the energy required to generate a high-quality image". That's the equivalent of running a microwave for over an hour.

The researchers tallied up the amount of energy it would cost if someone, hypothetically, asked an AI chatbot 15 questions, requested 10 images, and generated three five-second videos. The answer? Roughly 2.9 kilowatt-hours of electricity, the equivalent of running a microwave for over 3.5 hours.

The investigation also examined the rising energy costs of the data centers that power the AI industry. The report found that prior to the advent of AI, the electricity usage of data centers was largely flat thanks to increased efficiency. However, due to energy-intensive AI technology, the energy consumed by data centers in the United States has doubled since 2017. And according to government data, half the electricity used by data centers will go toward powering AI tools by 2028.

This report arrives at a time when people are using generative AI for absolutely everything. Google announced at its annual I/O event that it's leaning into AI with fervor. Google Search, Gmail, Docs, and Meet are all seeing AI integrations. People are using AI to lead job interviews, create deepfakes of OnlyFans models, and cheat in college. And all of that, according to this in-depth new report, comes at a pretty high cost.
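The microwave comparisons that recur throughout these reports are easy to recompute. Here is a minimal sketch, assuming an appliance rated around 1,000 watts (my assumption; the articles' own equivalences imply something in the 850-1,000 W range):

```python
# Convert per-response energy figures into "microwave seconds"
# (editorial sketch; the ~1,000 W microwave rating is an assumption
# that roughly matches the articles' comparisons).

MICROWAVE_WATTS = 1000  # assumed appliance power draw

def microwave_seconds(joules: float) -> float:
    """Seconds a microwave would run on the same energy (J / W = s)."""
    return joules / MICROWAVE_WATTS

for label, joules in [
    ("Llama 3.1 8B text response", 114),
    ("Large-model text response", 6_706),
    ("Five-second AI video", 3_400_000),
]:
    print(f"{label}: {microwave_seconds(joules):,.1f} s")
# ~0.1 s, ~6.7 s, and ~3,400 s (roughly an hour), respectively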
[5]
How Much Electricity It Actually Takes to Use AI May Surprise You
By now, most of us should be vaguely aware that artificial intelligence is hungry for power. Even if you don't know the exact numbers, the charge that "AI is bad for the environment" is well documented, bubbling up from sources ranging from the mainstream press to pop-science YouTube channels to tech trade media.

Still, the AI industry as we know it today is young. Though startups and big tech firms have been plugging away on large language models (LLMs) since the 2010s, the release of consumer generative AI in late 2022 brought about a huge increase in AI adoption, leading to an unprecedented "AI boom." In under three years, AI has come to dominate global tech spending in ways researchers are just starting to quantify. In 2024, for example, AI companies nabbed 45 percent of all US venture capital tech investments, up from only nine percent in 2022. Medium-term, big-name consulting firms like McKinsey expect AI infrastructure spending to grow to $6.7 trillion by 2030; compare this to just $450 billion in 2022.

Even so, research on AI's climate and environmental impacts can seem vague and scattered, as analysts race to establish concrete environmental trends amid the extraordinary explosion of the AI industry. A new survey by MIT Technology Review is trying to change that. The authors spoke to two dozen AI experts working to uncover the tech's climate impact, combed "hundreds of pages" of data and reports, and probed the top developers of LLM tools in order to provide a "comprehensive look" at the industry's impact. "Ultimately, we found that the common understanding of AI's energy consumption is full of holes," the authors wrote.

That led them to start small, looking at the energy use of a single LLM query. Beginning with text-based LLMs, they found that model size directly predicted energy demand, as bigger LLMs use more chips -- and therefore more energy -- to process questions. While smaller models like Meta's Llama 3.1 8B used roughly 57 joules per response (or 114 joules when the authors factored in cooling power and other energy needs), larger models needed 3,353 joules (or 6,706), or in MIT Tech's point of reference, enough to run a microwave for eight seconds.

Image-generating AI models, like Stable Diffusion 3 Medium, needed 1,141 joules (or 2,282) on average to spit out a standard 1024 x 1024 pixel image -- the type rapidly strangling the internet. Doubling the quality of the image roughly doubles the energy use, to 4,402 joules, worth over five seconds of microwave warming time, still less than the largest language bot.

Video generation is where the sparks really start flying. The lowest-quality AI video software, a nine-month-old version of CogVideoX, took an eye-watering 109,000 joules to spew out a low-quality, 8fps film -- "more like a GIF than a video," the authors noted. Better models use a lot more. With a recent update, that same tool takes 3.4 million joules to spit out a five-second, 16fps video, equivalent to running a microwave for over an hour.

Whether any of those numbers amount to a lot or a little is open to debate. Running the microwave for a few seconds isn't much, but if everybody starts doing so hundreds of times a day -- or, in the case of video, for hours at a time -- it will make a huge impact on the world's power consumption. And of course, the AI industry is currently trending toward models that use more power, not less. Zooming out, the MIT Tech survey also highlights some concerning trends.
One is the overall rise in power use correlating with the rise of AI. While data center power use remained mostly steady across the US between 2005 and 2017, consumption had doubled by 2023, our first full year with mass-market AI. As of 2024, 4.4 percent of all energy consumed in the US went toward data centers. Meanwhile, data centers' carbon intensity -- the amount of iceberg-melting exhaust spewed relative to energy used -- became 48 percent higher than the US average.

All that said, the MIT authors have a few caveats. First, we can't look under the hood at closed-source AI models like OpenAI's ChatGPT, and most of the leading AI titans have declined to join good-faith climate-mapping initiatives like AI Energy Score. Until that changes, any attempt to map such a company's climate impact is a stab in the dark at best.

In addition, the survey's writers note that data centers are not inherently bad for the environment. "If all data centers were hooked up to solar panels and ran only when the Sun was shining, the world would be talking a lot less about AI's energy consumption," they wrote. But unfortunately, "that's not the case." In countries like the US, the energy grid used to power data centers is still heavily reliant on fossil fuels, and surging demand for immediate energy is only making that worse. For example, the authors point to Elon Musk's xAI data center outside of Memphis, which is using 35 methane gas generators to keep its chips humming rather than wait for approval to draw from the civilian power grid.

Unless the industry is made to adopt strategies to mitigate AI's climate impact -- like those outlined in the Paris AI Action Declaration -- this will just be the beginning of a devastating rise in climate-altering emissions.
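To see why the energy mix matters so much for emissions, consider what the hypothetical 2.9 kWh daily habit translates to under different grids. The sketch below uses illustrative emissions factors that are my own assumptions, not figures from the MIT report:

```python
# Illustrative CO2 estimate (editorial sketch; the per-grid emissions
# factors below are assumptions for illustration, not MIT report data).

daily_kwh = 2.9  # the report's hypothetical daily AI habit

grid_gco2_per_kwh = {
    "coal-heavy grid":    900,  # assumed factor
    "US-average-like":    370,  # assumed factor
    "mostly renewables":   50,  # assumed factor
}

for grid, factor in grid_gco2_per_kwh.items():
    print(f"{grid}: ~{daily_kwh * factor:,.0f} g CO2/day")
```

The same electricity use spans roughly an order of magnitude in emissions depending on the grid, which is why the report treats location and time of day as first-order variables.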
New research reveals an alarming rise in energy consumption by AI systems, whose demand could double by year-end to nearly half of all data center power use, posing significant challenges to tech companies' climate goals.
Artificial Intelligence (AI) is rapidly becoming a major consumer of electricity, with new research indicating that its energy demand could double by the end of this year. According to a study published in the journal Joule by Alex de Vries-Gao, founder of Digiconomist, AI's energy use already represents up to 20% of global data-center power demand 1. This surge in consumption is poised to surpass the energy demand from bitcoin mining, highlighting the escalating environmental impact of AI technologies.
Researchers have attempted to quantify the energy consumption of various AI tasks:
Text-based AI: Smaller models like Meta's Llama 3.1 8B use about 114 joules per response, while larger models can consume up to 6,706 joules - equivalent to running a microwave for eight seconds 2.
Image generation: Creating a standard 1024x1024 pixel image requires about 2,282 joules 5.
Video generation: Producing a five-second, 16fps video consumes a staggering 3.4 million joules - comparable to running a microwave for over an hour 2.
The rise of AI is dramatically affecting data center energy consumption. After years of relatively stable power usage, data centers in the United States have seen their electricity consumption double since 2017 4. This trend is expected to continue, with the International Energy Agency (IEA) forecasting that data centers' electricity consumption will grow to more than 900 TWh by the end of the decade 1.
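For scale, the IEA figures above imply a steep but steady growth rate. A minimal sketch of the arithmetic (editorial, not an IEA projection methodology):

```python
# Implied compound annual growth rate from the IEA figures cited above
# (editorial arithmetic, not an IEA methodology).
start_twh, end_twh, years = 415, 900, 6   # 2024 -> 2030
cagr = (end_twh / start_twh) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")        # ~13.8% per year
```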
The surge in energy demand is complicating tech giants' climate objectives. Google, for instance, has reported a 48% increase in greenhouse gas emissions since 2019, largely attributed to AI integration 1. Microsoft and other major tech companies have also acknowledged that their AI initiatives are endangering their ability to meet internal environmental targets 3.
Accurately assessing AI's energy consumption remains challenging due to several factors:
Lack of transparency: Many companies, including OpenAI and Google, keep details about their AI models' size and reach private 2.
Varying energy sources: The mix of renewable and non-renewable energy sources used by data centers differs by location and time of day 2.
Efficiency techniques: Practices like prompt caching can reduce energy consumption but are not always accounted for in estimates 2.
As AI adoption continues to grow, its energy consumption is expected to increase further. By 2028, it's estimated that half the electricity used by data centers will go toward powering AI tools 4. This trend raises concerns about the industry's carbon footprint and its impact on global climate goals.
To address these challenges, experts suggest:
Increased transparency: the EU AI Act requires companies to disclose the energy consumed in training a model, though not in day-to-day use; experts such as the Alan Turing Institute's Adam Sobey argue broader disclosure is needed 3.
Efficiency improvements: Innovations in AI models and hardware could reduce computational and energy costs 3.
Renewable energy adoption: Transitioning data centers to renewable energy sources could significantly reduce their environmental impact 5.
As the AI industry continues to evolve, balancing technological advancement with environmental responsibility will be crucial for ensuring a sustainable future.