10 Sources
[1]
AI Is Eating Data Center Power Demand -- and It's Only Getting Worse
A new analysis of AI hardware being produced and how it is being used attempts to estimate the vast amount of electricity being consumed by AI. AI's energy use already represents as much as 20 percent of global data-center power demand, research published Thursday in the journal Joule shows. That demand from AI, the research states, could double by the end of this year, comprising nearly half of total data-center electricity consumption worldwide, excluding the electricity used for bitcoin mining.

The new research is published in a commentary by Alex de Vries-Gao, the founder of Digiconomist, a research company that evaluates the environmental impact of technology. De Vries-Gao started Digiconomist in the late 2010s to explore the impact that bitcoin mining, another extremely energy-intensive activity, would have on the environment. Looking at AI, he says, has grown more urgent over the past few years because of the widespread adoption of ChatGPT and other large language models that use massive amounts of energy. According to his research, worldwide AI energy demand is now set to surpass demand from bitcoin mining by the end of this year. "The money that bitcoin miners had to get to where they are today is peanuts compared to the money that Google and Microsoft and all these big tech companies are pouring in [to AI]," he says. "This is just escalating a lot faster, and it's a much bigger threat."

The development of AI is already having an impact on Big Tech's climate goals. Tech giants have acknowledged in recent sustainability reports that AI is largely responsible for driving up their energy use. Google's greenhouse gas emissions, for instance, have increased 48 percent since 2019, complicating the company's goal of reaching net zero by 2030. "As we further integrate AI into our products, reducing emissions may be challenging due to increasing energy demands from the greater intensity of AI compute," Google's 2024 sustainability report reads.

Last month, the International Energy Agency released a report finding that data centers made up 1.5 percent of global energy use in 2024 -- around 415 terawatt-hours, a little less than the yearly energy demand of Saudi Arabia. This number is only set to get bigger: data centers' electricity consumption has grown four times faster than overall consumption in recent years, while the amount of investment in data centers has nearly doubled since 2022, driven largely by massive expansions to account for new AI capacity. Overall, the IEA predicted that data center electricity consumption will grow to more than 900 TWh by the end of the decade.

But there are still a lot of unknowns about the share of current data-center electricity use that AI, specifically, accounts for. Data centers power a variety of services -- like hosting cloud services and providing online infrastructure -- that aren't necessarily linked to the energy-intensive activities of AI. Tech companies, meanwhile, largely keep the energy expenditure of their software and hardware private. Some attempts to quantify AI's energy consumption have started from the user side: calculating the amount of electricity that goes into a single ChatGPT search, for instance. De Vries-Gao decided to look, instead, at the supply chain, starting from the production side to get a more global picture.
[2]
AI energy usage hard to measure, but MIT Tech Review tried
A single person with a serious AI habit may chew through enough electricity each day to keep a microwave running for more than three hours. And the actual toll may even be worse, as so many companies keep details about their AI models secret. Speaking to two dozen researchers tracking AI energy consumption, and conducting experiments of its own, the MIT Technology Review concluded that the total energy and climate toll is incredibly difficult to get a handle on. But that didn't stop them from trying.

Working with researchers from Hugging Face, the authors of the report determined that a single query to the open-source Llama 3.1 8B model used around 57 joules of energy to generate a response (the "8B" means the model has 8 billion parameters). When accounting for cooling and other energy demands, the report said that number should be doubled, bringing a single query on that model to around 114 joules - equivalent to running a microwave for around a tenth of a second. A larger model, like Llama 3.1 405B, needs around 6,706 joules per response - eight seconds of microwave usage. In other words, the size of a particular model plays a huge role in how much energy it uses. Although its true size is a mystery, OpenAI's GPT-4 is estimated to have well over a trillion parameters, meaning its per-query energy footprint is likely far higher than the Llama queries tested.

It's also worth pointing out that those figures are for text-based responses. AI-generated images actually use considerably less energy than text responses, thanks to their smaller model size and the fact that diffusion is more energy efficient than autoregressive text generation, MIT TR noted. AI video generation, on the other hand, is an energy sinkhole. To generate a five-second video at 16 frames per second, the CogVideoX AI video generation model consumes a whopping 3.4 million joules of energy - equivalent to running a microwave for an hour or riding 38 miles on an e-bike, Hugging Face researchers told the Tech Review. "It's fair to say that the leading AI video generators, creating dazzling and hyperrealistic videos up to 30 seconds long, will use significantly more energy," the report noted.

Using that data, the authors compiled an estimate of the daily AI energy consumption of someone with a habit of leaning on generative models for various tasks. Fifteen questions, ten attempts at generating an image, and three tries at making an Instagram-ready five-second video would eat up an estimated 2.9 kWh of electricity - three and a half hours of microwave usage. Hundreds of millions of people around the world use ChatGPT each week, OpenAI estimates.

The researchers focused on open-source LLMs that we know a lot about. But companies like OpenAI and Google keep the size and reach of their models hidden from the public, and that seriously hampers accurate energy usage estimates. When it comes to measuring CO2 emissions, the AI picture gets even more complicated, the Tech Review article notes. The mixture of renewable and non-renewable energy sources varies widely by location and time of day (solar isn't used at night, for instance). The report also didn't touch on prompt caching, a technique commonly used by generative models to store responses and feed them back to users asking the same or similar questions, which can reduce energy consumption for AI models. Regardless of those caveats, one thing is for sure: a lot of energy is being consumed to power the world's growing AI habit.
Not only that, but a good portion of it is spewing carbon into the atmosphere for what is arguably questionable usefulness. As the Tech Review report pointed out, the current spike in datacenter energy usage follows years of relatively flat consumption thanks to steady workloads and ever-increasing efficiency. Datacenters ate up more than a fifth of the electricity in Ireland in 2023. The global energy consumption of datacenters is predicted to more than double from its current rate by 2030, surpassing the energy consumption of the entire nation of Japan by the start of the next decade. AI, naturally, is the largest driver of that increase.

There has been a lot of lip service paid to going green by tech companies over the years, who've long assured the public that their bit barns aren't an environmental threat. But now that AI's in the picture, the professed net-zero goals of tech giants like Microsoft and Google are receding into the distance. We've covered this a lot at The Register of late, and our reporting largely aligns with what the MIT Technology Review report concluded: the energy behind AI is far dirtier than tech companies would like us to believe.

Overall, datacenters are predicted to emit 2.5 billion tons of greenhouse gases by the end of the decade. That's three times more than they would have if generative AI hadn't become the latest craze. To add insult to apocalypse, those numbers rest on a shaky data foundation, as the Tech Review report noted. "This leaves even those whose job it is to predict energy demands forced to assemble a puzzle with countless missing pieces, making it nearly impossible to plan for AI's future impact on energy grids and emissions," they said.
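The microwave comparisons that run through these reports are simple unit conversions. Below is a minimal sketch of the arithmetic, assuming a roughly 1,000-watt microwave (the reports' own comparisons imply something in the 800-1,000 W range; the exact rating is our assumption) and following the report's practice of doubling measured GPU energy to cover cooling and other overhead:

```python
# Convert per-query GPU energy to "seconds of microwave time", following the
# report's method of doubling measured energy for cooling and other overhead.
# The 1,000 W microwave rating is an assumption for illustration.

MICROWAVE_WATTS = 1_000   # assumed typical microwave power draw
OVERHEAD_FACTOR = 2.0     # report doubles GPU-only energy for cooling etc.

def microwave_seconds(total_joules: float) -> float:
    """Energy in joules expressed as seconds of microwave runtime (W = J/s)."""
    return total_joules / MICROWAVE_WATTS

# GPU-only measurements, doubled per the report; the video figure is
# already the all-in number quoted above.
llama_8b = 57 * OVERHEAD_FACTOR        # -> 114 J
llama_405b = 3_353 * OVERHEAD_FACTOR   # -> 6,706 J
video_5s = 3.4e6                       # CogVideoX five-second video

for label, joules in [("Llama 3.1 8B query", llama_8b),
                      ("Llama 3.1 405B query", llama_405b),
                      ("5-second video", video_5s)]:
    print(f"{label}: {joules:,.0f} J = {microwave_seconds(joules):,.1f} s of microwave time")
```

At 1,000 W this yields about 0.1 s, 6.7 s, and 57 minutes respectively; the slightly longer times quoted in the report simply imply a lower assumed microwave wattage.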
[3]
You'll be as annoyed as me when you learn how much energy a few seconds of AI video costs
Data center energy use has doubled since 2017, and AI will account for half of it by 2028.

It only takes a few minutes in a microwave to explode a potato you haven't ventilated, but it takes as much energy as running that microwave for over an hour, and more than a dozen potato explosions, for an AI model to make a five-second video of a potato explosion. A new study from MIT Technology Review has laid out just how hungry AI models are for energy. A basic chatbot reply might use as little as 114 or as much as 6,700 joules, between a fraction of a second and eight seconds in a standard microwave, but it's when things get multimodal that the energy costs skyrocket to an hour-plus in the microwave, or 3.4 million joules.

It's not a new revelation that AI is energy-intensive, but MIT's work lays out the math in stark terms. The researchers devised what might be a typical session with an AI chatbot, where you ask 15 questions, request 10 AI-generated images, and throw in requests for three different five-second videos. You can see a realistic fantasy movie scene that appears to be filmed in your backyard a minute after you ask for it, but you won't notice the enormous amount of electricity you've demanded to produce it. You've requested roughly 2.9 kilowatt-hours, or three and a half hours of microwave time.

What makes the AI costs stand out is how painless it feels from the user's perspective. You're not budgeting AI messages like we all did with our text messages 20 years ago. Sure, you're not mining bitcoin, and your video at least has some real-world value, but that's a really low bar to step over when it comes to ethical energy use.

The rise in energy demands from data centers is also happening at a ridiculous pace. Data centers had plateaued in their energy use before the recent AI explosion, thanks to efficiency gains. However, the energy consumed by data centers has doubled since 2017, and around half of it will be for AI by 2028, according to the report.

This isn't a guilt trip, by the way. I can claim professional demands for some of my AI use, but I've employed it for all kinds of recreational fun and to help with personal tasks, too. I'd write an apology note to the people working at the data centers, but I would need AI to translate it into the languages spoken in some of the data center locations. And I don't want to sound heated, or at least not as heated as those same servers get. Some of the largest data centers use millions of gallons of water daily to stay frosty.

The developers behind the AI infrastructure understand what's happening. Some are trying to source cleaner energy options; Microsoft is looking to make a deal with nuclear power plants. AI may or may not be integral to our future, but I'd like it if that future isn't full of extension cords and boiling rivers. On an individual level, your use or avoidance of AI won't make much of a difference, but encouraging better energy solutions from the data center owners could. The most optimistic outcome is developing more energy-efficient chips, better cooling systems, and greener energy sources. And maybe AI's carbon footprint should be discussed like any other energy infrastructure, like transportation or food systems. If we're willing to debate the sustainability of almond milk, surely we can spare a thought for the 3.4 million joules it takes to make a five-second video of a dancing cartoon almond.
As tools like ChatGPT, Gemini, and Claude get smarter, faster, and more embedded in our lives, the pressure on energy infrastructure will only grow. If that growth happens without planning, we'll be left trying to cool a supercomputer with a paper fan while we chew on a raw potato.
[4]
AI could account for nearly half of data centre power usage 'by end of year'
Artificial intelligence systems are forecast to require almost as much energy by the end of the decade as Japan uses today.

Artificial intelligence systems could account for close to half of data centre power consumption by the end of this year, according to new analysis. The estimates by Alex de Vries-Gao, the founder of the Digiconomist tech sustainability website, came as the International Energy Agency (IEA) forecast that AI would require almost as much energy by the end of this decade as Japan uses today.

De Vries-Gao's calculations, to be published in the sustainable energy journal Joule, are based on the power consumed by chips made by Nvidia and Advanced Micro Devices that are used to train and operate AI models. The paper also takes into account the energy consumption of chips used by other companies, such as Broadcom. The IEA estimates that all data centres - excluding those mining cryptocurrencies - consumed 415 terawatt hours (TWh) of electricity last year. De Vries-Gao argues in his research that AI could already account for 20% of that total. He said a number of variables came into his calculations, such as the energy efficiency of a data centre and the electricity consumed by cooling systems for servers handling an AI system's busy workloads.

Data centres are the central nervous system of AI technology, with their high energy demands making sustainability a key concern in the development and use of artificial intelligence systems. By the end of 2025, De Vries-Gao estimates, energy consumption by AI systems could reach up to 49% of total data centre power consumption, again excluding crypto mining. AI consumption could reach 23 gigawatts (GW), the research estimates, twice the total energy consumption of the Netherlands.

However, De Vries-Gao said a number of factors could lead to a slowdown in hardware demand, such as waning demand for applications like ChatGPT. Another issue could be geopolitical tensions resulting in constraints on producing AI hardware, such as export controls. De Vries-Gao cites the example of barriers on Chinese access to chips, which contributed to the release of the DeepSeek R1 AI model that reportedly used fewer chips. De Vries-Gao said: "These innovations can reduce the computational and energy costs of AI." But he said any efficiency gains could encourage even more AI use. Multiple countries attempting to build their own AI systems - a trend known as "sovereign AI" - could also increase hardware demand.

De Vries-Gao also pointed to a US data centre startup, Crusoe Energy, securing 4.5GW of gas-powered energy capacity for its infrastructure, with the ChatGPT developer OpenAI among the potential customers through its Stargate joint venture. "There are early indications that these [Stargate] data centres could exacerbate dependence on fossil fuels," writes De Vries-Gao. On Thursday OpenAI announced the launch of a Stargate project in the United Arab Emirates, the first outside the US. Microsoft and Google admitted last year that their AI drives were endangering their ability to meet internal environmental targets.

De Vries-Gao said information on AI's power demands had become increasingly scarce, with the analyst describing it as an "opaque industry". The EU AI Act requires AI companies to disclose the energy consumption behind training a model, but not for its day-to-day use.
Prof Adam Sobey, the mission director for sustainability at the UK's Alan Turing Institute, an AI research body, said more transparency was needed on how much energy is consumed by artificial intelligence systems - and how much they could potentially save by helping make carbon-emitting industries such as transport and energy more efficient. Sobey said: "I suspect that we don't need many very good use cases [of AI] to offset the energy being used on the front end."
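De Vries-Gao's supply-side method starts from the hardware rather than from individual queries: take the accelerators being produced, their power draw, and assumptions about how hard and how efficiently they run. The sketch below shows the shape of such a calculation; every input value is a hypothetical placeholder, not a figure from his paper:

```python
# Schematic version of a supply-side estimate: the power demand implied by an
# installed base of AI accelerators. All numbers below are illustrative
# placeholders, not De Vries-Gao's actual inputs.

def ai_power_demand_gw(units_installed: float,
                       board_power_w: float,
                       utilization: float,
                       pue: float) -> float:
    """Continuous power draw (GW) implied by a fleet of AI accelerators."""
    watts = units_installed * board_power_w * utilization * pue
    return watts / 1e9  # watts -> gigawatts

# Hypothetical example: 4 million accelerators at 700 W each, running at
# 65% average utilization in facilities with a PUE of 1.25.
print(f"{ai_power_demand_gw(4e6, 700, 0.65, 1.25):.1f} GW")  # -> 2.3 GW
```

The appeal of this framing is that chip production volumes and board power ratings are easier to observe from the outside than the query traffic of any individual, closed AI service.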
[5]
Why don't AI companies talk about energy usage?
If you've used ChatGPT recently (and statistically, you probably have) you're part of a global trend. OpenAI's chatbot is estimated to be the fifth most visited website in the world, with more than 400 million users a week. And that's just one AI tool. As generative AI becomes embedded into apps, search engines, workplaces and daily habits, our interactions with large language models (LLMs) like ChatGPT, Google Gemini and Claude are only increasing.

We've become more aware of AI's risks, from misinformation and deepfakes to surveillance and emotional dependence. But one of the biggest is AI's environmental impact. Running LLMs requires enormous amounts of electricity and water. These models consume energy not just during training, when they absorb and organize vast volumes of data, but every time you ask a question. That's billions of queries a day, each one demanding computational power and adding to a growing environmental cost.

The truth is, we don't know how much energy AI really uses, and that's a big problem. Unlike most industries, AI companies aren't required to report the environmental footprint of their models. There's no standardized regulation or reporting framework in place for the energy use or carbon emissions tied specifically to AI systems. There are a few reasons for that. First, the technology is still relatively new, so the infrastructure for this sort of regulation and reporting hasn't caught up. But tech companies also haven't pushed for it. That's partly because AI is a fiercely competitive space, which means that sharing energy data could inadvertently reveal details about a model's size, architecture or efficiency. It's also technically difficult: AI systems are spread across vast server farms, multiple teams and shared infrastructure, which makes it hard to isolate and track usage. Then there's the optics. Companies heavily invested in the narrative that AI will only do us all vast amounts of good don't want to be linked to sky-high emissions or the guzzling of finite resources.

So, with little transparency, researchers and journalists are left to estimate. And those estimates are alarming. Many credible estimates have been made over the past few years, but a recent report from MIT Technology Review offers one of the clearest pictures yet of AI's growing appetite for electricity and water. The report is filled with striking comparisons. For example, generating a 5-second AI video might use as much energy as running a microwave for an hour. Even simple chatbot replies can vary widely in energy consumption: one estimate puts a basic reply at anywhere between 114 and 6,700 joules, equivalent to running a microwave for between a fraction of a second and eight seconds. But as tasks become more complex - like those that involve images or video - the energy cost rises dramatically.

According to the report, the bigger picture is even more concerning. In 2024, US data centers consumed around 200 terawatt-hours of electricity, which is roughly the same as Thailand's entire annual consumption. And that number is climbing fast. By 2028, researchers estimate that AI-related electricity use alone could reach up to 326 terawatt-hours per year. That's more than all current data center usage in the US, and enough to power more than 22% of American households annually. In carbon terms, that's equivalent to driving more than 300 billion miles, which works out at about 1,600 round trips to the sun. It's not just about power either.
AI infrastructure also consumes vast amounts of water, primarily for cooling. In some regions, this adds strain to already stretched water supplies, which is a serious concern during heatwaves and droughts.

Experts say that one of the biggest challenges here is scale. Even if we had precise figures today, we'd still be underestimating the problem in a year or even a month's time. That's because the way we use AI is evolving rapidly. Generative models are being built into everyday tools, from writing apps and customer service bots to photo-editing software and search engines. As this adoption accelerates without a clear understanding of the costs, the environmental impact is likely to spiral much, much faster than we ever expected.

The good news is there is growing momentum to make AI more accountable for its environmental footprint. But right now, transparency is the exception, not the rule. The Green Software Foundation (GSF), a global non-profit organization whose members include Microsoft, Cisco, Siemens, Google and other companies, is one of the groups leading the charge. Through its Green AI Committee, the GSF is developing sustainability standards designed specifically for AI. This includes lifecycle carbon accounting, open-source tools for tracking energy usage and real-time carbon intensity metrics, all aimed at making AI's environmental impact measurable, reportable and (hopefully) manageable.

Policy frameworks are also taking shape in some regions. The EU's AI Act encourages sustainability through risk assessments, while the UK's AI Opportunities Action Plan and the British Standards Institution (BSI) are creating technical guidance on how to measure and report AI's carbon footprint. These are early steps, but they could help to inform future regulation. Some AI companies are taking steps in the right direction too, investing in renewable energy, researching more efficient training methods and developing improved cooling infrastructure. But these improvements aren't yet standard across the industry, and there's still no broadly accepted approach.

That's why transparency matters. Without clear and open data about how much energy these systems consume, we can't accurately assess the cost of AI or hold the right companies accountable. We certainly can't build more sustainable policy or infrastructure around it either. Tech companies can't keep asking us to trust in the future of AI while hiding the true cost of running it.
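The household comparison above is easy to sanity-check. A minimal sketch, assuming round public figures for US household electricity use (roughly 10,800 kWh per household per year across about 132 million households; both values are our assumptions, not figures from the report):

```python
# Sanity check: can 326 TWh/year really power more than 22% of US households?
# Household figures below are rough public estimates assumed for illustration.

AI_ELECTRICITY_TWH = 326          # projected AI-related use by 2028
KWH_PER_HOUSEHOLD_YEAR = 10_800   # assumed average US household consumption
US_HOUSEHOLDS = 132_000_000       # assumed number of US households

households_powered = AI_ELECTRICITY_TWH * 1e9 / KWH_PER_HOUSEHOLD_YEAR
share = households_powered / US_HOUSEHOLDS
print(f"~{households_powered / 1e6:.0f} million households ({share:.0%})")
# -> ~30 million households (23%), consistent with "more than 22%"
```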
[6]
Report: Creating a 5-second AI video is like running a microwave for an hour
You've probably heard that statistic that every search on ChatGPT uses the equivalent of a bottle of water. And while that's technically true, it misses some of the nuance. The MIT Technology Review dropped a massive report that reveals how the artificial intelligence industry uses energy -- and exactly how much energy it costs to use a service like ChatGPT.

The report determined that large language models like ChatGPT cost anywhere from 114 joules per response to 6,706 joules per response -- the difference between running a microwave for one-tenth of a second and running it for eight seconds. The lower-energy models, according to the report, use less energy because they use fewer parameters, which also means the answers tend to be less accurate. It makes sense, then, that AI-produced video takes a whole lot more energy. According to the MIT Technology Review's investigation, to create a five-second video, a newer AI model uses "about 3.4 million joules, more than 700 times the energy required to generate a high-quality image". That's the equivalent of running a microwave for over an hour.

The researchers tallied up the amount of energy it would cost if someone, hypothetically, asked an AI chatbot 15 questions, asked for 10 images, and requested three five-second videos. The answer? Roughly 2.9 kilowatt-hours of electricity, which is the equivalent of running a microwave for over 3.5 hours.

The investigation also examined the rising energy costs of the data centers that power the AI industry. The report found that prior to the advent of AI, the electricity usage of data centers was largely flat thanks to increased efficiency. However, due to energy-intensive AI technology, the energy consumed by data centers in the United States has doubled since 2017. And according to government data, half the electricity used by data centers will go toward powering AI tools by 2028.

This report arrives at a time in which people are using generative AI for absolutely everything. Google announced at its annual I/O event that it's leaning into AI with fervor. Google Search, Gmail, Docs, and Meet are all seeing AI integrations. People are using AI to lead job interviews, create deepfakes of OnlyFans models, and cheat in college. And all of that, according to this in-depth new report, comes at a pretty high cost.
[7]
AI's energy appetite is real -- but so is its climate potential
AI is guzzling energy. Scientists estimate North American data centers' power requirements increased nearly 100% from 2022 to 2023, largely driven by generative AI (GenAI). By 2026, they anticipate that data centers will become the fifth-largest electricity consumer in the world, exceeding the usage of most countries. However, speculation about AI's detrimental effects on the environment might be overblown. For many corporations, particularly those that produce or sell physical goods, AI technology makes up only a small portion of their overall emissions. Oversimplifying AI as "carbon-intensive" diverts attention from its impactful sustainability opportunities. When used wisely, AI has the potential to offset its own footprint and actively contribute to a greener future.

AI's carbon output is primarily measured through data center energy consumption. These algorithms, especially GenAI, require significant computational power for training and operation. As usage grows, so does the electricity drain. These impacts are significant. However, the belief that AI will remain an exponential data hog ignores the rapid pace of innovation in model design, hardware and deployment, and the transition to renewable energy. Today's algorithms are likely the most inefficient they will ever be. Techniques like model distillation are becoming more prevalent, creating smaller, more energy-conscious models, and manufacturers are designing more energy-efficient AI chips. Additionally, the energy grid is getting greener, translating to fewer emissions from data centers. Consider these factors:

- According to the World Resources Institute, renewables outpaced other energy generation sources, accounting for 90% of the United States' new installed capacity in 2024.
- The International Renewable Energy Agency states that more than 80% of renewable capacity additions produce cheaper electricity than fossil fuel alternatives.
- BloombergNEF reported that more than 40% of the world's electricity came from zero-carbon sources in 2023.

Major companies, including Google, Microsoft and Amazon, are investing in clean energy to power their growing data centers. Experts predict that economics alone could drive renewables to account for 50% of electricity by the end of the decade. Meaningful government policies could accelerate that transition. This momentum makes me optimistic that we can mitigate the environmental impacts of AI use.

AI emissions also attract significant attention because they are easy to track. Unlike the complex, fragmented emissions from manufacturing and global supply chains, AI's carbon footprint stems primarily from data centers, which are fixed physical locations with measurable electricity consumption. This creates clear accountability, as we can directly attribute these emissions to specific technology providers and data center operators. AI's traceability can skew public and corporate attention toward it over other, potentially more significant sources of emissions that are harder to quantify. For many companies, addressing only AI emissions is a drop in the bucket. To make meaningful progress on climate goals, organizations must work to reduce carbon emissions across all business operations, including their value chain. Focusing solely on AI's carbon footprint misses the opportunity to unlock new reduction and efficiency opportunities. Efficiency improvements, often the first step in corporate decarbonization, can be amplified through AI.
For example, predictive maintenance prevents energy-wasting malfunctions and extends equipment's life span. Optimizing logistics and supply chains reduces transportation distances and fuel consumption. Intelligently adjusting energy consumption, distribution and storage can maximize efficiency and resource utilization -- all while minimizing costs.

AI is also a powerful enabler for sustainability professionals. AI can support routine tasks like data collection, reporting and drafting communications so teams with limited resources can focus on impactful strategic efforts. These benefits extend to more complex sustainability initiatives, like supply chain decarbonization. AI-powered solutions can inform business planning by aggregating and analyzing supplier data at scale. Teams can quickly detect trends, highlight emissions hotspots and track progress to prioritize action on the most urgent and impactful reduction opportunities. For example, rather than focusing on broad procurement policies, organizations can directly engage suppliers responsible for a disproportionate amount of emissions, resulting in more impactful reductions. Predictive modeling enables companies to forecast emissions trends, identify future risks and calculate the impacts of different decarbonization strategies for proactive, long-term business planning and supply chain resilience. As sustainability becomes more integrated across different business functions, AI will help organizations efficiently incorporate these initiatives into their everyday work.

AI will not solve climate issues on its own; it's a tool to amplify human efforts. Algorithms are only as good as the data they use. Emissions data -- especially from value chains -- can be sparse, inconsistent or incomplete. AI won't meaningfully fill the gaps, but it will guide teams in their decarbonization strategy. In addition, many AI models are black boxes. This lack of transparency poses a serious problem for emissions reporting, where auditability and traceability are essential. Auditors, investors and regulators need to see the underlying methodology. AI's conclusion may be accurate, but it can't be the foundation of reporting if companies can't explain it.

However, we can't let perfection be the enemy of good. If AI helps you do your job more effectively, and your job is helping decarbonize the planet, then use it. We can't discount AI entirely based on its carbon emissions. Every technology has tradeoffs; anyone in sustainability knows this fact all too well. Sustainability professionals should leverage AI's decarbonization potential while understanding the adverse effects. In the broader context of climate action, AI's energy demands are a challenge -- but not the biggest one we face.
[8]
How Much Electricity It Actually Takes to Use AI May Surprise You
By now, most of us should be vaguely aware that artificial intelligence is hungry for power. Even if you don't know the exact numbers, the charge that "AI is bad for the environment" is well-documented, bubbling from sources ranging from mainstream press to pop-science YouTube channels to tech trade media.

Still, the AI industry as we know it today is young. Though startups and big tech firms have been plugging away on large language models (LLMs) since the 2010s, the release of consumer generative AI in late 2022 brought about a huge increase in AI adoption, leading to an unprecedented "AI boom." In under three years, AI has come to dominate global tech spending in ways researchers are just starting to quantify. In 2024, for example, AI companies nabbed 45 percent of all US venture capital tech investments, up from only nine percent in 2022. Medium-term, big-name consulting firms like McKinsey expect AI infrastructure spending to grow to $6.7 trillion by 2030; compare this to just $450 billion in 2022. That being the case, research on AI's climate and environmental impacts can seem vague and scattered, as analysts race to establish concrete environmental trends in the extraordinary explosion of the AI industry.

A new survey by MIT Technology Review is trying to change that. The authors spoke to two dozen AI experts working to uncover the tech's climate impact, combed "hundreds of pages" of data and reports, and probed the top developers of LLM tools in order to provide a "comprehensive look" at the industry's impact. "Ultimately, we found that the common understanding of AI's energy consumption is full of holes," the authors wrote. That led them to start small, looking at the energy use of a single LLM query.

Beginning with text-based LLMs, they found that model size directly predicted energy demand, as bigger LLMs use more chips -- and therefore more energy -- to process questions. While smaller models like Meta's Llama 3.1 8B used roughly 57 joules per response (or 114 joules when the authors factored in cooling power and other energy needs), larger units needed 3,353 joules (or 6,706), or in MIT Tech's point of reference, enough to run a microwave for eight seconds.

Image-generating AI models, like Stable Diffusion 3 Medium, needed 1,141 joules (or 2,282) on average to spit out a standard 1024 x 1024 pixel image -- the type that are rapidly strangling the internet. Doubling the quality of the image roughly doubles the energy use, to 4,402 joules, worth over five seconds of microwave warming time, still less than the largest language bot.

Video generation is where the sparks really start flying. The lowest-quality AI video software tested, a nine-month-old version of the CogVideoX model (measured with the Code Carbon tracking tool), took an eye-watering 109,000 joules to spew out a low-quality, 8fps film -- "more like a GIF than a video," the authors noted. Better models use a lot more: with a recent update, that same model takes 3.4 million joules to spit out a five-second, 16fps video, equivalent to running a microwave for over an hour.

Whether any of those numbers amount to a lot or a little is open to debate. Running the microwave for a few seconds isn't much, but if everybody starts doing so hundreds of times a day -- or in the case of video, for hours at a time -- it'll make a huge impact on the world's power consumption. And of course, the AI industry is currently trending toward models that use more power, not less. Zooming out, the MIT Tech survey also highlights some concerning trends.
One is the overall rise in power use correlating with the rise of AI. While data center power use remained mostly steady across the US between 2005 and 2017, their power consumption doubled by 2023, our first full year with mass-market AI. As of 2024, 4.4 percent of all energy consumed in the US went toward data centers. Meanwhile, data centers' carbon intensity -- the amount of iceberg-melting exhaust spewed relative to energy used -- became 48 percent higher than the US average.

All that said, the MIT authors have a few caveats. First, we can't look under the hood at closed-source AI models like OpenAI's ChatGPT, and most of the leading AI titans have declined to join in on good-faith climate mapping initiatives like AI Energy Score. Until that changes, any attempt to map such a company's climate impact is a stab in the dark at best. In addition, the survey's writers note that data centers are not inherently bad for the environment. "If all data centers were hooked up to solar panels and ran only when the Sun was shining, the world would be talking a lot less about AI's energy consumption," they wrote. But unfortunately, "that's not the case."

In countries like the US, the energy grid used to power data centers is still heavily reliant on fossil fuels, and surging demand for immediate energy is only making that worse. For example, the authors point to Elon Musk's xAI data center outside of Memphis, which is using 35 methane gas generators to keep its chips humming rather than wait for approval to draw from the civilian power grid. Unless the industry is made to adopt strategies to mitigate AI's climate impact -- like those outlined in the Paris AI Action Declaration -- this will just be the beginning of a devastating rise in climate-altering emissions.
[9]
Is AI growing faster than we can power it? MIT study warns of explosive energy demands in a machine-led future
The digital intelligence boom has a dirty secret -- AI's growing carbon footprint. As billions of users interact with AI daily, data centres are guzzling fossil-fuel-powered electricity around the clock. Researchers caution that this surge could soon rival emissions from entire nations, raising serious concerns about AI's long-term sustainability in the face of accelerating climate change.

Artificial intelligence may be the sharpest tool in humanity's digital shed, but behind its sleek interface lies a growing climate conundrum. From helping us choose our next binge-worthy show to whispering sweet nothings as a virtual romantic partner, AI is rapidly becoming an inseparable part of everyday life. But what powers this "magic" comes with a carbon footprint big enough to leave scorch marks on the planet. A new investigation by MIT Technology Review has pulled back the curtain on the escalating environmental cost of AI -- and the findings are as alarming as they are eye-opening. Experts now suggest that what we're seeing today might just be the calm before a very energy-hungry storm.

For every chatbot reply or AI-generated painting, there's a surge of electricity flowing through data centres that rarely sleep. According to Professor Sajjad Moazeni, asking ChatGPT a single question may use 10 to 100 times more energy than sending an email. Multiply that by the billion messages ChatGPT receives daily, and you begin to understand the scale. And it doesn't stop there. OpenAI's Sam Altman admitted that even the "politeness" in our prompts costs tens of millions of dollars in operational expenses. AI systems like Google's Gemini and image generators that churn out 78 million images a day aren't just consuming bandwidth -- they're devouring energy.

The MIT report reveals that by 2028, over half of all electricity used by data centres could go directly into powering AI. That translates to between 165 and 326 terawatt-hours annually -- more electricity than all U.S. data centres currently use for everything. It's also enough to power nearly a quarter of all American homes. To put it in a wilder perspective: this energy use would emit the same carbon as 1,600 round trips from Earth to the Sun in a car. It's a statistic so surreal it almost feels fictional -- except it's not.

AI infrastructure isn't just greedy -- it's relentless. "AI data centres need constant power, 24-7, 365 days a year," said Rahul Mewawalla, CEO of Mawson Infrastructure Group. And despite the optimism around renewables, the bulk of that power still comes from fossil fuels. As AI adoption accelerates, so does the dependency on energy grids that are far from green. This has led to serious concerns from environmentalists. "It's not clear to us that the benefits of these data centres outweigh these costs," said Eliza Martin of Harvard's Environmental and Energy Law Program. "Why should we be paying for this infrastructure? Why should we be paying for their power bills?"

The AI revolution is pushing boundaries, but it's also pushing climate scientists to the brink of panic. With global warming already spinning out of control, the sudden explosion of energy-intensive AI tools adds a new layer to an already urgent crisis. If the trajectory continues unchecked, this hidden environmental tax may soon become too heavy to ignore. While AI may promise smarter futures, the question remains: at what cost? And perhaps more pressingly -- was today's AI energy footprint the smallest it will ever be?
If so, the future could be brighter for tech but bleaker for the planet.
[10]
Optimizing AI's Impact: How ICT Providers Can Lead in Sustainable Data Centre Energy Use
By Pinkesh Kotecha

Just a few years ago, artificial intelligence (AI) was mostly theoretical -- an intriguing idea with potential. Today, it's omnipresent -- shaping business decisions, automating services, and powering innovation at lightning speed. From large language models and generative tools to predictive analytics and autonomous systems, the global appetite for AI applications is insatiable, and so is their energy consumption. As AI gets smarter, our data centres work harder, and that's proving to be a major energy guzzler. Experts suggest that a 1 MW data centre powered by thermal energy can emit a staggering 8,760 metric tons of CO₂ annually, a figure that follows from straightforward arithmetic (see the sketch at the end of this article). As organizations scramble to unlock AI's competitive advantages, the environmental trade-offs are becoming too significant to ignore. In a recent EY survey of senior leaders across sectors, 64% expressed concern that increased AI usage may derail their sustainability and emissions goals -- a sign that the industry is approaching an inflection point.

The Telecom Regulatory Authority of India (TRAI) has urged a systemic shift, recommending that the Ministry of Power facilitate annual energy-banking provisions for data centres and data centre parks that generate renewable energy. Such incentives could accelerate the adoption of green power and reduce dependence on carbon-heavy grids. Moving ahead, as generative AI consumes more power than traditional AI tasks, Indian data centre energy demand could increase sixfold by 2030. This rise poses a crucial question: can AI's promise of optimization coexist with sustainable energy use? This is the paradox of AI -- while it holds the potential to make industries more efficient, automate processes, and reduce carbon footprints across sectors, it is itself an energy-intensive technology. The training and deployment of AI models require vast computational resources, placing immense pressure on existing IT infrastructure. This duality highlights a critical inflection point for Information and Communications Technology (ICT) providers, who now have the responsibility of not only powering digital transformation but doing so sustainably.

ICT Providers at the Helm of Digital and Sustainable/Green Transformation

ICT providers have become central to digital transformation journeys across industries, enabling organizations to modernize operations, embrace cloud computing, and integrate AI into everyday business functions. However, as digital enablers, they also carry the responsibility of ensuring that the infrastructure they offer is aligned with sustainability goals. Today's ICT leaders must go beyond enabling connectivity and computational capabilities; they must become stewards of sustainable growth. From the design and location of data centres to the selection of power sources and energy-efficient hardware, every decision made by an ICT provider can tip the scales toward or away from environmental sustainability.

Key Strategies to Optimize Energy Use in Data Centres

AI-Driven Energy Management: Ironically, AI can help optimize its own footprint. Machine learning algorithms can be used to predict peak loads, dynamically allocate resources, and automate energy-efficient workload distribution across data centres.

Transition to Energy-Efficient Hardware: Replacing legacy servers with modern, AI-optimized processors -- such as GPUs, FPGAs, TPUs, and purpose-built ASICs -- can dramatically improve computational efficiency while reducing power consumption.
These components are designed to handle high workloads with lower latency and less energy input.

Adopt Advanced Cooling Techniques: Cooling systems account for nearly 40% of a data centre's energy usage. Newer solutions like liquid immersion cooling, direct-to-chip cooling, and hot/cold aisle containment are significantly more efficient than traditional air-based cooling. These systems reduce energy consumption and also extend the lifespan of critical hardware.

Green Power Procurement: Sourcing renewable energy, either directly or through power purchase agreements (PPAs), can help ICT providers reduce the carbon intensity of their operations.

Optimize Data Centre Design and Siting: Locating data centres in naturally cool climates reduces the need for mechanical cooling, while modular and scalable designs can improve energy efficiency. Hyperscale data centres often benefit from economies of scale, enabling them to integrate sustainability into their operations from the ground up.

Promote Circular Economy Practices: By refurbishing servers, responsibly disposing of e-waste, and recycling rare earth metals from outdated equipment, ICT leaders can lower the environmental impact of their infrastructure and reduce resource extraction.

Greening the Digital Frontier

Despite the array of solutions, challenges remain -- from high upfront costs to inconsistent access to renewable energy and the delicate balance between performance and sustainability. But these obstacles also pave the way for innovation, collaboration, and bold leadership among ICT players. Emerging trends such as edge computing, decentralized AI, and federated learning may also help reduce the energy load on centralized data centres by distributing workloads closer to the source of data generation. Additionally, the use of digital twins and AI simulations can help ICT companies model energy performance and optimize system design. The way forward lies in embedding sustainability into the very architecture of digital transformation. With the right vision and commitment, ICT players can drive not only a smarter world but a greener one too.

(The author is Pinkesh Kotecha, MD and Chairman of Ishan Technologies, and the views expressed in this article are his own)
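The 8,760-ton figure cited at the top of this piece is exactly what simple arithmetic yields for a 1 MW facility running around the clock on coal-heavy "thermal" power. A minimal sketch, assuming an emission factor of 1.0 tonne of CO₂ per MWh (our assumption, roughly typical of coal-fired generation and evidently the factor behind the quoted number):

```python
# Reproducing the "1 MW data centre ≈ 8,760 t CO₂ per year" figure.
# The emission factor is an assumption: ~1.0 t CO₂/MWh, roughly what
# coal-fired generation emits, and the value the quoted figure implies.

POWER_MW = 1.0
HOURS_PER_YEAR = 24 * 365          # 8,760 hours in a non-leap year
EMISSION_FACTOR_T_PER_MWH = 1.0    # assumed thermal-generation factor

annual_mwh = POWER_MW * HOURS_PER_YEAR             # 8,760 MWh
annual_tonnes = annual_mwh * EMISSION_FACTOR_T_PER_MWH
print(annual_tonnes)  # 8760.0 -- matching the cited 8,760 metric tons
```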
New research reveals AI's rapidly increasing energy demands, with estimates suggesting it could account for nearly half of data center power usage by the end of 2025. This surge in consumption raises significant environmental concerns and calls for greater transparency from AI companies.
Artificial Intelligence (AI) is rapidly becoming one of the most significant consumers of energy in the tech sector. Recent research indicates that AI's power demand could account for up to 20% of global data center electricity consumption, with projections suggesting this figure may double by the end of 2025 [1][4]. This surge in energy use is primarily driven by the widespread adoption of large language models (LLMs) like ChatGPT, which has an estimated 400 million weekly users [5].
Researchers have attempted to quantify the energy consumption of various AI tasks:
- A single text response from a large language model takes anywhere from 114 to 6,706 joules once cooling and other overhead are included, depending on model size [2][6].
- Generating a standard 1024 x 1024 image takes around 2,282 joules including overhead [8].
- Generating a five-second video consumes about 3.4 million joules, equivalent to running a microwave for over an hour [2][3][6].
- A typical AI user session, including 15 text queries, 10 image generations, and 3 video creations, could consume about 2.9 kWh of electricity, equivalent to running a microwave for three and a half hours [2][3].
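That session total follows directly from the per-task figures above. A minimal sketch of the arithmetic, assuming the text queries go to the larger (6,706-joule) model:

```python
# Reproducing the ~2.9 kWh "heavy user" session from the per-task figures
# above (cooling-inclusive numbers; text queries assumed to hit the larger,
# 6,706-joule model).

JOULES_PER_KWH = 3.6e6

session_joules = (
    15 * 6_706      # text questions
    + 10 * 2_282    # 1024 x 1024 image generations
    + 3 * 3.4e6     # five-second videos
)
print(f"{session_joules / JOULES_PER_KWH:.2f} kWh")  # -> 2.87 kWh, i.e. ~2.9
```

Note that video generation dominates: the three videos account for about 99% of the session's energy.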
The International Energy Agency (IEA) reports that data centers accounted for 1.5% of global energy use in 2024, consuming around 415 terawatt-hours [1]. This figure is expected to more than double by 2030, potentially surpassing 900 TWh [1]. AI is a primary driver of this increase, with estimates suggesting it could require almost as much energy as Japan uses today by the end of the decade [4].
The rapid growth in AI energy consumption poses significant environmental challenges:
- Data centers are projected to emit 2.5 billion tons of greenhouse gases by the end of the decade, roughly three times more than they would have without the generative AI boom [2].
- Much of the electricity powering data centers still comes from fossil fuels, and in the US data centers' carbon intensity is 48% higher than the national average [8].
- AI infrastructure also consumes vast amounts of water for cooling, straining supplies in already stretched regions [5].
A major issue in addressing AI's environmental impact is the lack of transparency from AI companies regarding their energy usage [4][5]. There are several reasons for this opacity:
- The technology is relatively new, and no standardized regulation or reporting framework yet exists for AI-specific energy use and emissions [5].
- AI is a fiercely competitive space, and sharing energy data could reveal details about a model's size, architecture, or efficiency [5].
- AI workloads are spread across vast server farms and shared infrastructure, making energy use technically difficult to isolate and track [5].
- Companies invested in an optimistic AI narrative do not want to be linked to sky-high emissions or the consumption of finite resources [5].
This lack of transparency makes it challenging for researchers and policymakers to accurately assess and address the environmental impact of AI systems [5].
Despite these challenges, there are growing efforts to make AI more environmentally accountable:
- The Green Software Foundation's Green AI Committee is developing AI-specific sustainability standards, including lifecycle carbon accounting, open-source energy-tracking tools, and real-time carbon intensity metrics [5].
- The EU AI Act requires companies to disclose the energy consumed in training a model, while the UK's AI Opportunities Action Plan and the British Standards Institution are creating guidance on measuring and reporting AI's carbon footprint [4][5].
- Some AI companies are investing in renewable energy, more efficient training methods, and improved cooling infrastructure [5].
However, these initiatives are not yet standard across the industry, and there remains a pressing need for greater transparency and regulation to ensure the sustainable development of AI technology.