Curated by THEOUTPOST
On Sun, 14 Jul, 4:00 PM UTC
4 Sources
[1]
AI's insatiable energy use drives electricity demands - ET Telecom
A few weeks ago, I joined a small group of reporters for a wide-ranging conversation with Bill Gates about climate change, its causes and potential solutions. When the topic turned to the issue of just how much energy artificial intelligence was using, Gates was surprisingly sanguine. "Let's not go overboard on this," he said during a media briefing on the sidelines of an event he was hosting in London.

AI data centres represent a relatively small additional load on the grid, Gates said. What's more, he predicted that insights gleaned from AI would deliver gains in efficiency that would more than make up for that additional demand. In short, Gates said, the stunning rise of AI will not stand in the way of combating climate change. "It's not like, 'Oh, no, we can't do it because we're addicted to doing chat sessions,'" he said.

That's an upbeat assessment from a billionaire with a vested interest in the matter. Gates is a big-time climate investor; he is also the former head of Microsoft and remains a major stockholder in the company, which is at the center of the AI revolution. And while it's too early to draw a definitive conclusion on the issue, a few things are already clear: AI is having a profound impact on energy demand around the world, it's often leading to an uptick in planet-warming emissions, and there's no end in sight.

AI data centres have a big appetite for electricity. The so-called graphics processing units, or GPUs, used to train large language models and respond to ChatGPT queries require more energy than your average microchip and give off more heat. With more data centres coming online almost every week, projections of how much energy will be required to power the AI boom are soaring. One peer-reviewed study suggested AI could make up 0.5% of worldwide electricity use by 2027, or roughly what Argentina uses in a year. Analysts at Wells Fargo suggested that U.S. electricity demand could jump 20% by 2030, driven in part by AI. And Goldman Sachs predicted that data centres would account for 8% of U.S. energy usage in 2030, up from just 3% today.

"It's truly astronomical potential load growth," said Ben Inskeep, the program director at Citizens Action Coalition, a consumer watchdog group based in Indiana that is tracking the energy impact of data centres. Microsoft, Google, Amazon and Meta have all recently announced plans to build new data centres in Indiana, developments that Inskeep said would strain the grid. "We don't have enough power to meet the projected needs of data centres over the next five to 10 years," he said. "We would need a massive build-out of additional resources."

Tech giants are scrambling to get a grip on their energy usage. For a decade now, those same four companies have been at the forefront of corporate efforts to embrace sustainability. But in a matter of months, the energy demands of AI have complicated that narrative. Google's emissions last year were 50% higher than in 2019, largely because of data centres and the rise of AI. Microsoft's emissions also jumped for the same reasons, up 29% last year from 2020. And Meta's emissions jumped 66% from 2021 to 2023.
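As a sanity check on that Argentina comparison, here is a minimal sketch in Python. The world and Argentina consumption figures are outside ballpark estimates, not numbers from the article.

```python
# Sanity check: does 0.5% of world electricity roughly equal Argentina's use?
# Both inputs are outside ballpark figures, not from the article:
#   world electricity consumption ~26,000 TWh/yr; Argentina ~130 TWh/yr.

WORLD_TWH = 26_000      # approximate global electricity consumption
ARGENTINA_TWH = 130     # approximate Argentine electricity consumption

ai_share_twh = 0.005 * WORLD_TWH   # the study's 0.5% projection for AI
print(f"0.5% of world use: {ai_share_twh:.0f} TWh/yr")
print(f"Argentina's use:   {ARGENTINA_TWH} TWh/yr")
# Both land around 130 TWh/yr, so the comparison holds.
```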
In statements, Google and Microsoft both said that AI would ultimately prove crucial to addressing the climate crisis, and that they were working to reduce their carbon footprints and bring more clean energy online. Amazon pointed to a statement detailing its sustainability efforts.

There are two ways for tech companies to meet the demand: tap the existing grid, or build new power plants. Each poses its own challenges. In West Virginia, coal-fired power plants that had been scheduled to retire are being kept online to meet the energy needs of new data centres across the border in Virginia. And across the country, utilities are building new natural-gas infrastructure to support data centres. Goldman Sachs anticipates that "incremental data center power consumption in the U.S. will drive around 3.3 billion cubic feet per day of new natural gas demand by 2030, which will require new pipeline capacity to be built."

At the same time, the tech giants are working to secure a lot more power to fuel the growth of AI. Microsoft is working on a $10 billion plan to develop renewable energy to power data centres. Amazon has said it used 100% clean energy last year, though experts have questioned whether the company's accounting was too lenient. All that new low-carbon power is great. But when the tech companies themselves are consuming that electricity to power new AI data centres, pushing up energy demand, it isn't making the grid overall any cleaner.

The energy demands of AI are only getting more intense. Microsoft and OpenAI are reportedly planning to build a $100 billion data centre. Initial reporting suggests it may require 5 gigawatts of power, roughly the equivalent of five nuclear reactors. And at the same time that companies are building more data centres, many of the chips at the heart of the AI revolution are getting more and more power hungry. Nvidia, the leader in AI chips, recently unveiled new products that will draw substantially more energy from the grid.

The AI boom is generating big profits for some companies. And it may yet deliver breakthroughs that help reduce emissions. But, at least for now, data centres are doing more harm than good for the climate. "It's definitely very concerning as we're trying to transition our current grid to renewable energy," Inskeep said. "Adding a massive amount of new load on top of that poses a grave threat to that transition."
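To put the 5-gigawatt figure in perspective, here is a minimal back-of-the-envelope sketch. It assumes a typical large nuclear reactor supplies about 1 GW of continuous output and that such a campus would draw its full load around the clock; both are illustrative assumptions, not figures from the reporting.

```python
# Back-of-the-envelope: what a 5 GW data-centre campus implies.
# Assumptions (illustrative, not from the reporting):
#   a large nuclear reactor delivers ~1 GW of continuous output,
#   and the campus draws its full load around the clock.

HOURS_PER_YEAR = 8_760

def reactor_equivalents(load_gw: float, reactor_gw: float = 1.0) -> float:
    """Number of ~1 GW reactors needed to carry a continuous load."""
    return load_gw / reactor_gw

def annual_energy_twh(load_gw: float) -> float:
    """Annual energy in TWh for a continuous load in GW (GWh -> TWh)."""
    return load_gw * HOURS_PER_YEAR / 1_000

campus_gw = 5.0
print(f"Reactor equivalents: {reactor_equivalents(campus_gw):.0f}")   # -> 5
print(f"Annual energy:       {annual_energy_twh(campus_gw):.0f} TWh") # -> ~44
```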
[2]
AI's insatiable energy use drives electricity demands
[3]
How to make AI's data energy demands more sustainable - Fast Company
The artificial intelligence boom has had such a profound effect on big tech companies that their energy consumption, and with it their carbon emissions, have surged. The spectacular success of large language models such as ChatGPT has helped fuel this growth in energy demand. At 2.9 watt-hours per ChatGPT request, AI queries require about 10 times the electricity of traditional Google queries, according to the Electric Power Research Institute, a nonprofit research firm. Emerging AI capabilities such as audio and video generation are likely to add to this energy demand.

The energy needs of AI are shifting the calculus of energy companies. They are now exploring options that once seemed untenable, such as restarting a nuclear reactor at the Three Mile Island power plant, the site of the infamous 1979 accident. (The unit slated for restart is not the damaged reactor; it was shut down in 2019 for economic reasons.)

Data centers have grown continuously for decades, but the magnitude of growth in the still-young era of large language models has been exceptional. AI requires far more computational and data storage resources than the pre-AI rate of data center growth could supply.
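The arithmetic behind that per-query comparison is easy to sketch. The 2.9 Wh figure is EPRI's; the search-query figure is implied by the "about 10 times" ratio; the daily query volume below is a hypothetical placeholder, not a reported number.

```python
# Per-query energy comparison. The 2.9 Wh figure is EPRI's, quoted above;
# the search-query figure is implied by the "about 10 times" ratio.
# The daily query volume is a hypothetical placeholder for illustration.

AI_QUERY_WH = 2.9                    # Wh per ChatGPT request (EPRI)
SEARCH_QUERY_WH = AI_QUERY_WH / 10   # ~0.29 Wh, implied by the 10x ratio

def annual_gwh(queries_per_day: float, wh_per_query: float) -> float:
    """Annual energy in GWh for a given daily query volume."""
    return queries_per_day * wh_per_query * 365 / 1e9

daily = 100_000_000  # hypothetical: 100 million queries per day
print(f"AI queries:         {annual_gwh(daily, AI_QUERY_WH):.0f} GWh/yr")
print(f"Traditional search: {annual_gwh(daily, SEARCH_QUERY_WH):.0f} GWh/yr")
# -> roughly 106 vs 11 GWh/yr: same traffic, ~10x the energy when served by AI
```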
[4]
NVIDIA consensus suggests a lot of imminent AI energy is needed: Barclays - Investing.com
In a recent thematic investing report, Barclays analysts discussed the energy demands poised to accompany the rise of artificial intelligence (AI) technologies, with a particular focus on NVIDIA's (NASDAQ:NVDA) role in this landscape. According to the analysts, the projected energy needs tied to AI advancements underscore a crucial aspect of NVIDIA's market outlook.

Barclays' analysis indicates that data centers could consume more than 9% of current U.S. electricity demand by 2030, driven largely by AI power requirements. The "AI power baked into NVIDIA consensus" is one of the key factors behind this substantial energy forecast, the analysts noted.

The report also points out that while AI efficiency continues to improve with each new generation of GPUs, the size and complexity of AI models are growing at a rapid pace: the size of major large language models (LLMs) has been increasing approximately 3.5 times per year. Despite these efficiency gains, overall energy demand is set to rise as the scope of AI applications expands. Each new generation of GPUs, such as NVIDIA's Hopper and Blackwell series, is more energy-efficient, yet larger and more complex AI models require substantial computational power.

"Large language models (LLMs) require immense computational power for real-time performance," the report notes. "The computational demands of LLMs also translate into higher energy consumption as more and more memory, accelerators, and servers are required to fit, train, and infer from these models." "Organizations aiming to deploy LLMs for real-time inference must grapple with these challenges," Barclays added.

To illustrate the scale of this demand, Barclays projects that powering approximately 8 million GPUs will require around 14.5 gigawatts of power, translating to roughly 110 terawatt-hours (TWh) of energy per year. This forecast assumes an 85% average load factor. With about 70% of these GPUs expected to be deployed in the U.S. by the end of 2027, that equates to more than 10 gigawatts and 75 TWh of AI power and energy demand in the U.S. alone within the next three years.

"NVIDIA's market cap suggests this is just the start of AI power demand deployment," the analysts said. The chipmaker's ongoing development and deployment of GPUs are poised to drive significant increases in energy consumption across data centers. Moreover, data centers' reliance on grid electricity underscores the importance of addressing peak power demands: they operate continuously and need a steady, balanced power supply.

The report cites a notable statement from Sam Altman, CEO of OpenAI, at the Davos World Economic Forum: "We do need way more energy in the world than I think we thought we needed before...I think we still don't appreciate the energy needs of this technology."
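Those figures are internally consistent, as a short sketch shows. The per-GPU draw below is simply what the report's totals imply (roughly 1.8 kW per GPU, presumably including facility overhead), not a separately stated specification.

```python
# Reproducing Barclays' fleet-level arithmetic from the figures quoted above.
# The per-GPU draw is what the report's totals imply, not a stated spec.

GPUS = 8_000_000       # projected GPU fleet
FLEET_GW = 14.5        # projected power draw for that fleet
LOAD_FACTOR = 0.85     # average load factor assumed by the report
US_SHARE = 0.70        # share of GPUs in the U.S. by end of 2027
HOURS_PER_YEAR = 8_760

watts_per_gpu = FLEET_GW * 1e9 / GPUS                        # ~1,800 W each
fleet_twh = FLEET_GW * LOAD_FACTOR * HOURS_PER_YEAR / 1_000  # GWh -> TWh

print(f"Implied draw per GPU: {watts_per_gpu:,.0f} W")
print(f"Fleet energy:  {fleet_twh:.0f} TWh/yr (report: ~110 TWh)")
print(f"U.S. power:    {FLEET_GW * US_SHARE:.1f} GW (report: >10 GW)")
print(f"U.S. energy:   {fleet_twh * US_SHARE:.0f} TWh/yr (report: ~75 TWh)")
```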
The rapid advancement of artificial intelligence is driving unprecedented electricity demands, raising concerns about sustainability and the need for innovative solutions in the tech industry.
As artificial intelligence (AI) continues to evolve and expand, its insatiable hunger for energy has become a growing concern for the tech industry and environmentalists alike. The computational power required to train and run AI models is driving electricity demands to new heights, prompting questions about sustainability and the need for innovative solutions [1].
Recent estimates suggest that training a single AI model can consume as much electricity as 100 U.S. households use in an entire year. This staggering energy requirement is primarily due to the massive amount of data processing and complex calculations involved in AI operations [2].
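A rough check of that comparison, using two commonly cited outside estimates rather than figures from this story: roughly 1,300 MWh for a GPT-3-scale training run and roughly 10,500 kWh per year for an average U.S. household.

```python
# Rough check of the "100 households" comparison. Both inputs are commonly
# cited outside estimates, not figures from this story:
#   a GPT-3-scale training run: ~1,300 MWh (published academic estimate);
#   an average U.S. household:  ~10,500 kWh/yr (EIA ballpark).

TRAINING_MWH = 1_300
HOUSEHOLD_KWH_PER_YEAR = 10_500

households = TRAINING_MWH * 1_000 / HOUSEHOLD_KWH_PER_YEAR
print(f"One training run ~= {households:.0f} household-years of electricity")
# -> ~124 households: the same order of magnitude as the claim above
```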
At the heart of this energy consumption are data centers, which serve as the backbone of AI infrastructure. These facilities house thousands of servers and require extensive cooling systems to prevent overheating. As AI applications become more widespread, the demand for data center capacity is skyrocketing, further exacerbating the energy consumption issue [3].
Tech giants are not turning a blind eye to this challenge. Companies like Google, Microsoft, and Amazon are investing heavily in renewable energy sources to power their data centers. They are also exploring innovative cooling technologies and more energy-efficient hardware designs to mitigate the environmental impact of AI [3].
Hardware manufacturers, particularly those specializing in AI chips, are at the forefront of addressing this energy crisis. NVIDIA, a leader in AI hardware, is facing increasing pressure to develop more energy-efficient solutions. The company's future success may hinge on its ability to balance performance with power efficiency [4].
As AI's energy consumption continues to rise, regulatory bodies are beginning to take notice. There are growing calls for stricter energy efficiency standards for AI systems and data centers. The tech industry is now faced with the challenge of maintaining the pace of AI innovation while also addressing these pressing environmental concerns [1].
The rapid advancement of AI technology has brought tremendous benefits across various sectors, from healthcare to finance. However, the energy implications of this progress cannot be ignored. As the world grapples with climate change, finding a balance between technological advancement and environmental sustainability has become more critical than ever [2].
References
[1] AI's insatiable energy use drives electricity demands - ET Telecom
[2] AI's insatiable energy use drives electricity demands
[3] How to make AI's data energy demands more sustainable - Fast Company
[4] NVIDIA consensus suggests a lot of imminent AI energy is needed: Barclays - Investing.com
The rapid growth of artificial intelligence is causing a surge in energy consumption by data centers, challenging sustainability goals and straining power grids. This trend is raising concerns about the environmental impact of AI and the tech industry's ability to balance innovation with eco-friendly practices.
8 Sources
As artificial intelligence continues to advance, concerns grow about its energy consumption and environmental impact. This story explores the challenges and potential solutions in managing AI's carbon footprint.
5 Sources
The rapid growth of AI is straining power grids and prolonging the use of coal-fired plants. Tech giants are exploring nuclear energy and distributed computing as potential solutions.
4 Sources
Gartner forecasts that 40% of AI data centers will face operational constraints due to power shortages by 2027, as the rapid growth of AI and generative AI drives unprecedented increases in electricity consumption.
4 Sources
Chinese startup DeepSeek claims to have created an AI model that matches the performance of established rivals at a fraction of the cost and carbon footprint. However, experts warn that increased efficiency might lead to higher overall energy consumption due to the Jevons paradox.
5 Sources