Curated by THEOUTPOST
On Tue, 15 Oct, 12:02 AM UTC
4 Sources
[1]
AI datacenters are keeping coal-fired power stations busy
More evidence has emerged that AI-driven demand for energy to power datacenters is prolonging the life of coal-fired plants in the US. With AI still the hot new trend, demand for compute is driving the growth of bit barn capacity, along with the need for ever more energy to power it all. This growth is having some unintended effects, at least in America.

In Omaha, one power company has had to abandon plans to stop burning coal to produce electricity because of the need to serve demand from nearby datacenters, according to The Washington Post, which singles out Google and Meta in particular. The paper claims that rising energy demands from those facilities mean two coal-burning generators at the North Omaha power plant cannot be decommissioned without risking a power shortage for the district. This is despite previous undertakings from the operator, the Omaha Public Power District, that it would cease burning coal to improve air quality in the surrounding area. Keeping the generators running means continuing to endanger public health as well as spewing out greenhouse gas emissions.

This isn't an isolated incident. The Register reported earlier this year that coal-generated energy was being given a reprieve in the US thanks to the growing energy demands of datacenters, not helped by developers determined to throw as many high-performance servers packed with energy-guzzling GPUs as possible at training their generative AI models. This growth has caught out many utility companies, which have until now faced flat or shrinking demand in the US market and planned their infrastructure investments accordingly. A report published last week by management consultants Bain & Company warned that unless utilities adapt quickly to expand generation and supply capacity by up to 26 percent, energy use might outstrip supply within a couple of years.
It also warned that datacenter operators might seek alternative sources of power if utility companies don't move quickly enough. This already seems to be happening, judging by various reports of tech companies showing interest in nuclear power. Oracle is one of the more recent examples, disclosing during its latest earnings call that it has secured building permits for a trio of small modular reactors (SMRs) to power a bit barn with over a gigawatt of AI compute capacity. Earlier this year, Amazon Web Services (AWS) revealed that it had taken possession of a datacenter campus owned by Talen Energy, built next to Talen's Susquehanna nuclear power plant in northeast Pennsylvania. Last month, Microsoft announced a deal that will see the Three Mile Island nuclear plant come back online, thanks to a 20-year power purchase agreement (PPA) with Constellation Energy, which owns the facility.

However, some of these projects are likely to be long-term bets, with one analyst telling us a while back that the most optimistic time frame for deployment of technology such as SMRs in the United States is by 2030. If cleaner sources of energy cannot be found, power companies will likely continue to rely on fossil fuels such as coal and gas to meet growing demand from bit barns and other applications such as electric vehicles. Financial services giant Morgan Stanley last month published its own report on the datacenter industry, warning that global greenhouse gas emissions between now and the end of the decade are likely to be three times higher than they would have been had generative AI never been developed. ®
[2]
The next winner of the AI boom is also one of its biggest problems
Despite the environmental toll, experts say data centers and other artificial intelligence infrastructure will be the winners of AI's next phase as companies seek to power their growing AI offerings. While AI chips developed by companies such as Nvidia and AMD are crucial to the current phase of AI development, the broader data center industry is "very well positioned" to be at the center of the next phase of AI expansion, Tejas Dessai, director of research at Global X, told Quartz.

"Where I think you have the highest probability of naming winners is in the picks and shovels category -- who's building the infrastructure that's going to power all this," Rowan Trollope, chief executive of data platform Redis, told Quartz. "No matter who, what app, or what model wins, we sit in the middle and we make them all better."

As companies deploy more AI clusters, there could be "an uptake in memory solutions, in storage solutions, in networking solutions," Dessai said, followed by demand for companies specializing in comprehensive data center solutions. "Currently, the principle that many of these companies are operating with is: The more the number of chips that we can put together in terms of training these models, the smarter and smarter the outcome is that we can get out of these models," Dessai said in a separate interview with Quartz.

But he says there are still "a lot of physical constraints" on GPU clusters, which run on hundreds of thousands of chips. And data centers can take years to come online, meaning the industry is still "running behind" when it comes to having enough capacity for AI workloads. "We're still very early in that cycle and you see companies like OpenAI trying to get access to data center capacity from anywhere and everywhere that they can find it," Dessai said. Sarah Friar, chief financial officer at OpenAI, reportedly told the startup's shareholders that its partner and investor, Microsoft, was too slow at supplying it with enough computing power.
After the company completed its $6.6 billion funding round, the startup's leaders told some employees that it would start leaning less on Microsoft for data centers and AI chips, according to The Information, which cited unnamed people familiar with the matter.

In the short to medium term, Dessai said he sees data centers remaining attractive to companies as construction increases and vacancy rates reach "an all-time low." "Companies are really wanting to buy pretty much any capacity that they can get their hands on," Dessai said. "Data is the big winning category," Trollope said. "We don't know who in the data category is going to win, but I think the incumbents have a really good shot."

Microsoft, which set a goal in 2020 to be "carbon negative" by the end of the decade, said in May that its carbon emissions are almost 31% higher than when it set the commitment. The increase was mostly due to building data centers, it said, as well as hardware like semiconductors and servers. Meanwhile, Google said in July that its carbon emissions have risen by 48% since 2019, and were up 13% year-over-year in 2023 -- mostly because of data center energy consumption. The tech giant set a goal in 2021 to reach net-zero emissions across its operations and value chain by 2030.

By the end of the decade, data centers could consume up to 9% of electricity in the U.S. -- more than double what they use now, according to the Electric Power Research Institute. In April, Ami Badani, chief marketing officer of British chip designer Arm, said data centers powering AI chatbots such as OpenAI's ChatGPT account for 2% of global electricity consumption. One query on ChatGPT needs almost 10 times as much electricity as a Google search, according to a study by Goldman Sachs. That level of energy demand, Badani said, could eventually slow down AI progress. Tech giants are seemingly taking note of this obstacle.
In July, the Wall Street Journal reported that a third of the U.S.'s nuclear power plants were discussing deals with tech companies to supply electricity for data centers. "We have to make up for this energy deficit in one way or another," Dessai said. "We can't burn more coal, we can't use more fossil fuels. So naturally nuclear energy becomes the answer to it." As big tech moves toward nuclear energy, "we'll continue to see more deals in that direction," he said.

Earlier this week, Google announced that it was signing "the world's first corporate agreement to purchase nuclear energy" from small modular reactors (SMRs) developed by California-based Kairos Power. Google said it expects to bring Kairos Power's first SMR online by the end of the decade, with others planned through 2035. Through the deal, 500 megawatts (MW) of 24/7 carbon-free power will be made available to U.S. electricity grids. Amazon also signed agreements this week "to support the development of nuclear energy projects," including by building "several" SMRs, which have "a smaller physical footprint, allowing them to be built closer to the grid," the company said. Compared to traditional reactors, SMRs can come online faster thanks to shorter construction times, according to Amazon.

In September, Constellation Energy, which operates the largest fleet of nuclear plants in the country, announced a 20-year power purchase agreement with Microsoft. The deal will restart the Unit 1 reactor at Three Mile Island and launch the Crane Clean Energy Center. The CCEC, which is expected to come online by 2028, will add more than 800 MW of carbon-free electricity to the power grid, according to a study by the Pennsylvania Building and Construction Trades Council.
[3]
Plan for AI data center power usage or face the consequences, energy companies told
As US energy companies grapple with the challenge of supplying enough power to meet growing demand from AI data centers, a report from Bain & Company has warned that power use could soon exceed actual supply. The report forecasts that by 2028, utility companies will need to increase annual generation by as much as 26% to keep up with demand.

The concerning outlook raises questions about the true eco-credentials of the time-saving and productivity-enhancing technology, which could soon need to rely on dirtier, more abundant energy sources. Indicative of the scale of the problem, capital expenditure at these data centers is anticipated to rise nearly 30% this year alone.

A separate study (via The Register) by Rystad Energy, a research and business intelligence company, found that US data center power consumption could more than double by the end of the decade. Historically, US energy generation has run several hundred terawatt hours above consumption. Over the next four years, however, even Rystad's low-end demand scenario could outpace current generation, while the high-end scenario sees demand rising by as much as 1,000 TWh, from 4,000 to 5,000 TWh -- a significant jump.

By 2028, Bain & Company reckons that data centers will account for more than two-fifths (44%) of the growth in US energy consumption, with residential applications accounting for around one-quarter (27%); manufacturing (17%) and commercial (13%) make up smaller shares. Adding to the complexity, other sectors such as electric vehicles and manufacturing repatriated amid geopolitical tensions are also set to drive up energy demand in the coming years. Subsequently, the consulting firm warns that failing to act by modernizing business operations and infrastructure could cost companies substantial revenues, and could even force datacenter companies to generate their own energy.
And with US utility companies accustomed to flat or even shrinking demand, the spending needed to meet global data center energy demand -- which could top $2 trillion -- represents a significant growth opportunity for those prepared to make the changes.
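The Rystad crossover scenario above can be roughed out numerically. A minimal sketch follows; only the 4,000 to 5,000 TWh demand endpoints come from the article, while the flat generation figure and the linear interpolation are illustrative assumptions:

```python
# Rough sketch of the Rystad supply/demand scenario described above.
# Only the 4,000 -> 5,000 TWh demand endpoints come from the article;
# the flat generation figure below is a hypothetical assumption.
GENERATION_TWH = 4300        # assumed US generation, held flat (hypothetical)
DEMAND_START_TWH = 4000      # article's current-demand figure
DEMAND_END_TWH = 5000        # article's high-end end-of-decade scenario

years = list(range(2024, 2031))
step = (DEMAND_END_TWH - DEMAND_START_TWH) / (len(years) - 1)

crossover_year = None
for i, year in enumerate(years):
    demand = DEMAND_START_TWH + i * step
    # Record the first year projected demand exceeds assumed generation.
    if crossover_year is None and demand > GENERATION_TWH:
        crossover_year = year
    print(f"{year}: demand ~{demand:,.0f} TWh")

print(f"Demand first exceeds assumed generation in {crossover_year}")
```

Under these assumptions the crossover lands mid-decade, which is consistent with the article's point that even the low-end demand path could outpace current generation within about four years.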
[4]
Microsoft Azure CTO claims distribution of AI training is needed as AI datacenters approach power grid limits
The rapid expansion of generative AI models requires ever more powerful hardware, and with the rise of datacenters housing hundreds of thousands of AI GPUs, these facilities are quickly pushing the limits of current datacenter infrastructure -- and could soon hit the limits of the power grid. While AWS, Microsoft, and Oracle plan to use nuclear power plants to power their datacenters, Microsoft Azure's chief technology officer, Mark Russinovich, suggests that connecting multiple datacenters may soon be necessary to train advanced AI models, reports Semafor.

Modern AI datacenters, such as those built by Elon Musk's companies Tesla and xAI, can house 100,000 Nvidia H100 or H200 GPUs, and as American giants compete to train the industry's best AI models, they are going to need even more AI processors working in concert as a unified system. As a consequence, datacenters are becoming ever more power hungry, due to the increased number of processors, the higher power consumption of each processor, and the power required for cooling. Datacenters consuming multiple gigawatts of power could soon become a reality.

But the U.S. energy grid is already under strain, especially during periods of high demand such as hot summer days, and there are concerns that it may not be able to keep up. To address these challenges, Microsoft is making significant investments in energy infrastructure. The company recently signed a deal to reopen the Three Mile Island nuclear power plant to secure a more stable energy supply, and before that it had invested tens of billions of dollars in the development of AI infrastructure. But that may not be enough, and at some point huge companies will have to connect multiple datacenters to train their most sophisticated models, says the Microsoft Azure CTO. "I think it is inevitable, especially when you get to the kind of scale that these things are getting to," Russinovich told Semafor.
"In some cases, that might be the only feasible way to train them is to go across datacenters, or even across regions. [...] I do not think we are too far away." On paper, this approach would relieve the growing strain on power grids and sidestep the technical limits of training in a single location. In practice, the strategy comes with major technical challenges, particularly in keeping datacenters synchronized while maintaining the high communication speeds that effective AI training requires. Communication between thousands of AI processors within a single datacenter is already a challenge, and spreading the process across multiple sites only adds complexity. Advances in fiber optic technology have made long-distance data transmission faster, but managing training across multiple locations remains a significant hurdle. To mitigate these issues, Russinovich suggests that datacenters in a distributed system would need to be relatively close to one another. Implementing the multi-datacenter approach would also require collaboration across multiple teams within Microsoft and its partner OpenAI, which means decentralized AI training methods must be developed in-house.

There is a flip side to decentralized AI training methods: once developed, they offer a potential way to reduce reliance on the most advanced GPUs and the largest datacenters. That could lower the barrier to entry for smaller companies and individuals looking to train AI models without massive computational resources. Interestingly, Chinese researchers have reportedly already used decentralized methods to train AI models across multiple datacenters, although details are scarce.
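The synchronization problem Russinovich describes can be seen in miniature in synchronous data-parallel training: every site must exchange and average its gradients with every other site at each step before anyone can proceed. A toy sketch, with each "datacenter" reduced to a list of per-parameter gradients (all names and numbers are illustrative, not Microsoft's or anyone's actual method):

```python
# Toy illustration of why cross-datacenter training is communication-bound:
# in synchronous data parallelism, every site must average its gradients
# with every other site each step before the next step can begin.
# All names and numbers are illustrative.

def all_reduce_mean(site_grads):
    """Average per-parameter gradients across all sites -- the exchange
    that must cross the wide-area link when sites are separate
    datacenters, and therefore pays the full network latency per step."""
    n_sites = len(site_grads)
    n_params = len(site_grads[0])
    return [sum(g[i] for g in site_grads) / n_sites for i in range(n_params)]

# Two "datacenters", each holding local gradients for a 3-parameter model.
site_a = [0.2, -0.4, 0.1]
site_b = [0.4, -0.2, 0.3]

# Both sites must apply the same averaged update to stay in lockstep.
synced = all_reduce_mean([site_a, site_b])
print(synced)
```

Inside one datacenter this averaging runs over fast local interconnects; stretching it across regions is where latency and bandwidth bite, which is why Russinovich expects participating datacenters to sit relatively close together.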
The rapid growth of AI is straining power grids and prolonging the use of coal-fired plants. Tech giants are exploring nuclear energy and distributed computing as potential solutions.
The rapid expansion of artificial intelligence (AI) is creating an unprecedented demand for energy, pushing data centers and power grids to their limits. As tech giants race to develop more advanced AI models, the energy consumption of data centers is skyrocketing, with potentially severe environmental consequences [1].
In a surprising turn of events, the surge in energy demand from AI data centers is prolonging the life of coal-fired power plants in the United States. In Omaha, plans to decommission coal-burning generators have been abandoned due to the need to serve nearby data centers, particularly those operated by Google and Meta [1].
The energy consumption of data centers is expected to grow dramatically in the coming years. To address the growing energy crisis, major tech companies are turning to nuclear power. Microsoft Azure's CTO, Mark Russinovich, suggests that connecting multiple data centers may soon be necessary to train advanced AI models. Meanwhile, the AI boom is raising serious environmental concerns.
As the AI industry continues to grow, balancing technological advancement with environmental responsibility remains a critical challenge for tech companies and policymakers alike.
Reference
[1]
The rapid growth of artificial intelligence is causing a surge in energy consumption by data centers, challenging sustainability goals and straining power grids. This trend is raising concerns about the environmental impact of AI and the tech industry's ability to balance innovation with eco-friendly practices.
© 2025 TheOutpost.AI All rights reserved