3 Sources
[1]
The AI-energy apocalypse might be a little overblown
Even if AI turns out not to be as much of an energy hog as people are making it out to be, it could still spell trouble for power grids across the US. Tech companies are already burning through increasing amounts of electricity to train and run new AI models, and they're asking for a lot more as they try to outcompete each other. That rising demand is already starting to reshape the energy system, with utilities scrambling to build out new gas plants and pipelines.

But all these plans to reshape the US energy system could be based on an AI bubble. With overexcited investors pumping money into tech companies that are afraid of missing out but still at risk of developing AI tools that ultimately flop, utilities face a wave of speculation over data centers' energy needs. The uncertainty is unnerving considering the costs that Americans could wind up paying in higher utility bills and more pollution, a recent report warns. A transition to cleaner and more affordable energy sources has been slowly making progress in the US. That progress is in peril unless tech companies and utilities demand more transparency and opt for more renewables like solar and wind energy.

"While the AI boom provides exciting opportunities, there are many risks to not approaching energy needs with a deliberate and informed response that takes long term impacts into account," Kelly Poole, lead author of the report published this month by shareholder advocacy group As You Sow and environmental organization Sierra Club, said in a briefing with reporters.

The nation's fleet of gas-fired power plants would grow by nearly a third if all of the new gas projects proposed between January 2023 and January 2025, as the generative AI industry heated up, come to fruition. The amount of new gas capacity that utilities and independent developers proposed jumped by 70 percent during that time frame, driven in large part by rising data center electricity demand.
Prior to the generative AI boom, electricity demand had pretty much flatlined for more than a decade thanks to energy efficiency gains. But new data centers, souped up for AI, are a lot more energy-hungry than their predecessors. A rack of computers in a traditional data center might use 6 to 8 kilowatts of power -- roughly equivalent to the power used by three homes in the US, Dan Thompson, a principal research analyst at S&P Global, explained in the briefing. AI, however, requires more powerful computer chips to run more complicated tasks. One of those high-density racks draws about 80 to 100 homes' worth of power, or upward of 100 kilowatts, according to Thompson. "Essentially what you're looking at is a small town's worth of power being deployed," he said.

Why does that matter? Power grids basically function as a precarious balancing act. If power supply can't meet demand growth, it could lead to higher utility bills and potential outages. On the other hand, overbuilding new capacity risks creating stranded assets that utilities and their customers wind up paying for regardless of whether they actually need them in the long term. That's why it's so important to try to get an accurate forecast of future demand.

And while AI does use a lot of energy, projections for the future get murky. "Speculators are flooding the market," the report says, seeking to build and flip data centers. Trying to get ahead of long wait times to connect to the power grid, some of those speculators are requesting power before they've got the capital or customers lined up to ensure they can bring a project to the finish line. There could also be some double or triple counting (or more) going on in AI energy demand forecasts, because developers approach more than one utility to get several quotes.
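Thompson's homes-per-rack comparison can be roughly checked with a back-of-envelope calculation. This is a sketch, not a figure from the briefing: it assumes an average US household uses about 10,800 kWh of electricity per year (close to the EIA's published average, which varies by year and state).

```python
# Back-of-envelope check: how many average US homes does the continuous
# draw of a 100 kW high-density AI rack correspond to?
# Assumption (not from the article): ~10,800 kWh/year per household.
HOURS_PER_YEAR = 8760
avg_home_kwh_per_year = 10_800
avg_home_kw = avg_home_kwh_per_year / HOURS_PER_YEAR  # ~1.23 kW average draw

rack_kw = 100  # high-density AI rack, per the briefing
homes_equivalent = rack_kw / avg_home_kw
print(f"One 100 kW rack ~= {homes_equivalent:.0f} average homes")
```

The result lands at roughly 80 homes, consistent with the "80 to 100 homes" range quoted in the briefing.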
In the Southeast, a major hub for data centers, utilities are projecting as much as four times more demand growth than independent analyses of industry trends suggest, according to a report earlier this year from the Institute for Energy Economics and Financial Analysis (IEEFA). Nationally, utilities are preparing for 50 percent more demand growth than the tech industry is expecting, a separate report from December 2024 states. Utilities themselves have acknowledged this risk on recent earnings calls. Proposed projects trying to connect to the grid "may be overstated anywhere from three to five times what might actually materialize," Jim Burke, CEO of Texas-based Vistra Energy, said on a Q1 earnings call this year.

Despite the uncertainty, utilities are still building out new gas power plants and pipelines to meet that demand. After all, building new infrastructure is one of the most lucrative ways for a utility to increase profits. And right now, the Trump administration -- whose campaign was buoyed by oil and gas contributions -- is incentivizing reliance on fossil fuels. In Louisiana, for example, local utility Entergy proposed building three new gas plants to power a giant new Meta data center. The data center is estimated to consume as much electricity as 1.5 million homes and lead to 100 million tons of carbon emissions over 15 years.

It's a stark contrast from the Biden administration's goal of getting the power grid to run on 100 percent carbon pollution-free energy by 2035. The only way to stop climate change in its tracks is to get rid of planet-heating pollution from fossil fuels. Building a rush of new gas infrastructure obviously moves the nation in the opposite direction.

There are solutions to minimize all these risks, As You Sow and Sierra Club point out in their report. Utilities can require developers to disclose the number of other utilities they've brought their data center proposal to and how far along they are in finalizing a project.
When inking contracts, they can also require long-term service agreements, raise nonrefundable deposits, and increase fees for canceling a project. Tech companies clearly have a big role to play, too, by improving the energy efficiency of their technologies and investing in renewables. For years, tech giants including Amazon, Meta, and Google have been top corporate purchasers of renewable energy. Inking those kinds of long-term agreements to build out new solar and wind farms can have even more impact now, counteracting the Trump administration's rollback of financial incentives for renewables, if companies are willing to prioritize their own sustainability goals as much as their AI ambitions.
[2]
Will OpenAI Really Build 60 Football Fields Worth of AI Infrastructure Per Week?
Emily is an experienced reporter who covers cutting-edge tech, from AI and EVs to brain implants. She stays grounded by hiking and playing guitar.

OpenAI CEO Sam Altman has a lofty new vision to "create a factory that can produce a gigawatt of new AI infrastructure every week." His Tuesday blog post is light on details, but it's safe to say he's talking about finding ways to satisfy the company's never-ending need for more computing power for its latest products.

On the one hand, by using the word "factory," Altman may be suggesting he wants to create a slick manufacturing facility where robots assemble ultra-powerful GPUs day and night. OpenAI announced a partnership with Nvidia this week, which the companies are calling the "biggest AI infrastructure deployment in history." Nvidia will provide "millions" of GPUs to help "scale OpenAI's compute with multi-gigawatt data centers." On the other hand, it's also possible that Altman's factory might just be a data center, perhaps many of them. Nvidia CEO Jensen Huang often refers to data centers as AI factories, portraying them as the engine behind the next wave of industrialization. But in reality, they are mostly glorified equipment warehouses.

Altman admits that producing a gigawatt of new AI infrastructure every week is "extremely difficult [and will] take us years" to figure out. Financing is another hurdle to overcome. But for the sake of argument, let's say Altman could achieve what he's laid out. What would it look like to physically manifest a gigawatt of computing power every week? How much space would that take up?

60 Football Fields? The Math Is Not Looking Good

The newest, most advanced data centers are massive. An Amazon data center in Indiana is 1,200 acres, The New York Times reports, enough to fit 10 Malls of America. Meta CEO Mark Zuckerberg wants to build data centers large enough to cover most of Manhattan.
To estimate how much land OpenAI would need to build a one-gigawatt facility with today's technology, we can look at the acreage of its Texas data center, which is currently under development as part of the Stargate project. According to developer Crusoe, that project is "approximately 4 million square feet, and a total power capacity of 1.2 gigawatts (GW)." It will consume as much electricity as an entire city on its own. If OpenAI is getting 1.2 gigawatts out of 4 million square feet, then each gigawatt currently requires about 3.33 million square feet -- the equivalent of about 60 football fields. And just as every football field needs extra room around it for a snack stand and bleachers, data centers need some extra padding around the edge for things like parking and equipment storage (probably not for buying hot dogs).

Altman says the extra compute power is necessary to make "amazing things" possible, such as curing cancer or figuring out "how to provide customized tutoring to every student on Earth." If computing power is limited, "we'll have to choose which one to prioritize; no one wants to make that choice, so let's go build." Yet OpenAI published a study last week that found 73% of ChatGPT conversations are not about work, curing cancer, or education. They are mostly people seeking help with decisions in their personal lives.

More Land Than You Can Imagine, But Very Few Jobs

Given their size -- and horizontal versus vertical design -- the facilities are often built in rural areas with vast expanses of open land. But while major companies moving into a town often translates into new jobs, data centers don't need hundreds of humans to man a production line. Amazon's $20 billion data center investment in Northeast Pennsylvania, for example, will employ 1,250 people across dozens of facilities. By comparison, Ford's $5 billion EV factory announced in August will employ 4,000 between two plants in Kentucky and Michigan.
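The football-field math above can be reproduced in a few lines. This is a sketch of the arithmetic only; the field size (a standard American football field of 360 by 160 feet, end zones included) is an assumption on our part, not a figure from the article.

```python
# Reproduce the land-use estimate: square feet per gigawatt of data
# center capacity, expressed in football fields.
# Figures quoted from developer Crusoe: ~4,000,000 sq ft for 1.2 GW.
stargate_sqft = 4_000_000
stargate_gw = 1.2
sqft_per_gw = stargate_sqft / stargate_gw  # ~3.33 million sq ft

# Assumption: standard field incl. end zones is 360 ft x 160 ft.
field_sqft = 360 * 160  # 57,600 sq ft
fields_per_gw = sqft_per_gw / field_sqft
print(f"{sqft_per_gw:,.0f} sq ft per GW ~= {fields_per_gw:.0f} football fields")
```

The script yields roughly 58 fields per gigawatt, which the article rounds to "about 60."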
That's not to say that progress is not a noble goal. But the reality is that data centers are inefficient, guzzling water and consuming huge amounts of electricity. Local communities across the US are fighting back, as reported by many publications, including Futurism, NPR, CNET, and The Washington Post. There is no shortage of horror stories: a 24/7 hum coming from the buildings keeps Virginia residents up at night, Business Insider reports, and kitchen taps ran dry in Georgia after Meta built a facility nearby, The New York Times reports. Because the grid does not have enough power, some areas are seeing enormous spikes in utility bills -- up to 20% in the Northeast -- as residents compete with data centers. That might be why Altman also calls for quickening "new energy production" in his blog post.

For Big Tech, it's all worth it. Data centers are giant moneymakers, or "the literal key to increasing revenue," as Altman puts it. That's why there is no shortage of new buildouts announced every week. The Trump administration, meanwhile, is working to slash red tape to help tech companies build data centers faster.
[3]
Nvidia's and OpenAI's 'monumental' data center plan has an equally massive problem: Where to find the power
Nvidia CEO Jensen Huang told CNBC this week that the chipmaker's AI infrastructure plan with OpenAI is "monumental in size." Their plan is so big that it will push the boundaries of what is possible. The chipmaker and the AI lab are aiming to build at least 10 gigawatts of data centers. This will sap a massive amount of power at a time when the electric grid is already strained. Attempts to deploy more power have faced economic and political constraints that make a fast fix unlikely.

Ten gigawatts is roughly equivalent to the annual power consumption of 8 million U.S. households, according to a CNBC analysis of data from the Energy Information Administration. It is about the same amount of power as New York City's baseline peak summer demand in 2024, according to the New York Independent System Operator, the state electric grid operator. "There's never been an engineering project, a technical project of this complexity and this scale -- ever," Huang told CNBC on Monday.

Nvidia and OpenAI have provided no information on when and where the sites will be built, other than disclosing that the first gigawatt will come online in the second half of 2026. When CNBC reached out for more detail on Tuesday, Nvidia declined to comment.

It's unclear where all the electricity that the companies need will come from. The U.S. is forecast to add 63 gigawatts of power to the grid this year, according to EIA data. Nvidia's and OpenAI's 10 gigawatts of data centers are equivalent to a big chunk, 16%, of the new power that will be deployed in 2025. The Trump administration is pushing for data centers to use fossil fuels, particularly natural gas, but orders for new gas turbines face long wait times, with GE Vernova sold out through 2028. The U.S. is forecast to add just 4.4 gigawatts of new gas generation this year, according to the EIA. The tech sector and the White House are working to build new nuclear plants, but it will take years for reactors to connect to the grid.
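Both of CNBC's comparisons check out with simple arithmetic. The sketch below assumes a 10 GW fleet running around the clock and an average US household consumption of about 10,800 kWh per year; that household figure is our assumption (close to EIA averages, which vary by year), not a number from the article.

```python
# Check CNBC's comparisons for a 10 GW fleet of data centers
# running continuously.
HOURS_PER_YEAR = 8760
fleet_gw = 10
annual_twh = fleet_gw * HOURS_PER_YEAR / 1000  # 87.6 TWh/year

# Assumption: average US household uses ~10,800 kWh/year.
household_twh = 10_800 / 1e9
households_millions = annual_twh / household_twh / 1e6
print(f"~{households_millions:.1f} million households")

# Share of the ~63 GW of new capacity the EIA expects this year.
share = fleet_gw / 63
print(f"{share:.0%} of 2025 capacity additions")
```

The output lands at roughly 8 million households and 16% of this year's new capacity, matching the figures in the article.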
The recent big expansion at Plant Vogtle in Georgia took more than a decade to complete. And the small advanced reactors backed by the tech sector are not expected to reach a commercial stage until the end of the decade at the earliest.

This leaves renewable power as the most viable, quickly deployable source of electricity to meet the demand from Nvidia and OpenAI in the near term. More than 90% of the new power that the U.S. is expected to add this year will come from solar, wind, or battery storage, according to the EIA. "The power requirement is largely going to be coming from the new energy sector or not at all," said Kevin Smith, CEO of Arevon, a solar and battery storage developer headquartered in Scottsdale, Arizona, that's active in 17 states.

But the White House has effectively declared war on renewable power. President Donald Trump said last month that the federal government will not approve any more solar and wind. Interior Secretary Doug Burgum's office is now reviewing all permits for solar and wind projects. Even projects on private land could be hampered by the Trump administration, as such efforts often need permits from federal agencies like the U.S. Fish and Wildlife Service. Trump's tariffs, uncertainty over permitting, and the end of key tax credits will lead to a slowdown in renewable deployment in the coming years that could challenge data center deployment, Smith and executives at other big renewable developers warned CNBC last month.

"The panic in the data center, AI world is probably not going to set in for another 12 months or so, when they start realizing that they can't get the power they need in some of these areas where they're planning to build data centers," Smith told CNBC in August. "Then we'll see what happens," Smith said. "There may be a reversal in policy to try and build whatever we can and get power onto the grid."
As AI giants plan massive expansions in computing power, concerns arise about energy demand, infrastructure readiness, and environmental impact. The industry faces challenges in powering its ambitious projects sustainably.
As artificial intelligence (AI) continues to revolutionize various sectors, a new challenge emerges: the massive energy demand required to power AI infrastructure. Tech giants like OpenAI and Nvidia are planning unprecedented expansions in computing power, but questions arise about the feasibility and environmental impact of these ambitious projects.
OpenAI CEO Sam Altman recently unveiled a bold plan to "create a factory that can produce a gigawatt of new AI infrastructure every week" [2]. This vision, while ambitious, raises concerns about the practicality and environmental implications of such rapid expansion. To put this into perspective, a gigawatt of computing power would require approximately 3.33 million square feet of space -- equivalent to about 60 football fields [2].
Nvidia CEO Jensen Huang described their collaboration with OpenAI as "monumental in size," aiming to build at least 10 gigawatts of data centers [3]. This massive undertaking would consume power equivalent to the annual electricity usage of 8 million U.S. households, or New York City's baseline peak summer demand [3].
The rapid expansion of AI infrastructure poses significant challenges for the U.S. energy grid. While the country is forecast to add 63 gigawatts of power to the grid this year, Nvidia and OpenAI's plan alone would require 16% of this new capacity [3]. This demand comes at a time when the electric grid is already strained, and attempts to deploy more power face economic and political constraints.

Renewable energy sources appear to be the most viable option for meeting the short-term demand of AI infrastructure. More than 90% of new power additions in the U.S. are expected to come from solar, wind, or battery storage [3]. However, recent policy changes and uncertainties surrounding permitting and tax credits could slow down renewable deployment, potentially challenging data center expansion plans.
Some experts warn that current AI energy demand projections might be inflated. Utilities are preparing for 50% more demand growth than the tech industry is expecting, according to a recent report [1]. This uncertainty could lead to overbuilding of energy infrastructure, resulting in stranded assets and increased costs for consumers.

The rapid expansion of data centers is not without controversy. Local communities across the U.S. are pushing back against these developments due to concerns about water usage, noise pollution, and strain on local resources [2]. Some areas are seeing utility bill increases of up to 20% as residents compete with data centers for power [2].
As the AI industry continues to grow, finding a balance between technological innovation and sustainable energy use becomes crucial. Tech companies and utilities must work together to ensure accurate demand forecasting, promote transparency, and prioritize renewable energy sources. The future of AI development may well depend on our ability to power it responsibly and efficiently.
Summarized by Navi