3 Sources
[1]
Your AI tools run on fracked gas and bulldozed Texas land | TechCrunch
The AI era is giving fracking a second act, a surprising twist for an industry that, even during its early 2010s boom years, was blamed by climate advocates for poisoned water tables, man-made earthquakes, and the stubborn persistence of fossil fuels.

AI companies are building massive data centers near major gas-production sites, often generating their own power by tapping directly into fossil fuels. It's a trend that's been overshadowed by headlines about the intersection of AI and healthcare (and solving climate change), but it's one that could reshape -- and raise difficult questions for -- the communities that host these facilities.

Take the latest example. This week, the Wall Street Journal reported that AI coding assistant startup Poolside is constructing a data center complex on more than 500 acres in West Texas -- about 300 miles west of Dallas -- a footprint two-thirds the size of Central Park. The facility will generate its own power by tapping natural gas from the Permian Basin, the nation's most productive oil and gas field, where hydraulic fracturing isn't just common but really the only game in town.

The project, dubbed Horizon, will produce two gigawatts of computing power. That's equivalent to the Hoover Dam's entire electric capacity, except instead of harnessing the Colorado River, it's burning fracked gas. Poolside is developing the facility with CoreWeave, a cloud computing company that rents out access to Nvidia AI chips and that's supplying access to more than 40,000 of them. The Journal calls it an "energy Wild West," which seems apt.

Yet Poolside is far from alone. Nearly all the major AI players are pursuing similar strategies. Last month, OpenAI CEO Sam Altman toured his company's flagship Stargate data center in Abilene, Texas -- around 200 miles from the Permian Basin -- where he was candid, saying, "We're burning gas to run this data center." The complex requires about 900 megawatts of electricity across eight buildings and includes a new gas-fired power plant using turbines similar to those that power warships, according to the Associated Press.

The companies say the plant provides only backup power, with most electricity coming from the local grid. That grid, for the record, draws from a mix of natural gas and the sprawling wind and solar farms in West Texas. But the people living near these projects aren't exactly comforted.

Arlene Mendler lives across the street from Stargate. She told the AP she wishes someone had asked her opinion before bulldozers eliminated a huge tract of mesquite shrubland to make room for what's being built atop it. "It has completely changed the way we were living," Mendler told the AP. She moved to the area 33 years ago seeking "peace, quiet, tranquility." Now construction is the soundtrack in the background, and bright lights on the scene have spoiled her nighttime views.

Then there's the water. In drought-prone West Texas, locals are particularly nervous about how new data centers will impact the water supply. The city's reservoirs were at roughly half-capacity during Altman's visit, with residents on a twice-weekly outdoor watering schedule. Oracle claims each of the eight buildings will need just 12,000 gallons per year after an initial million-gallon fill for closed-loop cooling systems. But Shaolei Ren, a University of California, Riverside professor who studies AI's environmental footprint, told the AP that's misleading.
These systems require more electricity, which means more indirect water consumption at the power plants generating that electricity.

Meta is pursuing a similar strategy. In Richland Parish, the poorest region of Louisiana, the company plans to build a $10 billion data center the size of 1,700 football fields that will require two gigawatts of power for computation alone. Utility company Entergy will spend $3.2 billion to build three large natural-gas power plants with 2.3 gigawatts of capacity to feed the facility by burning gas extracted through fracking in the nearby Haynesville Shale. Louisiana residents, like those in Abilene, aren't thrilled to be encircled by bulldozers around the clock.

(Meta is also building in Texas, though elsewhere in the state. This week the company announced a $1.5 billion data center in El Paso, near the New Mexico border, with one gigawatt of capacity expected online in 2028. El Paso isn't near the Permian Basin, and Meta says the facility will be matched with 100% clean and renewable energy. One point for Meta.)

Even Elon Musk's xAI, whose Memphis facility has generated considerable controversy this year, has fracking connections. Memphis Light, Gas and Water -- which currently sells power to xAI but will eventually own the substations xAI is building -- purchases natural gas on the spot market and pipes it to Memphis via two companies: Texas Gas Transmission Corp. and Trunkline Gas Company. Texas Gas Transmission is a bidirectional pipeline carrying natural gas from Gulf Coast supply areas and several major hydraulically fractured shale formations through Arkansas, Mississippi, Kentucky, and Tennessee. Trunkline Gas Company, the other Memphis supplier, also carries natural gas from fracked sources.

If you're wondering why AI companies are pursuing this path, they'll tell you it's not just about electricity; it's also about beating China. That was the argument Chris Lehane made last week. Lehane, a veteran political operative who joined OpenAI as vice president of global affairs in 2024, laid out the case during an on-stage interview with TechCrunch.

"We believe that in the not-too-distant future, at least in the U.S., and really around the world, we are going to need to be generating in the neighborhood of a gigawatt of energy a week," Lehane said. He pointed to China's massive energy buildout: 450 gigawatts and 33 nuclear facilities constructed in the last year alone.

When TechCrunch asked about Stargate's decision to build in economically challenged areas like Abilene, or Lordstown, Ohio, where more gas-powered plants are planned, Lehane returned to geopolitics. "If we [as a country] do this right, you have an opportunity to re-industrialize countries, bring manufacturing back and also transition our energy systems so that we do the modernization that needs to take place."

The Trump administration is certainly on board. The July 2025 executive order fast-tracks gas-powered AI data centers by streamlining environmental permits, offering financial incentives, and opening federal lands for projects using natural gas, coal, or nuclear power -- while explicitly excluding renewables from support.

For now, most AI users remain largely unaware of the carbon footprint behind their dazzling new toys and work tools. They're more focused on capabilities like Sora 2 -- OpenAI's hyperrealistic video-generation product that requires exponentially more energy than a simple chatbot -- than on where the electricity comes from. The companies are counting on this.
They've positioned natural gas as the pragmatic, inevitable answer to AI's exploding power demands. But the speed and scale of this fossil fuel buildout deserves more attention than it's getting.

If this is a bubble, it won't be pretty. The AI sector has become a circular firing squad of dependencies: OpenAI needs Microsoft needs Nvidia needs Broadcom needs Oracle needs data center operators who need OpenAI. They're all buying from and selling to each other in a self-reinforcing loop. The Financial Times noted this week that if the foundation cracks, there'll be a lot of expensive infrastructure left standing around, both the digital and the gas-burning kind. OpenAI's ability to meet its obligations is, on its own, "increasingly a concern for the wider economy," the outlet wrote.

One key question that's been largely absent from the conversation is whether all this new capacity is even necessary. A Duke University study found that utilities typically use only 53% of their available capacity throughout the year. That suggests significant room to accommodate new demand without constructing new power plants, as MIT Technology Review reported earlier this year. The Duke researchers estimate that if data centers reduced electricity consumption by roughly half for just a few hours during annual peak demand periods, utilities could handle an additional 76 gigawatts of new load. That would effectively absorb the 65 gigawatts data centers are projected to need by 2029. (A rough back-of-envelope sketch of this arithmetic follows the article.)

That kind of flexibility would allow companies to launch AI data centers faster. More importantly, it could provide a reprieve from the rush to build natural gas infrastructure, giving utilities time to develop cleaner alternatives.

But again, that would mean losing ground to an autocratic regime, per Lehane and many others in the industry, so instead, the natural gas building spree appears likely to saddle regions with more fossil-fuel plants and leave residents with soaring electricity bills to finance today's investments, including long after the tech companies' contracts expire. Meta, for instance, has guaranteed it will cover Entergy's costs for the new Louisiana generation for 15 years. Poolside's lease with CoreWeave runs for 15 years. What happens to customers when those contracts end remains an open question.

Things may eventually change. A lot of private money is being funneled into small modular reactors and solar installations with the expectation that these cleaner alternatives will become more central sources of power for these data centers. Fusion startups like Helion and Commonwealth Fusion Systems have similarly raised substantial funding from those on the front lines of AI, including Nvidia and Altman. This optimism isn't confined to private investment circles. The excitement has spilled over into public markets, where several "non-revenue-generating" energy companies that have managed to go public carry truly anticipatory market caps, based on the expectation that they will one day fuel these data centers.

In the meantime -- which could still be decades -- the most pressing concern is that the people who'll be left holding the bag, financially and environmentally, never asked for any of this in the first place.
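To make the Duke flexibility argument concrete, here is a minimal back-of-envelope sketch in Python. It is not the study's model or data: the 80 GW capacity, the synthetic demand curve, and the 12 GW of new data-center load are invented numbers chosen only to show the mechanism, namely that an always-on grid addition is sized against the year's single peak hour, while a flexible one that sheds part of its draw in those few hours can fit under existing capacity.

# Illustrative only: a toy version of the peak-curtailment idea, not the Duke study's method.
import numpy as np

HOURS = 8760                    # hours in a year
CAPACITY_GW = 80.0              # hypothetical regional generation capacity
NEW_DC_LOAD_GW = 12.0           # hypothetical new data-center load, normally always-on
CURTAIL_FRACTION = 0.5          # share of that load shed during peak hours

# Synthetic hourly demand: a daily cycle under a seasonal envelope, sitting well
# below capacity most of the year but peaking close to it.
t = np.arange(HOURS)
demand = 53.0 + 18.0 * np.sin(2 * np.pi * t / 24) ** 2 * np.sin(np.pi * t / HOURS) ** 4

inflexible = demand + NEW_DC_LOAD_GW              # data center runs flat out all year
breach = inflexible > CAPACITY_GW                 # hours that would exceed capacity
flexible = np.where(breach,
                    demand + (1 - CURTAIL_FRACTION) * NEW_DC_LOAD_GW,
                    inflexible)                   # shed half the load only in those hours

print(f"peak demand without new load: {demand.max():5.1f} GW (capacity {CAPACITY_GW} GW)")
print(f"peak with inflexible load:    {inflexible.max():5.1f} GW, exceeds capacity: {inflexible.max() > CAPACITY_GW}")
print(f"peak with flexible load:      {flexible.max():5.1f} GW, exceeds capacity: {flexible.max() > CAPACITY_GW}")
print(f"hours of partial curtailment: {int(breach.sum())} of {HOURS}")

Under these made-up numbers the flexible case stays under the capacity ceiling while curtailing in only a small fraction of the year's hours, which is the shape of the Duke result; the actual study works from real utility load data rather than a synthetic curve.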
[2]
Massive AI data center buildouts are squeezing energy supplies -- New energy methods are being explored as power demands are set to skyrocket
A perfect storm of massive data centre buildout and new innovations in energy could materially change what our supplies look like in the future.

The world is building the electricity system for artificial intelligence on the fly. Over the next five years, the power appetite of data-center campuses built for training and serving large AI models will collide with the reality of permitting, transmission backlogs, and siting constraints. All that could materially change how much, from where, and what type of energy supplies we obtain.

The International Energy Agency (IEA) projects global data centre electricity demand will more than double to 945 terawatt-hours by 2030, with AI being the largest single driver of the rise. European data centre demand alone could jump from about 96 TWh in 2024 to 168 TWh by 2030.

The enormous rise in energy demand didn't just start with the November 2022 release of ChatGPT. The electricity use of hyperscalers has been rising more than 25% a year for seven years running, according to analysis by Barclays. But the energy needed for AI inference and training has pushed those already prodigious increases even higher.

"AI has been rolled out everywhere," Chris Preist, professor of sustainability and computer systems at the University of Bristol, said in an interview with Tom's Hardware Premium. "'Everything everywhere all at once' is the phrase I like to use for AI," Preist said. "It's doing what technologies have always done, but it's doing it at a far, far higher speed."

People like Sam Altman are enormously bullish on the future of AI, but are hyperconscious of the energy crunch that it creates. In an essay published in late September, Altman wrote of his vision "to create a factory that can produce a gigawatt of new AI infrastructure every week." He added: "The execution of this will be extremely difficult; it will take us years to get to this milestone and it will require innovation at every level of the stack, from chips to power to building to robotics." As Altman says, everything affects our energy systems.

The IEA's forecasts have proven to be a wake-up call for the energy sector and the AI industry. Anthropic has stated that it believes it'll need 2GW and 5GW data centres as standard to develop its advanced AI models in 2027 and 2028. It also forecasts that the total frontier AI training demand in the United States will reach up to 25GW by 2028. That's just for training: inference will add the same amount to that, Anthropic forecasts, meaning the US alone will need 50GW of AI capacity by 2028 to retain its world-leading position.

Already, the data centre sector is responding by building out huge numbers of new projects: spending on U.S. data centre construction grew 59% in 2023 to $20 billion, and to $31 billion -- another 56% leap -- in 2024. In 2021, pre-ChatGPT, annual private data centre construction spending was only around $10 billion.

All those data centres need a reliable supply to power them, so companies are starting to consider how to mitigate an energy crunch that strains grids to their breaking point, in part by building lots of new generation capacity. But questions remain about whether it's needed. Preist is at pains to point out that globally, we have a surfeit of energy supply. "In the case of digital tech, it's actually a local shortfall, not a global shortfall," he said. But in areas where AI demand is high, there's often a gap between the power the AI systems in use require and the supply available to them.
Having access to reliable power sources is a key consideration for those building data centres, with 84% of data centre leaders telling Bloom Energy in an April 2025 survey that it was their top consideration in where they choose to build -- more than twice as important as proximity to end users, or the level and type of local regulations they would face.

What type of energy supplies are being built is also changing. A radical shift is taking place, with AI companies moving towards firm clean power underpinned by nuclear and geothermal. Constellation is fast-tracking a restart of the shuttered Three Mile Island Unit 1 (renamed the Crane Clean Energy Centre) after striking a 20-year power purchase agreement (PPA) with Microsoft, which, if it meets its ambitious target of a 2027 restart date, could be the first full restart of a U.S. nuclear plant. Analysts estimate Microsoft is paying a premium for certainty of supply. Amazon has also funded a $500 million raise for X-energy and set a target to deploy more than 5GW of small modular reactors across the US by 2039, pairing equity stakes with future PPAs. Both deals are designed to bankroll reliable low-carbon supplies for their AI campuses.

The shift isn't just in what's built, but how it's supplied. Rather than annual renewable matching, buyers are signing hour-by-hour, location-specific carbon-free supply and paying for storage to firm it (a toy comparison of the two accounting approaches follows this passage). Data centres are being placed in areas with low-carbon energy supplies and faster planning processes.

But the hard limit remains the wires. Even where generation exists, securing a new high-voltage tie-in can take years, so AI data campuses are planning for staged rollouts and, until they are fully complete, temporary solutions to build out capacity. AI firms, such as xAI, are spending big to try to install energy supplies as quickly as they can.

However, Izzy Woolgar, director of external affairs at the Centre for Net Zero, said in an interview with Tom's Hardware Premium that the purported surfeit in supply might not be as great as initially thought. "We know data centres and electricity are driving up demand today," she said. "This creates two challenges: first to accurately forecast the energy required to power those centres, and secondly, to meet that demand as quickly and cleanly as possible in the context of an already strained grid."

Choosing clean energy options is tricky, partly because alternative power sources aren't always available, and energy demand must be met immediately to support the massive needs of AI. Developers can sign record renewable deals, but connecting new supply remains the hard limit. In the United States, the median time from requesting a grid interconnection to switching on stretched to nearly five years for projects completed in 2022-23 -- up from three years in 2015 and less than two years in 2008. That delay is system-wide and rising across regions.

"The quickest way of getting around that is to install energy sources at the data centre itself," said Preist. "Ideally, those would be renewable, but often the quickest way of getting it in places is mobile gas generation." It means that we're seeing increased demand for all types of energy, including fossil fuels. Those constraints on the grid and supplies more generally are why tech companies are spending billions on small modular reactors and other supply sources.
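As a rough illustration of why hour-by-hour matching is a much stricter standard than annual matching, here is a toy Python comparison. The profiles are invented for demonstration: a flat 100 MW data-centre load and a daytime-only solar purchase sized so that, on paper, it roughly covers the year's consumption.

# Illustrative only: invented load and solar profiles, not any real buyer's portfolio.
import numpy as np

t = np.arange(8760)                                    # hours in a year
load = np.full(8760, 100.0)                            # MW, flat data-centre load
# Daytime-only solar purchase: a half-sine between 06:00 and 18:00 each day.
solar = 320.0 * np.maximum(0.0, np.sin(2 * np.pi * (t % 24 - 6) / 24))

# Annual matching: total renewable MWh bought vs total MWh consumed over the year.
annual_match = solar.sum() / load.sum()

# Hourly ("24/7") matching: only generation in the same hour counts, and any
# midday surplus cannot be carried over to the night.
hourly_covered = np.minimum(load, solar)
cfe_score = hourly_covered.sum() / load.sum()

print(f"annual matching:          {annual_match:.0%}")   # looks fully matched on paper
print(f"hourly carbon-free score: {cfe_score:.0%}")       # what each hour actually sees

The gap between the two numbers is what buyers are paying storage and firm low-carbon generation to close.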
"We are rapidly building out our infrastructure as part of the energy transition, but confronted by a congested grid, data centre operators are exploring bypassing delays and seeking out direct connections to generators," Woolgar said. She pointed out that companies have explored hook-ups to gas-fired power plants, and many are investing heavily in unproven technologies such as nuclear small modular reactors (SMRs), which are already being invested in and showcased by the likes of Amazon. But many SMRs need to clear hurdles of red tape, which won't make it feasible for all until the end of the decade. And that's the issue, said Woolgar. "The long development times of these major projects will not meet the immediate demands of AI, and we are overlooking the ability of proven and reliable clean technology like solar, wind, and battery storage that can be deployed in as little as two to three years," she said. As well as things like small modular reactors, other alternatives could be pursued. One of those areas is renewable microgrids, which the Centre for Net Zero's modelling indicates could cost 43% less to run than nuclear alternatives. That can help the world get power where it is needed in the short term, Woolgar explained. There are also environmental considerations that are resulting in an energy crunch. Water for cooling is becoming a terminal issue in many regions. In areas of high power density, there's a consideration towards direct liquid and immersion cooling to try and slash water demand and reduce fan-based cooling. Given all the debate around whether we're currently in an AI bubble, the level of energy supply expansion has some worried. "Predicted future surges in demand from AI could well be overplayed, as current projections often fail to account for emerging or future breakthroughs," said Woolgar. Preist is also worried about whether we're about to spend vast sums to build energy supplies that won't be needed if the visions of the future powered by AI turn out to be more science fiction than fact. "There is a risk that energy companies overprovision for a bubble which then bursts, and then you're stuck," he said. "You're stuck with a load of infrastructure, gas power stations, et cetera, which the residential people need to effectively pay for through overpriced energy." Woolgar explained that she believed the current plans for building out energy infrastructure to account for future AI demand didn't account for improvements in technology efficiency. "The upfront energy costs of training models tend to appear quickly, while the downstream benefits and advancements are less immediate and certain," she said. For that reason, it was likely that the huge buildout won't need to be quite as large as expected. "The requirements to train models today won't remain constant," said Woolgar. She points out that DeepSeek's R1 model, launched earlier this year, was trained using just 10% of Meta's Llama computational resources, "and chips will inevitably get more efficient when it comes to energy consumption." There's also the idea that, as well as becoming more efficient, infrastructure providers can become more intelligent about recycling waste outputs, such as heat, from the massive data centres that will have to be built to meet society's insatiable AI demand. 
"There is significant heat, and ideally it would be put into district heating systems," said Preist, pointing to the design of Isambard AI - a small data centre when looked at from future scaling standards, but quite large by traditional standards. That has been "designed to allow the heat to be reused in a district heating system", even though the infrastructure to connect it up to a district heating system hasn't yet been built. Some work is going on in this area already, in large part thanks to the collateral benefits of the adoption of advanced cooling techniques such as immersion cooling to try and reduce the need for energy-intensive traditional cooling in data centres. The power density of next-generation GPUs like Rubin and Rubin Ultra, which is projected to require between 1.8kW and 3.6kW per module, with entire racks approaching 600 kW, means that air cooling isn't a practical solution for these deployments. In their place, immersion cooling, where servers are submerged in dielectric fluids, has been proposed as an alternative. That recovered heat from immersion cooling could be redirected to use in nearby residential energy grids or industrial facilities, experts reckon. It all shows how quickly the face of energy is changing as AI stretches, then changes, supplies. And with multiple companies racing to try and compete with one another to gain an edge, build up their customer base and keep a foothold in the market, the demand for more varied sources of supply seems like it's going to continue to rise in the years to come.
[3]
Britain's AI gold rush hits a wall: not enough electricity
Energy Secretary Miliband promises renewable utopia for green and pleasant land... filled with datacenters

Energy is essential for delivering the UK government's AI ambitions, but Britain faces a critical question: how can it supply enough power for rapidly expanding datacenters without causing blackouts or inflating consumer bills?

At Energy UK's annual London conference this week, Energy Secretary Ed Miliband made his position clear: renewables are the future, and fossil fuels drive both climate damage and the nation's inflated energy costs. What's less clear is how to reach this renewable utopia given decades of infrastructure underinvestment. "Building clean energy is the right choice for the country because, despite the challenges, it is the only route to a system that can reliably bring down bills for good, and give us clean energy abundance," Miliband claims.

The UK reportedly has the world's most expensive electricity, largely because wholesale electricity prices track gas prices, which surged after Russia's invasion of Ukraine. Gas-fired generators serve as the backup when solar and wind fall short -- a frequent occurrence on Britain's gray, windless days.

The obvious solution -- more solar farms and wind turbines -- faces significant obstacles. Offshore wind farms can take years to construct, while onshore projects, though faster to build, face lengthy land acquisition and planning permission processes. Local opposition to solar farms is fierce, with many viewing them as blights on the landscape, particularly when built on farmland.

The government's answer is to streamline the planning process by amending the National Planning Policy Framework or designating projects as critical national infrastructure, as it did with datacenters. But even with expedited approvals, new power projects must keep pace with datacenter construction. Several major datacenters have broken ground near London's M25 in the past year alone, including Europe's largest planned cloud and AI datacenter near South Mimms, a Google facility at Waltham Cross, and projects at Abbots Langley, East Havering, and Woodlands Park, near Iver in Buckinghamshire -- the last previously rejected but now listed as approved on the developer's website.

The government's "AI Growth Zones," targeted at sites with existing grid connections like decommissioned power stations, hint at recognition of the scale challenge. And in July, energy regulator Ofgem approved a £23.9 billion ($32 billion) investment program -- £15 billion ($20 billion) for gas transmission and distribution networks and £8.9 billion ($12 billion) for what is being touted as "the biggest expansion of the electricity grid since the 1960s." However, The Guardian noted that householders will fund this through higher charges, with bills rising by £104 ($140) by 2031 -- on top of already inflated costs.

Replacing gas-fired generators as the reliable baseload source remains problematic. Unlike the US, Britain can't return to coal -- its last coal-fired power station closed last year. Battery energy storage systems (BESSs) offer one solution for storing excess renewable energy. But the gap is enormous: according to RenewableUK, Britain had 5,013 MW (about 5 GW) of operational battery storage at year-end, while peak demand on a cold day reaches 61.1 GW.
Nuclear power is the elephant in the room. And as Miliband states, much of the UK's nuclear fleet dates to the 1980s, and no new station has come online since Sizewell B 30 years ago. Construction of a new reactor, Hinkley Point C, began in December 2018, with expected completion by 2027, but EDF, the company building it, now says it is unlikely to be operational before 2030. The overall cost has ballooned from £26 billion ($35 billion) to between £31 billion and £34 billion ($42 billion to $47 billion).

Small modular reactors (SMRs) are gaining attention from government and datacenter operators, but the technology remains largely untested, and unlikely to be ready for another decade. "Omdia has talked with many power generation project developers and the consensus is that broad market acceptance and availability is likely around 2035, so about 10 years out," Omdia principal analyst Alan Howard told us earlier this year. A recent study also found that renewable energy sources, when paired with battery storage, can power datacenters more cheaply than SMRs.

However, the datacenters are being built now, and the government also expects Brits to switch to electric vehicles and give up gas-fired central heating for alternatives such as electric-powered heat pumps. "In the years ahead, we expect a massive increase in electricity demand -- around 50 percent by 2035 and a more than doubling by 2050," Miliband said in his speech, calling it "a massive opportunity for us." "We want as a country to seize the opportunities of electric vehicles that are cheaper to run, new industries such as AI, and the benefits of electrification across the economy," he added.

Whatever Miliband plans to ensure there is sufficient energy for all this, he needs to act fast, or his government's ambitions to pepper the country with AI datacenters are going to be thwarted by lack of power, soaring energy costs -- or both. ®
The rapid expansion of artificial intelligence (AI) is reshaping the energy landscape: data centers are becoming increasingly power-hungry, reviving fracking and straining power grids worldwide. The surge in demand is raising concerns about environmental impact and infrastructure readiness, and prompting a search for sustainable energy solutions.
AI companies are constructing massive data centers that require unprecedented amounts of electricity. OpenAI's Stargate facility in Abilene, Texas, demands about 900 megawatts across eight buildings [1]. Similarly, Poolside's Horizon project in West Texas will produce two gigawatts of computing power, equivalent to the Hoover Dam's entire electric capacity [1].

The International Energy Agency (IEA) projects that global data center electricity demand will more than double to 945 terawatt-hours by 2030, with AI being the largest single driver of this increase [2].

Surprisingly, the AI boom is giving fracking a second wind. Many AI companies are building data centers near major gas-production sites, often generating their own power by tapping directly into fossil fuels [1]. This trend raises concerns about water table contamination, man-made earthquakes, and the continued reliance on fossil fuels.

The rapid construction of these energy-intensive facilities is straining local infrastructure and communities. In Abilene, Texas, residents like Arlene Mendler have seen their peaceful surroundings transformed by construction noise and bright lights [1]. Water scarcity is another pressing issue, with concerns about how these data centers will impact local water supplies in drought-prone regions.

As the energy crunch becomes more apparent, AI companies are exploring alternative power sources. There's a shift towards firm clean power, with companies like Microsoft and Amazon investing in nuclear and geothermal energy [2].

The AI energy crisis is not limited to the United States. In the UK, the government's AI ambitions are colliding with the reality of insufficient power infrastructure. Energy Secretary Ed Miliband advocates for a renewable energy future, but the country faces significant challenges in meeting the rapidly growing demand from data centers [3]. The UK is grappling with how to supply enough power for expanding data centers without causing blackouts or inflating consumer bills, even as the country's electricity demand is expected to increase by around 50% by 2035 and more than double by 2050 [3].

As AI continues to reshape our world, the energy sector faces unprecedented challenges. Balancing the need for computational power with environmental concerns and infrastructure limitations will be crucial in determining the sustainable future of AI technology.
Summarized by Navi