3 Sources
[1]
Nvidia-backed trial shows AI data centers can flexibly adjust power use in near real time, with global implications for energy consumption -- suggests hyperscalers can reduce consumption as necessary, ensuring grid isn't overloaded during peak demand
This is a win-win short-term solution for both AI hyperscalers and grid operators. A U.K. study run by the National Grid, Nvidia, Emerald AI, and the Electric Power Research Institute (EPRI) just showed that AI data centers do not need to run at peak demand at all times. According to Bloomberg, the report suggests that hyperscalers can quickly adjust their power consumption as needed, ensuring that the grid is not overloaded when other consumers increase demand, while also soaking up excess power from renewable sources when demand falls. If AI tech companies agree to this arrangement, it could allow them to come online much sooner, especially as the lack of available grid power has been hampering the deployment of more AI GPUs.
The system tested is very flexible and doesn't require much advance scheduling. Testing showed that it took less than a minute for a data center operator to cut its power use by roughly a third, to around 66% of normal, and that one data center could even reduce its demand by 10% for 10 hours. This isn't an ideal scenario for many hyperscalers, who want as much power as they can get to maximize the utilization of their expensive GPUs. However, insisting on full peak capacity means waiting for the local grid to catch up (which can take years, if not decades) or deploying their own onsite generators (which gets expensive, unless they have deep pockets like OpenAI, and comes with its own set of challenges).
This would be a win-win if data centers and electricity grid operators could agree to run a system like this: the former could expedite connecting their infrastructure to power, while the latter could maximize capacity even during off-peak hours. In fact, many utility companies already have systems that monitor spikes in demand and supercharge the grid as needed. There is a phenomenon called "The Great British Kettle Surge," wherein U.K. power providers prepare the electricity grid for the expected surge during breaks in popular TV events like the World Cup. Millions of households boil their kettles simultaneously during these breaks, producing a sudden peak in power demand for a few minutes. Now, instead of pouring in more electricity from power plants, which are already in short supply, operators could simply ask AI data centers, among the biggest power consumers, to reduce their demand. It might take some time, and perhaps a push from regulators, before this can be implemented, but it's a good short-term solution that will allow data centers to come online as soon as possible without breaking the grid.
[2]
AI Data Centers May Not Need Constant Peak Power, Study Finds
The flexibility of AI data centers could reduce the amount of grid reinforcement needed to connect them, potentially lowering balancing costs and speeding up connection timelines.
AI data centers can operate without using peak power continuously, according to the results of a UK trial, a finding that could have implications for electricity systems worldwide. Grids from the US to Europe are grappling with a surge in applications from power-hungry data centers used to run artificial intelligence workloads. In the UK, some projects face waits of over a decade for grid access. Typically, these facilities apply for connections based on their maximum possible demand, requiring networks to build enough infrastructure to serve that peak load at all times.
But a trial involving National Grid Plc, Nvidia Corp., Emerald AI and EPRI indicates AI data centers can quickly adjust their consumption when asked. At a London site, operators cut electricity use by roughly a third within about a minute of receiving a signal from the grid, without disrupting workloads. In one live test, an AI facility reduced demand by 10% for 10 hours.
The significance isn't simply lower power use. It's flexibility. The trial showed AI workloads can be dialed up or down in near real time, meaning data centers may not require perfectly steady supply. Instead of acting as rigid, always-on loads, data centers could respond to grid conditions, easing demand during system stress or absorbing surplus renewable power.
That shift could reduce the amount of grid reinforcement needed to connect them. If operators agree to curb usage at peak times, networks may not have to be built to meet their theoretical maximum demand continuously. In turn, that could lower balancing costs and potentially speed up connection timelines. "We would love to get to a point where we can get customers on the network in two years, and this is part of that," said Steve Smith, the president of National Grid Partners.
The implications extend beyond the UK. Governments see data centers as engines of economic growth, but grid bottlenecks and upgrade costs are mounting. Demonstrating that large AI facilities can provide flexibility -- rather than simply strain the system -- could change how they're integrated. The remaining hurdle may be persuading operators that allowing their load to be modulated is worth the trade-off. For companies racing to secure "time to power," flexibility may prove an acceptable price for faster access to the grid, said Varun Sivaram, the founder and chief executive officer of Emerald AI.
[3]
AI data centers could reduce power draw on demand, study says
Apparently, AI data centers are capable of sucking less (power, that is). A recent UK trial demonstrated that they can adjust their energy demands dynamically without disrupting critical workloads. This contrasts with data centers' current approach of always-on power draw, which can strain grids and drive up prices for everyone.
Over five days in December 2025, more than 200 simulated "grid events" tested a London data center's ability to adjust its energy use on the fly. The trial used software from Emerald AI, which was involved in the study. Other partners included NVIDIA, National Grid, Nebius and the nonprofit Electric Power Research Institute. In each simulated grid event, the data center successfully adjusted its energy use to the requested level. It reduced power draw by up to 40 percent, while critical workloads continued to run as normal throughout the trial. The data center successfully reacted to spikes in demand during soccer match halftimes. In one case, it reduced its power draw by 10 percent for up to 10 hours. It also managed to cut its demand quickly: one event saw the data center reduce its load by 30 percent in only 30 seconds.
The study will serve as a blueprint for a 100MW "power-flexible AI factory" that NVIDIA plans to operate in Virginia. "This trial proves that NVIDIA-powered infrastructure can act as a grid-aware asset, modulating demand in real-time to support stability," Josh Parker, NVIDIA's sustainability lead, wrote in a statement. "By making AI workloads responsive, we accelerate deployment while reducing the need for costly grid upgrades." The organizations involved in the study say they'll share their data with the AI industry, regulators and policymakers to try to influence their approach.
Fortunately, we don't need to hope that data center operators' altruism (ha) will lead to their cooperation. Agreeing to curb usage during peak demand could be good for their balance sheets and lead to faster approvals for new data center grid connections. "We would love to get to a point where we can get customers on the network in two years, and this is part of that," Steve Smith, president of National Grid Partners, told Bloomberg.
A groundbreaking UK trial involving National Grid, Nvidia, Emerald AI, and EPRI demonstrated that AI data centers can flexibly adjust power use in near real time, cutting consumption by up to 40% without disrupting workloads, and in one event shedding 30% of load within 30 seconds. The findings suggest hyperscalers could help manage peak demand while accelerating their own deployment timelines, potentially transforming how power-hungry AI infrastructure connects to electricity grids worldwide.
AI data centers can dramatically reduce power consumption on demand without disrupting critical operations, according to results from a UK trial that could reshape how electricity grids worldwide handle the high energy demands of AI [2]. The study, conducted by National Grid, Nvidia, Emerald AI, and the Electric Power Research Institute, ran over five days in December 2025 at a London facility, testing more than 200 simulated grid events [3]. In one striking demonstration, operators cut electricity use by roughly a third within about a minute of receiving a signal from the grid, while in another test, the facility reduced demand by 10% for 10 hours [2]. At its most aggressive, the data center reduced its power draw by up to 40 percent, and in one event it shed 30 percent of its load in just 30 seconds [3].
Source: Engadget
The significance extends far beyond simple power savings. This grid flexibility fundamentally challenges how AI data centers connect to energy infrastructure. Currently, these facilities apply for connections based on their maximum possible demand, requiring networks to build enough infrastructure to serve that peak load at all times [2]. The trial showed AI workloads can be dialed up or down in near real time, meaning data centers may not require perfectly steady supply and could instead respond to grid conditions, easing demand during system stress or absorbing surplus renewable power [2].
For hyperscalers racing to deploy AI infrastructure, this capability could dramatically accelerate connection timelines. In the UK, some data center projects currently face waits of over a decade for grid access [2]. Steve Smith, president of National Grid Partners, expressed optimism about the potential impact: "We would love to get to a point where we can get customers on the network in two years, and this is part of that" [2]. If operators agree to curb usage at peak times, networks may not have to be built to meet their theoretical maximum demand continuously, potentially lowering balancing costs and enabling faster deployment of new data centers [2].
The trial successfully demonstrated real-world applications, including the data center's ability to reduce power draw during soccer match halftimes, when millions of households simultaneously increase electricity use [3]. This mirrors the "Great British Kettle Surge" phenomenon, where UK power providers prepare for expected surges during breaks in popular TV events like the World Cup [1]. Now, instead of drawing more electricity from already-strained power plants, grid operators could ask AI data centers to temporarily reduce their demand [1].
Source: Tom's Hardware
The findings will serve as a blueprint for a 100MW power-flexible AI factory that Nvidia plans to operate in Virginia [3]. Josh Parker, Nvidia's sustainability lead, stated: "This trial proves that Nvidia-powered infrastructure can act as a grid-aware asset, modulating demand in real-time to support stability. By making AI workloads responsive, we accelerate deployment while reducing the need for costly grid upgrades" [3]. The organizations involved plan to share their data with the AI industry, regulators and policymakers to influence future approaches [3].
For companies racing to secure "time to power," flexibility may prove an acceptable price for faster access to the grid, according to Varun Sivaram, founder and CEO of Emerald AI [2]. While this approach may not be ideal for hyperscalers needing maximum power to optimize their expensive GPUs, the alternative is waiting years or even decades for local grids to catch up, or deploying costly onsite generators [1]. The implications extend beyond the UK, as grids from the US to Europe grapple with surging applications from power-hungry data centers, and governments see these facilities as engines of economic growth despite mounting grid bottlenecks and upgrade costs [2]. Demonstrating that large AI facilities can provide grid stability rather than simply strain the system could fundamentally change how they're integrated into energy infrastructure worldwide.
Source: Bloomberg
Summarized by Navi