5 Sources
[1]
Nvidia-backed trial shows AI data centers can flexibly adjust power use in near real time, with global implications for energy consumption -- suggests hyperscalers can reduce consumption as necessary, ensuring grid isn't overloaded during peak demand
This is a win-win short-term solution for both AI hyperscalers and grid operators. A U.K. study run by the National Grid, Nvidia, Emerald AI, and the Electric Power Research Institute (EPRI) just showed that AI data centers do not need to run at peak demand at all times. According to Bloomberg, the report suggests that hyperscalers can quickly adjust their power consumption as needed, ensuring that the grid is not overloaded when other consumers increase demand, while also using excess power from renewable energy when demand falls. If AI tech companies agree to this arrangement, it could allow them to go online much sooner, especially as the lack of available grid power has been hampering the deployment of more AI GPUs. The system tested is very flexible and doesn't require much advance scheduling. Testing showed that it took less than a minute for a data center operator to reduce the amount of power it used to around 66% of its normal level, and that one data center could even reduce its demand by 10% for 10 hours. This isn't an ideal scenario for many hyperscalers, especially as they need as much power as they can get to maximize their expensive GPUs. However, insisting on always-on peak power means that they'll have to wait for the local grid to catch up (which can take years, if not decades), or they'll have to deploy their own onsite generators (which can get expensive, unless they have deep pockets like OpenAI, and come with their own set of challenges). This would be a win-win situation if both data centers and electricity grid operators could agree to run a system like this, as it would allow the former to expedite connecting their infrastructure to power while the latter could maximize their capacity even during off-peak hours. In fact, many utility companies already have a system that monitors spikes in demand and supercharges the grid as needed. There is a phenomenon called "The Great British Kettle Surge," wherein U.K. power providers prepare the electricity grid for the expected surge during breaks in popular TV events like the World Cup. This happens because millions of households boil their kettles simultaneously during these breaks, resulting in a sudden peak in power demand for a few minutes. Now, instead of pouring in more electricity from power plants, which is already in short supply, the operators could just ask AI data centers, which are among the biggest power consumers, to reduce their demand. It might take some time and maybe some push from regulators before this can be implemented, but it's a good short-term solution that will allow data centers to go online as soon as possible without breaking the grid.
[2]
Flex appeal: UK datacenter cuts AI power draw 40% on command
London GPU farm dances to National Grid's tune in five-day trial, critical workloads not disrupted A UK datacenter has successfully demonstrated it can reduce the amount of power drawn by AI infrastructure in response to grid events, without disrupting critical workloads. The trial, conducted over five days last December, involved a cluster of Nvidia Blackwell Ultra GPUs installed in a datacenter near London operated by GPU-as-a-service biz Nebius. This involved more than 200 simulated grid event notifications, sent to the site to test its ability to dynamically adjust the cluster's power consumption. This was achieved successfully, cutting power demand by up to 40 percent while key tasks continued to run as normal, according to energy provider National Grid. As well as National Grid and Nebius, the project involves Emerald AI, which supplies the software, and Electric Power Research Institute (EPRI) with its Datacenter Flexible Load Initiative (DCFlex). A whitepaper provided by National Grid reveals that power control is largely achieved by pausing or deprioritizing jobs running on the GPUs, or shifting workloads to a later time, rather than the blunt instrument of powering down parts of the infrastructure. Google is already doing this in the US, and said last year it will pause non-essential AI workloads to protect power grids. While some AI workloads - such as inference - are latency sensitive, others including training and fine-tuning are more throughput-intensive. These latter tasks also typically include natural "flex points" like checkpoint intervals, where processing can be paused, the whitepaper explains. To approximate production-grade conditions, Emerald AI and Nebius chose a set of commercially representative AI training workloads, including the gpt-oss, Llama, and Qwen models. The cluster was kept continuously utilized running these. For the power experiments, National Grid Electricity Transmission (NGET) and EPRI submitted grid signals through an event submission portal, specifying the notice period, power reduction percentage, ramp-down duration, ramp-up duration, and overall event duration. Some of the tests involved "surprise" signals, where no advance notice was given, or specifying no ramp time, requiring immediate response. Some also emulated real-world spikes in demand seen when thirsty Brits put the kettle on for a brew during half time of major football matches. The tests were carried out as the Nebius "AI Factory" was being brought online, and involved a 130 kW compute cluster, roughly equivalent to the power consumption of 400 UK households. According to the whitepaper, the cluster achieved 100 percent compliance with all requested power targets and ramp rates, suggesting these systems could help alleviate the problems caused by AI's huge power consumption. By replacing the rigid "firm load" models of the past with measurement-based flexibility, grid operators and policymakers can create new options for delivering capacity efficiently, the report concludes. "As the UK's digital economy accelerates, there's concern that datacenters could add pressure to an already constrained system. This trial proves the opposite can be true. High‑performance datacenters don't have to place additional strain on the grid," claimed National Grid Partners president Steve Smith. That assumes you can get the power in the first place. 
As The Register has reported previously, new generating capacity isn't being added at the same rate datacenters are being built, and some developers complain they face a wait of years to get a grid connection and for local substations to be upgraded. ®
[3]
AI Data Centers May Not Need Constant Peak Power, Study Finds
The flexibility of AI data centers could reduce the amount of grid reinforcement needed to connect them, potentially lowering balancing costs and speeding up connection timelines. AI data centers can operate without using peak power continuously, according to the results of a UK trial, a finding that could have implications for electricity systems worldwide. Grids from the US to Europe are grappling with a surge in applications from power-hungry data centers used to run artificial intelligence workloads. In the UK, some projects face waits of over a decade for grid access. Typically, these facilities apply for connections based on their maximum possible demand, requiring networks to build enough infrastructure to serve that peak load at all times. But a trial involving National Grid Plc., Nvidia Corp., Emerald AI and EPRI indicates AI data centers can quickly adjust their consumption when asked. At a London site, operators cut electricity use by roughly a third within about a minute of receiving a signal from the grid, without disrupting workloads. In one live test, an AI facility reduced demand by 10% for 10 hours. The significance isn't simply lower power use. It's flexibility. The trial showed AI workloads can be dialed up or down in near real time, meaning data centers may not require perfectly steady supply. Instead of acting as rigid, always-on loads, data centers could respond to grid conditions, easing demand during system stress or absorbing surplus renewable power. That shift could reduce the amount of grid reinforcement needed to connect them. If operators agree to curb usage at peak times, networks may not have to be built to meet their theoretical maximum demand continuously. In turn, that could lower balancing costs and potentially speed up connection timelines. "We would love to get to a point where we can get customers on the network in two years, and this is part of that," said Steve Smith, the president of National Grid Partners. The implications extend beyond the UK. Governments see data centers as engines of economic growth, but grid bottlenecks and upgrade costs are mounting. Demonstrating that large AI facilities can provide flexibility -- rather than simply strain the system -- could change how they're integrated. The remaining hurdle may be persuading operators that allowing their load to be modulated is worth the trade-off. For companies racing to secure "time to power," flexibility may prove an acceptable price for faster access to the grid, said Varun Sivaram, the founder and chief executive officer of Emerald AI.
[4]
AI data centers could reduce power draw on demand, study says
Apparently, AI data centers are capable of sucking less (power, that is). A recent UK trial demonstrated that they can adjust their energy demands dynamically without disrupting critical workloads. This contrasts with data centers' current approach of always-on power draw, which can strain grids and drive up prices for everyone. Over five days in December 2025, more than 200 simulated "grid events" tested a London data center's ability to adjust its energy use on the fly. The trial used software from Emerald AI, which was involved in the study. Other partners included NVIDIA, National Grid, Nebius and the nonprofit Electric Power Research Institute. In each simulated grid event, the data center successfully adjusted its energy use to the requested level. It reduced power draw by up to 40 percent, while critical workloads continued to run as normal throughout the trial. The data center successfully reacted to spikes in demand during soccer match halftimes. In one case, it reduced its power draw by 10 percent for up to 10 hours. It also managed to cut its demand quickly: One event saw the data center reduce its load by 30 percent in only 30 seconds. The study will serve as a blueprint for a 100MW "power-flexible AI factory" that NVIDIA plans to operate in Virginia. "This trial proves that NVIDIA-powered infrastructure can act as a grid-aware asset, modulating demand in real-time to support stability," Josh Paker, NVIDIA's sustainability lead, wrote in a statement. "By making AI workloads responsive, we accelerate deployment while reducing the need for costly grid upgrades." The organizations involved in the study say they'll share their data with the AI industry, regulators and policymakers to try to influence their approach. Fortunately, we don't need to hope that data center operators' altruism (ha) will lead to their cooperation. Agreeing to curb usage during peak demand could be good for their balance sheets and lead to faster approvals for new data center grid connections. "We would love to get to a point where we can get customers on the network in two years, and this is part of that," Steve Smith, president of National Grid Partners, told Bloomberg.
[5]
UK trial shows AI data centers can cut power draw by 40%
UK trial shows AI data centers can cut power draw by up to 40 percent without disrupting workloads. The capability contrasts with the current always-on approach that strains grids. The five-day test in December 2025 involved over 200 simulated grid events at a London facility. Partners included Emerald AI, NVIDIA, National Grid, Nebius, and the Electric Power Research Institute. The data center reduced power draw by up to 40 percent in each event while critical operations continued. The facility reacted to demand spikes, such as during soccer match halftimes, reducing power by 10 percent for up to 10 hours. In one event, the center cut its load by 30 percent in 30 seconds. "This trial proves that NVIDIA-powered infrastructure can act as a grid-aware asset, modulating demand in real-time to support stability," said Josh Paker, NVIDIA's sustainability lead. Paker stated that making AI workloads responsive accelerates deployment while reducing the need for costly grid upgrades. The study will serve as a blueprint for a 100MW power-flexible AI factory NVIDIA plans to operate in Virginia. The organizations involved plan to share data with the AI industry, regulators, and policymakers. Agreeing to curb usage during peak demand could lead to faster approvals for new data center grid connections. "We would love to get to a point where we can get customers on the network in two years, and this is part of that," said Steve Smith, president of National Grid Partners, according to Bloomberg.
A groundbreaking UK trial demonstrates that AI data centers can reduce power consumption by up to 40% in real time without disrupting critical workloads. The five-day test involving Nvidia, National Grid, and others shows how flexible power management could accelerate data center deployments while preventing grid overload during peak demand periods.
AI data centers can dramatically reduce power consumption on demand without compromising critical operations, according to results from a UK trial that could reshape how these facilities integrate with electricity grids worldwide. The five-day test conducted in December 2025 involved over 200 simulated grid events at a London facility operated by Nebius, demonstrating that AI infrastructure can cut power draw by up to 40% while maintaining essential workloads [2]. The trial brought together National Grid, Nvidia, Emerald AI, and the Electric Power Research Institute (EPRI) to test whether AI data centers could function as grid-aware assets rather than the always-on power drains they're typically assumed to be [1].
The cluster of Nvidia Blackwell Ultra GPUs achieved 100% compliance with all requested power targets and ramp rates during testing. In one striking demonstration, operators reduced electricity use by roughly a third within about a minute of receiving a signal from the power grid [3]. Another test saw the facility reduce demand by 10% for 10 hours, while a particularly aggressive scenario required cutting load by 30% in just 30 seconds [4]. The 130 kW compute cluster used in the trial, equivalent to the power consumption of roughly 400 UK households, even responded successfully to "surprise" signals with no advance notice [2].
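According to the whitepaper described in source [2], each grid signal specified a notice period, a requested power reduction, ramp-down and ramp-up durations, and an overall event duration. A minimal sketch of what such a signal might look like as a data structure follows; the GridEvent class and its field names are illustrative assumptions, not the actual schema of the trial's event-submission portal.

```python
# Illustrative only: field names are assumptions based on the parameters the
# whitepaper says each grid signal specified; the real event-portal schema is
# not public in these reports.
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class GridEvent:
    issued_at: datetime          # when the grid operator sends the signal
    notice: timedelta            # advance notice; zero for a "surprise" event
    reduction_pct: float         # requested cut, e.g. 0.40 for a 40% reduction
    ramp_down: timedelta         # time allowed to reach the reduced level
    ramp_up: timedelta           # time allowed to return to normal afterwards
    duration: timedelta          # how long the reduced level must be held

    def target_power_kw(self, baseline_kw: float) -> float:
        """Power level the site must hold during the event."""
        return baseline_kw * (1.0 - self.reduction_pct)


# Example: a surprise event asking the 130 kW cluster to shed 30% in 30 seconds.
event = GridEvent(
    issued_at=datetime.now(),
    notice=timedelta(0),
    reduction_pct=0.30,
    ramp_down=timedelta(seconds=30),
    ramp_up=timedelta(minutes=5),
    duration=timedelta(hours=1),
)
print(f"Hold at {event.target_power_kw(130):.0f} kW")  # -> Hold at 91 kW
```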
The ability to reduce power draw on demand relies on sophisticated workload management rather than simply shutting down infrastructure. Power control is achieved primarily by pausing or deprioritizing jobs running on GPUs, or by shifting AI workloads to later times [2]. While some AI workloads like inference are latency-sensitive and require immediate processing, others, including training and fine-tuning, are more throughput-intensive and include natural "flex points" like checkpoint intervals where processing can be paused without data loss [2]. The trial used commercially representative AI training workloads, including the gpt-oss, Llama, and Qwen models, to approximate production-grade conditions.
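To make the "flex point" idea concrete, here is a minimal, hypothetical sketch of a training loop that reacts to a curtailment signal only at checkpoint boundaries, so no work is lost while it waits. This is not Emerald AI's software: a real orchestrator would throttle or deprioritize many jobs across a cluster to hit a specific power target rather than pausing a single loop.

```python
# A toy illustration of the "flex point" idea: the loop only reacts to
# curtailment at checkpoint boundaries, so no work is lost when it pauses.
# The orchestration logic here is invented for illustration.
import time
import threading

curtailed = threading.Event()  # set by some external grid-signal listener


def train(num_steps: int, checkpoint_every: int = 100) -> None:
    for step in range(1, num_steps + 1):
        run_training_step(step)              # normal full-power work

        if step % checkpoint_every == 0:     # natural flex point
            save_checkpoint(step)            # state is safe on disk
            while curtailed.is_set():        # hold while the grid event lasts
                time.sleep(10)               # GPUs idle until the event clears


def run_training_step(step: int) -> None:
    time.sleep(0.01)  # stand-in for a real forward/backward pass


def save_checkpoint(step: int) -> None:
    print(f"checkpoint at step {step}")


if __name__ == "__main__":
    # Simulate a grid event arriving partway through training.
    threading.Timer(1.0, curtailed.set).start()    # curtailment begins
    threading.Timer(5.0, curtailed.clear).start()  # curtailment ends
    train(num_steps=1000)
```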
The system tested proved highly responsive and didn't require much advance scheduling. Testing showed it took less than a minute for operators to reduce power consumption to around 66% of its normal level [1]. The facility even responded successfully to simulated demand spikes during soccer match halftimes, mimicking the phenomenon known as "The Great British Kettle Surge," in which millions of households simultaneously boil kettles during breaks in popular TV events, creating sudden peaks in power demand [1].
The significance extends beyond simple power reduction: it's about flexibility that could fundamentally change how AI data centers connect to electricity infrastructure. Grids from the US to Europe are grappling with surges in applications from power-hungry facilities, with some UK projects facing waits of over a decade for grid access [3]. These facilities typically apply for connections based on their maximum possible demand, requiring networks to build enough infrastructure to serve that peak load at all times. But if operators agree to curb usage during peak times, networks may not need to be built to meet theoretical maximum demand continuously, potentially reducing grid reinforcement requirements and lowering balancing costs [3].
"We would love to get to a point where we can get customers on the network in two years, and this is part of that," said Steve Smith, president of National Grid Partners
3
. This approach could accelerate connection timelines significantly, as the lack of available power grid capacity has been hampering deployment of more AI GPUs1
. For companies racing to secure "time to power," flexibility may prove an acceptable trade-off for faster deployment of new data centers, according to Varun Sivaram, founder and CEO of Emerald AI3
.Related Stories
This represents a win-win arrangement if both AI data centers and electricity grid operators can agree to implement such systems. Data centers could expedite connecting their infrastructure to the power grid, while utility companies could maximize capacity even during off-peak hours [1]. Instead of pouring in more electricity from power plants during demand spikes, operators could ask AI data centers, among the biggest power consumers, to temporarily reduce their draw [1]. The approach also allows facilities to absorb surplus renewable power when demand falls, supporting grid stability [3].
."This trial proves that NVIDIA-powered infrastructure can act as a grid-aware asset, modulating demand in real-time to support stability," said Josh Paker, Nvidia's sustainability lead
5
. The study will serve as a blueprint for a 100MW power-flexible AI factory that Nvidia plans to operate in Virginia4
. The organizations involved plan to share their data with the AI industry, regulators, and policymakers to influence future approaches to data center integration. Google is already implementing similar flexible load practices in the US, pausing non-essential AI workloads to protect power grids2
. As governments view data centers as engines of economic growth but face mounting grid bottlenecks and upgrade costs, demonstrating that large AI facilities can provide flexible load capabilities rather than simply strain systems could transform how they're integrated into energy infrastructure worldwide3
Summarized by Navi