AI data centers slash power draw by 40% on demand, UK trial shows grid flexibility is possible


A groundbreaking UK trial demonstrates that AI data centers can reduce power consumption by up to 40% in real time without disrupting critical workloads. The five-day test, involving Nvidia, National Grid, and others, shows how flexible power management could accelerate data center deployments while preventing grid overload during peak demand periods.

AI Data Centers Demonstrate Real-Time Power Flexibility

AI data centers can dramatically reduce power consumption on demand without compromising critical operations, according to results from a UK trial that could reshape how these facilities integrate with electricity grids worldwide. The five-day test, conducted in December 2025, involved over 200 simulated grid events at a London facility operated by Nebius and demonstrated that AI infrastructure can cut power draw by up to 40% while maintaining essential workloads [2]. The trial brought together National Grid, Nvidia, Emerald AI, and the Electric Power Research Institute (EPRI) to test whether AI data centers can function as grid-aware assets rather than the always-on power drains they are typically assumed to be [1].

Source: Engadget

The cluster of Nvidia Blackwell Ultra GPUs achieved 100% compliance with all requested power targets and ramp rates during testing. In one striking demonstration, operators reduced electricity use by roughly a third within about a minute of receiving a signal from the power grid [3]. Another test saw the facility reduce demand by 10% for 10 hours, while a particularly aggressive scenario required cutting load by 30% in just 30 seconds [4]. The 130 kW compute cluster used in the trial, equivalent to the power consumption of roughly 400 UK households, even responded successfully to "surprise" signals sent with no advance notice [2].
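The headline figures above are easy to sanity-check. A minimal sketch, assuming an average UK household draws roughly 0.33 kW continuously (about 2,900 kWh per year, a commonly cited figure not taken from the article):

```python
# Back-of-the-envelope checks on the trial's headline figures.
# HOUSEHOLD_KW is an outside assumption, not a number from the article.

CLUSTER_KW = 130       # trial cluster's power draw
HOUSEHOLD_KW = 0.33    # assumed average continuous household draw

households = CLUSTER_KW / HOUSEHOLD_KW
print(f"Cluster ~ {households:.0f} households")  # close to the ~400 cited

# The most aggressive test: shed 30% of load within 30 seconds.
shed_kw = 0.30 * CLUSTER_KW
ramp_kw_per_s = shed_kw / 30
print(f"Shed {shed_kw:.0f} kW at {ramp_kw_per_s:.2f} kW/s")
```

Under that assumption the cluster works out to roughly 394 households, consistent with the "roughly 400" cited, and the 30%-in-30-seconds scenario corresponds to shedding 39 kW at about 1.3 kW per second.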

How AI Data Centers Flexibly Adjust Power Use

The ability to reduce power draw on demand relies on sophisticated workload management rather than simply shutting down infrastructure. Power control is achieved primarily by pausing or deprioritizing jobs running on GPUs, or by shifting AI workloads to later times [2]. While some AI workloads, such as inference, are latency-sensitive and require immediate processing, others, including training and fine-tuning, are throughput-oriented and include natural "flex points" such as checkpoint intervals where processing can be paused without data loss [2]. The trial used commercially representative AI training workloads, including gpt-oss, Llama, and Qwen models, to approximate production-grade conditions.
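The pause-and-deprioritize behavior described above can be sketched in a few lines. This is a minimal illustration of the idea, not Emerald AI's or Nvidia's actual orchestration software; every class and function name here is hypothetical, and each job is assumed to contribute equally to load purely for simplicity:

```python
# Hypothetical sketch of checkpoint-based "flex point" scheduling:
# on a grid signal, pause throughput-oriented jobs (training, fine-tuning)
# while latency-sensitive jobs (inference) keep running.

from dataclasses import dataclass

@dataclass
class Job:
    name: str
    latency_sensitive: bool  # inference must keep serving requests
    paused: bool = False

def apply_grid_signal(jobs: list[Job], target_fraction: float) -> list[Job]:
    """Pause flexible jobs until estimated load falls to target_fraction
    of baseline (each job assumed to draw equal power). Returns the
    jobs that were paused."""
    running = [j for j in jobs if not j.paused]
    allowed = round(target_fraction * len(jobs))
    flexible = [j for j in running if not j.latency_sensitive]
    to_pause = max(len(running) - allowed, 0)
    for job in flexible[:to_pause]:
        job.paused = True  # in practice: pause at the next checkpoint,
                           # so no training progress is lost
    return [j for j in jobs if j.paused]

jobs = [Job("chatbot-inference", True),
        Job("llama-finetune", False),
        Job("gpt-oss-train", False),
        Job("qwen-train", False)]
paused = apply_grid_signal(jobs, target_fraction=0.5)  # grid asks for 50% load
print([j.name for j in paused])  # ['llama-finetune', 'gpt-oss-train']
```

A real scheduler would weight jobs by measured GPU power and defer the pause to each job's next checkpoint; the point here is only the priority ordering: latency-sensitive work is protected, flexible work absorbs the curtailment.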

Source: Bloomberg

The system proved highly responsive and did not require extensive advance scheduling. Testing showed it took less than a minute for operators to cut power consumption to around 66% of baseline [1]. The facility even responded successfully to simulated demand spikes timed to soccer match halftimes, mimicking "The Great British Kettle Surge," the phenomenon in which millions of households simultaneously boil kettles during breaks in popular TV events, creating sudden peaks in power demand [1].

Managing High Energy Demands Through Grid Reinforcement Alternatives

The significance extends beyond simple power reduction: flexibility of this kind could fundamentally change how AI data centers connect to electricity infrastructure. Grids from the US to Europe are grappling with surges in connection applications from power-hungry facilities, and some UK projects face waits of over a decade for grid access [3]. These facilities typically apply for connections based on their maximum possible demand, requiring networks to build enough infrastructure to serve that peak load at all times. But if operators agree to curb usage during peak periods, networks need not be built to meet theoretical maximum demand continuously, potentially reducing grid reinforcement requirements and lowering balancing costs [3].
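The connection-sizing argument above reduces to simple arithmetic. A hedged illustration with invented numbers (the 100 MW facility size is hypothetical; only the 40% curtailment figure comes from the trial):

```python
# If a facility guarantees it can shed load on request, the network only
# needs firm capacity for the curtailed level, not the theoretical peak.
# PEAK_DEMAND_MW is an invented example figure.

PEAK_DEMAND_MW = 100   # hypothetical facility's maximum possible draw
CURTAILMENT = 0.40     # the trial demonstrated up to 40% reduction on demand

firm_capacity = PEAK_DEMAND_MW * (1 - CURTAILMENT)
print(f"Firm connection needed: {firm_capacity:.0f} MW "
      f"instead of {PEAK_DEMAND_MW} MW")
```

In this toy case the network plans around 60 MW of firm capacity rather than 100 MW, which is the mechanism behind the reduced reinforcement requirements the article describes.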

Source: Tom's Hardware

"We would love to get to a point where we can get customers on the network in two years, and this is part of that," said Steve Smith, president of National Grid Partners [3]. This approach could significantly accelerate connection timelines, since the lack of available grid capacity has hampered the deployment of more AI GPUs [1]. For companies racing to secure "time to power," flexibility may prove an acceptable trade-off for faster deployment of new data centers, according to Varun Sivaram, founder and CEO of Emerald AI [3].

Peak Demand Management Creates Win-Win Scenario

This represents a win-win arrangement if AI data centers and electricity grid operators agree to implement such systems. Data centers could connect their infrastructure to the power grid sooner, while utilities could make fuller use of existing capacity [1]. Instead of dispatching additional generation during demand spikes, operators could ask AI data centers, among the grid's biggest power consumers, to temporarily reduce their draw [1]. The approach also allows facilities to absorb surplus renewable power when demand falls, supporting grid stability [3].

"This trial proves that NVIDIA-powered infrastructure can act as a grid-aware asset, modulating demand in real-time to support stability," said Josh Parker, Nvidia's sustainability lead [5]. The study will serve as a blueprint for a 100 MW power-flexible AI factory that Nvidia plans to operate in Virginia [4]. The organizations involved plan to share their data with the AI industry, regulators, and policymakers to influence future approaches to data center integration. Google is already implementing similar flexible-load practices in the US, pausing non-essential AI workloads to protect power grids [2]. As governments view data centers as engines of economic growth but face mounting grid bottlenecks and upgrade costs, demonstrating that large AI facilities can provide flexible load rather than simply strain the system could transform how they are integrated into energy infrastructure worldwide [3].

TheOutpost.ai

© 2026 Triveous Technologies Private Limited