AI data centers can cut power use 40% in seconds, UK trial shows, easing grid strain globally


A groundbreaking UK trial involving National Grid, Nvidia, Emerald AI, and EPRI demonstrated that AI data centers can flexibly adjust power use in near real-time, reducing consumption by up to 40% within 30 seconds without disrupting workloads. The findings suggest hyperscalers could help manage peak demand while accelerating their own deployment timelines, potentially transforming how power-hungry AI infrastructure connects to electricity grids worldwide.

AI Data Centers Demonstrate Unprecedented Grid Flexibility

AI data centers can dramatically reduce power consumption on demand without disrupting critical operations, according to results from a UK trial that could reshape how electricity grids worldwide handle the high energy demands of AI [2]. The study, conducted by National Grid, Nvidia, Emerald AI, and the Electric Power Research Institute (EPRI), ran over five days in December 2025 at a London facility, testing more than 200 simulated grid events [3]. In one demonstration, operators cut electricity use by roughly a third within about a minute of receiving a signal from the grid; in another test, the facility reduced demand by 10% for 10 hours [2]. The most striking result showed the data center could reduce its power draw by up to 40% in just 30 seconds [3].

Source: Engadget

The significance extends far beyond simple power savings: this grid flexibility fundamentally challenges how AI data centers connect to energy infrastructure. Currently, these facilities apply for connections based on their maximum possible demand, requiring networks to build enough infrastructure to serve that peak load at all times [2]. The trial showed AI workloads can be dialed up or down in near real-time, meaning data centers may not require a perfectly steady supply and could instead respond to grid conditions, easing demand during system stress or absorbing surplus renewable power [2].
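The dial-up/dial-down mechanic can be illustrated with a minimal sketch. This is a hypothetical toy model, not code from the trial or from Emerald AI's platform: all job names, power figures, and the throttling policy are illustrative assumptions. The idea is that when the grid requests a percentage reduction, deferrable workloads (training, batch jobs) are slowed first while latency-sensitive ones keep full power.

```python
# Hypothetical illustration of demand-response throttling in a data
# center. All names and numbers are invented for this sketch; the
# trial's actual control software is not public.

from dataclasses import dataclass

@dataclass
class Job:
    name: str
    power_kw: float   # power draw at full speed
    deferrable: bool  # can this workload be slowed or paused?

def apply_grid_signal(jobs, baseline_kw, reduction_pct):
    """Return per-job power caps meeting a requested % reduction.

    Deferrable jobs are throttled first, in list order;
    latency-sensitive jobs keep their full allocation.
    """
    target_kw = baseline_kw * (1 - reduction_pct / 100)
    caps = {j.name: j.power_kw for j in jobs}
    excess = baseline_kw - target_kw  # kW that must be shed
    for j in jobs:
        if excess <= 0:
            break
        if j.deferrable:
            cut = min(excess, j.power_kw)
            caps[j.name] = j.power_kw - cut
            excess -= cut
    return caps

jobs = [
    Job("llm-training", 600.0, deferrable=True),
    Job("batch-eval", 200.0, deferrable=True),
    Job("chat-inference", 200.0, deferrable=False),
]
baseline = sum(j.power_kw for j in jobs)       # 1000 kW total
caps = apply_grid_signal(jobs, baseline, 40)   # 40% cut, as in the trial
print(caps)  # training absorbs the whole cut; inference is untouched
```

In practice the enforcement layer would be GPU power limits and scheduler throttling rather than a dictionary of caps, but the prioritization logic, shedding load from pausable work while protecting interactive services, is the essence of what the trial demonstrated.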

Accelerating AI Infrastructure Deployment While Reducing Grid Strain

For hyperscalers racing to deploy AI infrastructure, this capability could dramatically accelerate connection timelines. In the UK, some data center projects currently face waits of over a decade for grid access [2]. Steve Smith, president of National Grid Partners, expressed optimism about the potential impact: "We would love to get to a point where we can get customers on the network in two years, and this is part of that" [2]. If operators agree to curb usage at peak times, networks need not be built to meet data centers' theoretical maximum demand at all times, potentially lowering balancing costs and enabling faster connection of new facilities [2].

The trial also demonstrated real-world applications, including the data center's ability to reduce power draw during soccer match halftimes, when millions of households simultaneously increase electricity use [3]. This mirrors the "Great British Kettle Surge" phenomenon, in which UK power providers prepare for expected surges during breaks in popular TV events like the World Cup [1]. Instead of drawing more electricity from already-strained power plants, grid operators could ask AI data centers to temporarily reduce their demand [1].

Source: Tom's Hardware

Global Implications and Future Plans

The findings will serve as a blueprint for a 100MW power-flexible AI factory that Nvidia plans to operate in Virginia [3]. Josh Parker, Nvidia's sustainability lead, stated: "This trial proves that Nvidia-powered infrastructure can act as a grid-aware asset, modulating demand in real-time to support stability. By making AI workloads responsive, we accelerate deployment while reducing the need for costly grid upgrades" [3]. The organizations involved plan to share their data with the AI industry, regulators, and policymakers to shape future approaches [3].

For companies racing to secure "time to power," flexibility may prove an acceptable price for faster access to the grid, according to Varun Sivaram, founder and CEO of Emerald AI [2]. While this approach may not suit hyperscalers that need maximum power to keep their expensive GPUs fully utilized, the alternative is waiting years or even decades for local grids to catch up, or deploying costly onsite generators [1]. The implications extend beyond the UK: grids from the US to Europe are grappling with surging applications from power-hungry data centers, even as governments see these facilities as engines of economic growth despite mounting grid bottlenecks and upgrade costs [2]. Demonstrating that large AI facilities can provide grid stability, rather than simply strain the system, could fundamentally change how they are integrated into energy infrastructure worldwide.

Source: Bloomberg
