New Software Tool Reduces AI Training Energy Waste by Up to 30%


Researchers at the University of Michigan have developed Perseus, a software tool that can reduce energy consumption in AI training by up to 30% without compromising speed or performance, potentially saving enough energy to power 1.1 million U.S. homes in 2026.


Energy Waste in AI Training

A new study from the University of Michigan has revealed that up to 30% of the power used to train large AI models, such as GPT-3, is wasted. This inefficiency stems from the unequal distribution of workload across multiple GPUs (Graphics Processing Units) during the training process [1].

The Perseus Solution

To address this issue, researchers have developed a software tool called Perseus. This innovative solution identifies the critical path in AI training tasks and adjusts processor speeds accordingly, ensuring all processors finish their jobs simultaneously. By doing so, Perseus can reduce energy consumption by up to 30% without compromising training speed or model accuracy [2].

Potential Impact

The energy savings achieved by Perseus could be substantial. Based on Wells Fargo's projections of AI power demand, the approach could save enough energy to power 1.1 million U.S. homes in 2026. This reduction in energy consumption could also help mitigate the environmental impact of data centers, which the International Monetary Fund predicts could account for 1.2% of global carbon emissions by 2027 [3].

The Need for Efficiency

Mosharaf Chowdhury, associate professor of computer science and engineering at the University of Michigan, emphasizes the importance of this development: "We can't keep building bigger and bigger data centers because we won't have the power to run them. If we can reduce the energy consumed by AI, we can reduce AI's carbon footprint and cooling requirements and allow for more computation to fit within our current energy constraints" [1].

How Perseus Works

Perseus tackles the inefficiency created when AI training tasks are unevenly distributed across multiple processors. Current methods run all processors at top speed, resulting in some finishing their calculations before others. Perseus identifies the longest series of subtasks (the critical path) and slows down processors not on this path, ensuring all processors complete their work simultaneously and eliminating unnecessary power use [2].
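The scheduling idea described above can be sketched in a few lines of Python. This is a deliberately simplified toy model, not Perseus's actual algorithm or API: the function names, the single-stage-per-processor assumption, and the cubic-power energy model are all assumptions made for illustration.

```python
# Toy sketch of critical-path-aware speed scaling, in the spirit of the
# article's description. NOT Perseus's real implementation: the function
# names and the cubic dynamic-power model are illustrative assumptions.

def plan_speed_factors(stage_times):
    """Given each pipeline stage's compute time at full speed, return a
    speed factor in (0, 1] per stage so that every stage finishes at the
    same moment as the slowest (critical-path) stage."""
    critical = max(stage_times)          # slowest stage sets the pace
    return [t / critical for t in stage_times]

def estimated_energy_saving(stage_times):
    """Fraction of energy saved under a toy model where dynamic power
    scales with speed**3, so energy per stage ~ speed**2 * full_time."""
    factors = plan_speed_factors(stage_times)
    baseline = sum(stage_times)          # all stages at full speed
    scaled = sum(f ** 2 * t for f, t in zip(factors, stage_times))
    return 1.0 - scaled / baseline

# Four stages with uneven work: the two 4.0s stages form the critical
# path, so the 2.0s and 3.0s stages are slowed to 50% and 75% speed.
print(plan_speed_factors([4.0, 2.0, 3.0, 4.0]))       # [1.0, 0.5, 0.75, 1.0]
print(estimated_energy_saving([4.0, 2.0, 3.0, 4.0]))  # ~0.22 under this model
```

The key property is that slowing the non-critical stages costs no wall-clock time, since every stage still finishes when the critical path does; the energy saved depends on how uneven the workload is and on the real power-versus-frequency curve of the hardware.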

Implications for AI Accessibility

The researchers argue that reducing AI power costs could have significant implications for equitable AI access. Chowdhury notes, "If a country doesn't have enough power to run a big model, they might need to use services from far away, or be stuck running smaller, less accurate models. This gap could further perpetuate disparity between different communities" [3].

Testing and Availability

The team has tested Perseus by training GPT-3, three other large language models, and one computer vision model. Perseus is now available as an open-source tool, as part of Zeus, which measures and optimizes AI energy consumption [1].

TheOutpost.ai


© 2025 Triveous Technologies Private Limited