2 Sources
[1]
Tech giants scramble to meet AI's looming energy crisis
The artificial intelligence industry is scrambling to reduce its massive energy consumption through better cooling systems, more efficient computer chips, and smarter programming -- all while AI usage explodes worldwide.

AI depends entirely on data centers, which could consume 3% of the world's electricity by 2030, according to the International Energy Agency. That's double what they use today. Experts at McKinsey, a US consulting firm, describe a race to build enough data centers to keep up with AI's rapid growth, while warning that the world is heading toward an electricity shortage.

"There are several ways of solving the problem," explained Mosharaf Chowdhury, a University of Michigan professor of computer science. Companies can either build more energy supply -- which takes time, and the AI giants are already scouring the globe to do so -- or figure out how to consume less energy for the same computing power. Chowdhury believes the challenge can be met with "clever" solutions at every level, from the physical hardware to the AI software itself. For example, his lab has developed algorithms that calculate exactly how much electricity each AI chip needs, reducing energy use by 20-30%.

'Clever' solutions

Twenty years ago, operating a data center -- encompassing cooling systems and other infrastructure -- required as much energy as running the servers themselves. Today, operations use just 10% of what the servers consume, says Gareth Williams from consulting firm Arup, largely thanks to this focus on energy efficiency. Many data centers now use AI-powered sensors to control temperature in specific zones rather than cooling entire buildings uniformly. This allows them to optimize water and electricity use in real time, according to McKinsey's Pankaj Sachdeva.

For many, the game-changer will be liquid cooling, which replaces the roar of energy-hungry air conditioners with a coolant that circulates directly through the servers. "All the big players are looking at it," Williams said. This matters because modern AI chips from companies like Nvidia consume 100 times more power than servers did two decades ago. Amazon's world-leading cloud computing business, AWS, last week said it had developed its own liquid-cooling method for the Nvidia GPUs in its servers -- avoiding having to rebuild existing data centers. "There simply wouldn't be enough liquid-cooling capacity to support our scale," Dave Brown, vice president of compute and machine learning services at AWS, said in a YouTube video.

US vs. China

For McKinsey's Sachdeva, a reassuring factor is that each new generation of computer chips is more energy-efficient than the last. Research by Purdue University's Yi Ding has shown that AI chips can last longer without losing performance. "But it's hard to convince semiconductor companies to make less money" by encouraging customers to keep using the same equipment longer, Ding added. Yet even though greater chip and energy efficiency is likely to make AI cheaper, it won't reduce total energy consumption. "Energy consumption will keep rising," Ding predicted, despite all efforts to limit it. "But maybe not as quickly."

In the United States, energy is now seen as key to keeping the country's competitive edge over China in AI. In January, Chinese startup DeepSeek unveiled an AI model that performed as well as top US systems despite using less powerful chips -- and, by extension, less energy. DeepSeek's engineers achieved this by programming their GPUs more precisely and by skipping an energy-intensive training step that was previously considered essential. China is also feared to be leagues ahead of the US in available energy sources, including renewables and nuclear.
[2]
Tech giants scramble to meet AI's looming energy crisis
New York (AFP) - The artificial intelligence industry is scrambling to reduce its massive energy consumption through better cooling systems, more efficient computer chips, and smarter programming -- all while AI usage explodes worldwide.
As AI usage skyrockets, the tech industry is urgently seeking solutions to reduce the massive energy consumption of data centers, which are projected to double their electricity usage by 2030.
The artificial intelligence industry is facing a significant challenge as it grapples with its escalating energy consumption. According to the International Energy Agency, data centers, which are crucial for AI operations, could consume 3% of the world's electricity by 2030, doubling their current usage [1][2]. This surge in energy demand has prompted tech giants to scramble for solutions to mitigate the looming energy crisis.
Source: Tech Xplore
Tech companies are exploring various strategies to address this issue:
Improved Cooling Systems: Many data centers now employ AI-powered sensors to control temperature in specific zones, optimizing water and electricity use in real time [1][2].
Liquid Cooling: This game-changing technology replaces energy-hungry air conditioners with coolant that circulates directly through servers. Amazon's AWS has developed its own liquid method to cool Nvidia GPUs in its servers [1][2].
Efficient Computer Chips: Each new generation of computer chips is becoming more energy-efficient. Research by Purdue University's Yi Ding has shown that AI chips can last longer without losing performance [1][2].
Smarter Programming: Mosharaf Chowdhury, a University of Michigan professor, believes that "clever" solutions at every level, from hardware to software, can address the energy challenge. His lab has developed algorithms that reduce energy use by 20-30% by calculating the exact electricity needs of each AI chip [1][2].
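The last idea, matching a chip's power limit to what its workload actually needs, can be sketched in a few lines. The toy example below is not Chowdhury's actual algorithm, and all the numbers in it are made up: given measured throughput at several candidate GPU power caps, it picks the lowest cap that keeps performance within a small tolerance of the peak.

```python
# Toy sketch of GPU power capping: pick the lowest power limit whose
# measured throughput stays within `tolerance` of the best observed.
# Profile numbers are hypothetical, for illustration only.

def pick_power_cap(profile, tolerance=0.05):
    """profile: dict mapping power cap (watts) -> measured throughput
    (e.g. training samples/sec). Returns the smallest cap whose
    throughput is at least (1 - tolerance) times the best throughput."""
    best = max(profile.values())
    acceptable = [cap for cap, tput in profile.items()
                  if tput >= (1 - tolerance) * best]
    return min(acceptable)

# Hypothetical measurements for one accelerator: throughput flattens
# out well below the default 700 W limit.
profile = {400: 88.0, 500: 96.5, 600: 99.0, 700: 100.0}
cap = pick_power_cap(profile)
print(cap)  # 500 -- within 5% of peak throughput at a ~29% lower power cap
```

In practice such a profile would be measured per chip and per workload, and the chosen limit applied through the GPU's power-management interface; the savings in this made-up profile land in the same 20-30% range the article cites.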
The quest for energy efficiency in AI is not just about environmental concerns; it's also a matter of global competitiveness. In the United States, energy is now seen as key to maintaining the country's edge over China in AI development [1][2].
Chinese startup DeepSeek recently unveiled an AI model that performs as well as top US systems while using less powerful chips and, consequently, less energy. This was achieved through precise GPU programming and by skipping an energy-intensive training step [1][2].
Despite efforts to limit energy consumption, experts predict that overall energy use will continue to rise, albeit potentially at a slower rate. The challenge lies in balancing the rapid growth of AI with sustainable energy practices [1][2].
McKinsey experts warn of a potential global electricity shortage as the race to build data centers intensifies. This situation underscores the need for innovative solutions and strategic planning in the AI industry [1][2].
As the competition between the US and China heats up, concerns have been raised about China's potential advantage in available energy sources, including renewables and nuclear power [1][2].
The AI energy crisis presents both challenges and opportunities for the tech industry. As companies strive to reduce their energy footprint, we may see significant advancements in energy-efficient technologies that could have far-reaching implications beyond the AI sector.
Summarized by Navi