5 Sources
[1]
The biggest challenge for AI may be keeping the lights on
Artificial intelligence is already reshaping industries like finance and customer service, but Silicon Valley is setting its sights on something even bigger: superintelligence. This next evolution of AI, where companies aim to surpass the cognitive abilities of all humans combined, has not been realized. But it's still attracting billions of dollars in investment -- and stoking a growing concern: energy scarcity. In a new episode of the Moonshots podcast, former Google CEO Eric Schmidt said the real bottleneck to achieving artificial superintelligence isn't computing power or funding. It's electricity. "AI's natural limit is electricity, not chips," Schmidt said. "The U.S. is currently expected to need another 92 gigawatts of power to support the AI revolution." That's the equivalent of building roughly 92 new nuclear power stations, a tall order in a country that's only built two in the last three decades. The warning comes as tech giants like OpenAI, Meta, and Microsoft race to build AI experts in fields such as law, medicine, engineering, and research. Schmidt predicts this could happen within five years. The stakes are massive. As Wall Street piles into AI, drawn by its promise to automate tasks, boost productivity, and unlock new discoveries, superintelligence is seen as the ultimate prize. And the competition to reach it is fierce. Companies are now battling over top AI talent and securing massive energy contracts to stay ahead. Microsoft, for example, has already signed a 20-year power purchase agreement with Constellation Energy to restart Three Mile Island, a nuclear plant shuttered in 2019, with a target relaunch in 2028. And its latest environmental report shows another cost of current AI use: a 34% jump in water consumption to cool servers and keep data centers running, totaling 1.7 billion gallons in a single year. 
By 2027, researchers estimate AI workloads could consume up to 6.6 billion cubic meters of water, enough to supply all of Canada for over a year. Even Sam Altman, CEO of OpenAI, has acknowledged the energy challenge. "An energy breakthrough is essential for AI's future," he said last year. Altman has personally invested in Helion, a nuclear fusion startup aiming to build a pilot plant by 2028. Lawmakers are taking notice. In May, Microsoft and AMD urged Congress to fast-track permits for new energy projects to avoid overwhelming the U.S. power grid. Still, the environmental toll is raising alarms among climate groups. Greenpeace has warned that, without serious planning, AI's growth could derail national and global climate goals -- which most nations are already failing to meet. "We don't know what AI will deliver, and we certainly don't know what superintelligence will bring," Schmidt said in a LinkedIn post promoting the podcast, "but we know that it is coming fast. We need to plan ahead to ensure we have the energy needed to meet the many opportunities and challenges that AI puts before us." In other words: It's not enough to build the brains. We'll need to power them, too.
[2]
Tech giants scramble to meet AI's looming energy crisis
The artificial intelligence industry is scrambling to reduce its massive energy consumption through better cooling systems, more efficient computer chips, and smarter programming -- all while AI usage explodes worldwide. AI depends entirely on data centers, which could consume 3% of the world's electricity by 2030, according to the International Energy Agency. That's double what they use today. Experts at McKinsey, a US consulting firm, describe a race to build enough data centers to keep up with AI's rapid growth, while warning that the world is heading toward an electricity shortage. "There are several ways of solving the problem," explained Mosharaf Chowdhury, a University of Michigan professor of computer science. Companies can either build more energy supply -- which takes time, and which the AI giants are already scouring the globe to do -- or figure out how to consume less energy for the same computing power. Chowdhury believes the challenge can be met with "clever" solutions at every level, from the physical hardware to the AI software itself. For example, his lab has developed algorithms that calculate exactly how much electricity each AI chip needs, reducing energy use by 20-30%.

'Clever' solutions

Twenty years ago, operating a data center -- encompassing cooling systems and other infrastructure -- required as much energy as running the servers themselves. Today, operations use just 10% of what the servers consume, says Gareth Williams from consulting firm Arup, largely thanks to this focus on energy efficiency. Many data centers now use AI-powered sensors to control temperature in specific zones rather than cooling entire buildings uniformly. This allows them to optimize water and electricity use in real time, according to McKinsey's Pankaj Sachdeva. For many, the game-changer will be liquid cooling, which replaces the roar of energy-hungry air conditioners with a coolant that circulates directly through the servers. "All the big players are looking at it," Williams said. This matters because modern AI chips from companies like Nvidia consume 100 times more power than servers did two decades ago. Amazon's world-leading cloud computing business, AWS, last week said it had developed its own liquid method to cool down Nvidia GPUs in its servers -- avoiding having to rebuild existing data centers. "There simply wouldn't be enough liquid-cooling capacity to support our scale," Dave Brown, vice president of compute and machine learning services at AWS, said in a YouTube video.

US vs. China

For McKinsey's Sachdeva, a reassuring factor is that each new generation of computer chips is more energy-efficient than the last. Research by Purdue University's Yi Ding has shown that AI chips can last longer without losing performance. "But it's hard to convince semiconductor companies to make less money" by encouraging customers to keep using the same equipment longer, Ding added. Yet even if greater efficiency in chips makes AI cheaper, it won't reduce total energy consumption. "Energy consumption will keep rising," Ding predicted, despite all efforts to limit it. "But maybe not as quickly." In the United States, energy is now seen as key to keeping the country's competitive edge over China in AI. In January, Chinese startup DeepSeek unveiled an AI model that performed as well as top US systems despite using less powerful chips -- and by extension, less energy. DeepSeek's engineers achieved this by programming their GPUs more precisely and skipping an energy-intensive training step that was previously considered essential. China is also feared to be leagues ahead of the US in available energy sources, including from renewables and nuclear.
[3]
Ex-Google CEO: 'AI's natural limit is electricity, not chips'
It seems every company under the sun these days is leveraging, investing in, or using AI in some way or another. The value of artificial intelligence -- automating repetitive tasks, boosting efficiency, and solving extremely complex problems -- has Wall Street salivating. But it's superintelligence, not AI, that has Silicon Valley atwitter -- and it's why some of the biggest companies, including Mark Zuckerberg's Meta and Sam Altman's OpenAI, are warring over AI talent. All the dominant tech players want to be the first to build intelligence that "greatly exceeds the cognitive performance of humans in virtually all domains of interest," according to University of Oxford researcher Nick Bostrom's book, Superintelligence: Paths, Dangers, Strategies. "Superintelligence is intelligence beyond the sum of all humans," Eric Schmidt, former CEO and chairman of Google, wrote in a LinkedIn post Thursday. "It is reasonable to predict that we are going to have specialized AI savants in every field within five years. Now imagine their capabilities and how they will change society and our day-to-day lives." Schmidt, speaking with Peter Diamandis and Dave Blundin in a new episode of their Moonshots podcast published Thursday, addressed the biggest limiting factor. Hint: It's not money -- and it's not semiconductors, either. "AI's natural limit is electricity, not chips," Schmidt said. "The U.S. is currently expected to need another 92 gigawatts of power to support the AI revolution. For reference, one gigawatt is roughly the equivalent of one nuclear power station. Right now, there are essentially none of these facilities being built, and in the last 30 years, only two have been constructed," he added. Silicon Valley giants are working to resurrect and retrofit old power plants to help power their AI needs.
Microsoft, for one, struck a 20-year power purchase agreement with Constellation Energy to restart Three Mile Island, which closed in 2019, targeting a relaunch in 2028. But even now, Microsoft is using a ton of resources for AI: In its latest environmental report, the Windows maker said it increased its water use between 2021 and 2022 by 34%, to around 1.7 billion gallons, which outside experts largely tied to AI. And researchers believe global AI workloads may use 4.2 to 6.6 billion cubic meters of water by 2027 -- enough to fill anywhere from 1.7 to 2.6 million Olympic-sized swimming pools. Put another way, that's enough water to supply the entire population of Canada for more than a year. OpenAI CEO Sam Altman said last year an energy breakthrough "is essential for AI's future." (Altman, for what it's worth, has personally invested in Helion, a startup working on nuclear fusion, and backed its 2028 pilot plant.) In May, companies like Microsoft and AMD urged U.S. senators to fast-track permits to avoid overwhelming the grid under AI's heavy energy demands. Critics like Greenpeace say at the current rate, AI usage risks derailing national and global climate goals. "We don't know what AI will deliver, and we certainly don't know what superintelligence will bring, but we know that it is coming fast," Schmidt said. "We need to plan ahead to ensure we have the energy needed to meet the many opportunities and challenges that AI puts before us." Schmidt's full conversation with Diamandis and Blundin explores what artificial superintelligence might actually look like.
[4]
Tech giants scramble to meet AI's looming energy crisis
New York (AFP) - The artificial intelligence industry is scrambling to reduce its massive energy consumption through better cooling systems, more efficient computer chips, and smarter programming -- all while AI usage explodes worldwide. AI depends entirely on data centers, which could consume three percent of the world's electricity by 2030, according to the International Energy Agency. That's double what they use today. Experts at McKinsey, a US consulting firm, describe a race to build enough data centers to keep up with AI's rapid growth, while warning that the world is heading toward an electricity shortage. "There are several ways of solving the problem," explained Mosharaf Chowdhury, a University of Michigan professor of computer science. Companies can either build more energy supply -- which takes time, and which the AI giants are already scouring the globe to do -- or figure out how to consume less energy for the same computing power. Chowdhury believes the challenge can be met with "clever" solutions at every level, from the physical hardware to the AI software itself. For example, his lab has developed algorithms that calculate exactly how much electricity each AI chip needs, reducing energy use by 20-30 percent.

'Clever' solutions

Twenty years ago, operating a data center -- encompassing cooling systems and other infrastructure -- required as much energy as running the servers themselves. Today, operations use just 10 percent of what the servers consume, says Gareth Williams from consulting firm Arup, largely thanks to this focus on energy efficiency. Many data centers now use AI-powered sensors to control temperature in specific zones rather than cooling entire buildings uniformly. This allows them to optimize water and electricity use in real time, according to McKinsey's Pankaj Sachdeva. For many, the game-changer will be liquid cooling, which replaces the roar of energy-hungry air conditioners with a coolant that circulates directly through the servers. "All the big players are looking at it," Williams said. This matters because modern AI chips from companies like Nvidia consume 100 times more power than servers did two decades ago. Amazon's world-leading cloud computing business, AWS, last week said it had developed its own liquid method to cool down Nvidia GPUs in its servers -- avoiding having to rebuild existing data centers. "There simply wouldn't be enough liquid-cooling capacity to support our scale," Dave Brown, vice president of compute and machine learning services at AWS, said in a YouTube video.

US vs China

For McKinsey's Sachdeva, a reassuring factor is that each new generation of computer chips is more energy-efficient than the last. Research by Purdue University's Yi Ding has shown that AI chips can last longer without losing performance. "But it's hard to convince semiconductor companies to make less money" by encouraging customers to keep using the same equipment longer, Ding added. Yet even if greater efficiency in chips makes AI cheaper, it won't reduce total energy consumption. "Energy consumption will keep rising," Ding predicted, despite all efforts to limit it. "But maybe not as quickly." In the United States, energy is now seen as key to keeping the country's competitive edge over China in AI. In January, Chinese startup DeepSeek unveiled an AI model that performed as well as top US systems despite using less powerful chips -- and by extension, less energy. DeepSeek's engineers achieved this by programming their GPUs more precisely and skipping an energy-intensive training step that was previously considered essential. China is also feared to be leagues ahead of the US in available energy sources, including from renewables and nuclear.
[5]
Former Google CEO Eric Schmidt warns of AI superintelligence outpacing Earth's energy limits: 'Chips will outrun power needs'
AI's future may be limited not by chips, but by the power to run them. Eric Schmidt highlights how data centers fueling AI models are consuming record amounts of water and electricity, risking an environmental crisis. As big tech races toward superintelligence, the looming question is whether our energy grid can handle the load. As the world marvels at the rapid evolution of artificial intelligence -- writing code, diagnosing illnesses, even composing symphonies -- an unexpected crisis is taking shape behind the scenes. The real limit to AI's growth, it turns out, may not be algorithms or microchips but something far more elemental: electricity. In a striking episode of the Moonshots podcast, former Google CEO Eric Schmidt offered a sobering assessment of the future of AI. "AI's natural limit is electricity, not chips," he declared. Schmidt, who now chairs the Special Competitive Studies Project, a pro-AI think tank, explained that the U.S. alone may need an additional 92 gigawatts of power to sustain its AI ambitions -- a demand equivalent to building 92 nuclear power plants. For perspective, only two such plants have been constructed in the U.S. over the past three decades. As companies like OpenAI, Microsoft, Meta, and Google sprint toward artificial general intelligence (AGI) -- machines with reasoning capabilities that rival or surpass human intelligence -- their growing appetite for energy is becoming impossible to ignore. "We need energy in all forms... and we need it quickly," Schmidt emphasized during a recent testimony before Congress. This is not just a theoretical concern. Microsoft has already signed a 20-year nuclear power deal to revive the shuttered Three Mile Island facility, while Sam Altman of OpenAI has invested heavily in Helion, a fusion energy startup. Meanwhile, tech companies are snapping up water rights and power contracts in a desperate bid to keep their servers cool and their models humming. 
In fact, according to Quartz, Microsoft's 2023 environmental report revealed a 34% spike in water use, totaling 1.7 billion gallons -- just to cool its AI-driven data centers. By 2027, AI workloads could require enough water to serve all of Canada for a year, according to researchers. This surge in energy and resource consumption is igniting broader fears. Environmental groups like Greenpeace warn that AI's unchecked growth could derail national and international climate goals. And yet, the lure of "superintelligence" -- AI so advanced it could transform medicine, law, defense, and scientific research -- is too great for companies and investors to resist. "We don't know what AGI or superintelligence will ultimately deliver," Schmidt admitted, "but we know it's coming. And we must plan now to make sure we have the energy infrastructure to support it." The tension is real. On one hand, AI promises to solve global challenges. On the other, its development could strain -- and possibly break -- the very systems it aims to improve. The irony is poignant: machines designed to think like humans may one day need more power than humanity can afford to give. AI has long been portrayed as the brain of the future. But Eric Schmidt's warning makes it clear: without electricity, there's no intelligence -- artificial or otherwise. As society edges closer to superintelligence, perhaps the more pressing question isn't how smart our machines will become, but whether we'll have enough power to keep them running.
As tech giants rush towards artificial superintelligence, experts warn that electricity, not computing power, may be the limiting factor. The AI industry's massive energy consumption is raising concerns about environmental impact and grid capacity.
As the artificial intelligence (AI) industry races towards superintelligence, a new challenge is emerging that could potentially limit its growth: electricity. Former Google CEO Eric Schmidt has warned that "AI's natural limit is electricity, not chips," highlighting the massive energy requirements of advanced AI systems [1].
Source: Economic Times
The United States alone is expected to need an additional 92 gigawatts of power to support the AI revolution, equivalent to building 92 new nuclear power stations. This presents a significant challenge, given that only two such facilities have been constructed in the last three decades [2].
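That equivalence is easy to sanity-check. A back-of-the-envelope sketch in Python, using Schmidt's rule of thumb of roughly one gigawatt per nuclear station; the average household draw of about 1.2 kW is an illustrative assumption, not a figure from the article:

```python
# Back-of-the-envelope check of the 92 GW figure.
# Assumptions: one nuclear station ~ 1 GW (Schmidt's rule of thumb);
# average US household draw ~1.2 kW (illustrative ballpark only).

extra_demand_gw = 92          # projected additional US demand for AI
gw_per_station = 1.0          # "one gigawatt is roughly ... one nuclear power station"

stations_needed = extra_demand_gw / gw_per_station
print(f"Equivalent nuclear stations: {stations_needed:.0f}")   # 92

avg_household_kw = 1.2        # assumed average household draw
households = extra_demand_gw * 1e6 / avg_household_kw          # 1 GW = 1e6 kW
print(f"Comparable to ~{households / 1e6:.0f} million homes' average draw")  # ~77 million
```

In other words, the projected AI demand alone is on the order of the average draw of tens of millions of homes, under these assumed figures.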
AI's dependence on data centers is a key factor in its energy consumption. According to the International Energy Agency, data centers could consume 3% of the world's electricity by 2030, double their current usage [3].
Tech giants are already taking steps to secure massive energy contracts. Microsoft, for instance, has signed a 20-year power purchase agreement with Constellation Energy to restart the Three Mile Island nuclear plant, which was shuttered in 2019 [4].
The environmental impact of AI extends beyond electricity consumption. Microsoft's latest environmental report revealed a 34% increase in water consumption, totaling 1.7 billion gallons in a single year, primarily used for cooling servers and maintaining data centers [5].
Researchers estimate that by 2027, AI workloads could consume up to 6.6 billion cubic meters of water, enough to supply all of Canada for over a year. This massive water usage is raising alarms among climate groups, who warn that AI's growth could derail national and global climate goals [1].
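Those volumes are easier to grasp with a quick unit conversion. A rough sketch, assuming Canada's roughly 40 million residents use about 330 litres per person per day (an illustrative residential figure; the 2,500 cubic-metre Olympic pool matches the comparison the sources themselves make):

```python
# Rough conversion of projected 2027 AI water use into familiar units.
# Assumptions: Canada ~40M people at ~330 L/person/day (illustrative);
# an Olympic-sized pool holds ~2,500 cubic meters.

ai_water_m3 = 6.6e9           # upper researcher estimate for 2027 AI workloads

canada_pop = 40e6
litres_per_person_day = 330
canada_annual_m3 = canada_pop * litres_per_person_day * 365 / 1000   # L -> m3
print(f"Canada's annual residential use: ~{canada_annual_m3 / 1e9:.1f} billion m3")  # ~4.8
print(f"Years of Canadian supply: {ai_water_m3 / canada_annual_m3:.1f}")             # ~1.4

pool_m3 = 2500
print(f"Olympic pools: ~{ai_water_m3 / pool_m3 / 1e6:.1f} million")                  # ~2.6
```

Under these assumed per-capita figures, the 6.6 billion cubic meters works out to well over a year of Canadian residential supply, consistent with the sources' "more than a year" framing.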
Source: Tech Xplore
The tech industry is scrambling to address these energy challenges through various means:
Improved Cooling Systems: Many data centers now use AI-powered sensors to control temperature in specific zones, optimizing water and electricity use in real time [3].
Liquid Cooling: Companies are exploring liquid cooling methods to replace energy-hungry air conditioners. Amazon's AWS has developed its own liquid cooling system for Nvidia GPUs in its servers [2].
Energy-Efficient Chips: Each new generation of computer chips is becoming more energy-efficient, potentially helping to mitigate some of the energy concerns [3].
Smarter Programming: Researchers are developing algorithms that can calculate exactly how much electricity each AI chip needs, potentially reducing energy use by 20-30% [2].
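The intuition behind that last item can be shown with a toy model. This sketch is not the Michigan lab's actual algorithm; it assumes a hypothetical saturating throughput curve, under which chip performance flattens out near full power, so a lower power cap trades a modest slowdown for a large energy cut:

```python
import math

# Toy illustration (assumed curves, not the Michigan lab's actual system)
# of why tuning a chip's power cap saves energy: throughput saturates at
# high power, so total energy (power x time) drops well below full power.

FULL_POWER_W = 400.0
BASE_HOURS = 10.0   # hypothetical training time at full power

def throughput(power_w: float) -> float:
    """Assumed saturating throughput curve (relative units)."""
    return 1.0 - math.exp(-power_w / 200.0)

def train_hours(power_w: float) -> float:
    return BASE_HOURS * throughput(FULL_POWER_W) / throughput(power_w)

def energy_kwh(power_w: float) -> float:
    return power_w * train_hours(power_w) / 1000.0

# Pick the cap minimizing the energy-delay product (energy x time),
# a common way to balance savings against slowdown.
caps = range(150, 401, 25)
best = min(caps, key=lambda p: energy_kwh(p) * train_hours(p))
saving = 1.0 - energy_kwh(best) / energy_kwh(FULL_POWER_W)
slowdown = train_hours(best) / BASE_HOURS - 1.0
print(f"cap={best} W  energy saving={saving:.0%}  slowdown={slowdown:.0%}")
# With these assumed curves: cap=250 W, ~24% less energy, ~21% slower.
```

The roughly 24% saving this toy model finds lands inside the 20-30% range the researchers report, though the curve here is purely illustrative.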
Despite these challenges, the race towards superintelligence continues. Schmidt predicts that specialized AI experts in various fields could emerge within five years [4]. Companies like OpenAI, Meta, and Microsoft are competing fiercely to be the first to achieve this milestone.
However, as Sam Altman, CEO of OpenAI, acknowledged, "An energy breakthrough is essential for AI's future." Altman has personally invested in Helion, a nuclear fusion startup aiming to build a pilot plant by 2028 [1].
Source: Quartz
The energy challenge is also becoming a factor in global AI competition. In the United States, energy is now seen as key to maintaining a competitive edge over China in AI development. China is reportedly ahead in available energy sources, including renewables and nuclear [3].
In response to these challenges, tech companies are urging policymakers to act. In May, Microsoft and AMD called on Congress to fast-track permits for new energy projects to avoid overwhelming the U.S. power grid [1].
As the AI industry continues its rapid growth, balancing the pursuit of superintelligence with sustainable energy practices will be crucial. The coming years will likely see increased focus on energy innovations and policy measures to address this looming challenge.
Summarized by Navi