4 Sources
[1]
Confronting the AI/energy conundrum
Caption: (From left to right:) Moderator Elsa Olivetti of MIT explores opportunities to reduce data center demand with panelists Dustin Demetriou of IBM, Emma Strubell of Carnegie Mellon University, and Vijay Gadepally of the MIT Lincoln Laboratory Supercomputing Center.

The explosive growth of AI-powered computing centers is creating an unprecedented surge in electricity demand that threatens to overwhelm power grids and derail climate goals. At the same time, artificial intelligence technologies could revolutionize energy systems, accelerating the transition to clean power.

"We're at a cusp of potentially gigantic change throughout the economy," said William H. Green, director of the MIT Energy Initiative (MITEI) and Hoyt C. Hottel Professor in the MIT Department of Chemical Engineering, at MITEI's Spring Symposium, "AI and energy: Peril and promise," held on May 13. The event brought together experts from industry, academia, and government to explore solutions to what Green described as both "local problems with electric supply and meeting our clean energy targets" while seeking to "reap the benefits of AI without some of the harms." The challenge of data center energy demand and the potential benefits of AI for the energy transition are research priorities for MITEI.

AI's startling energy demands

From the start, the symposium highlighted sobering statistics about AI's appetite for electricity. After decades of flat electricity demand in the United States, computing centers now consume approximately 4 percent of the nation's electricity. Although there is great uncertainty, some projections suggest this demand could rise to 12-15 percent by 2030, largely driven by artificial intelligence applications.

Vijay Gadepally, senior scientist at MIT's Lincoln Laboratory, emphasized the scale of AI's consumption. "The power required for sustaining some of these large models is doubling almost every three months," he noted.
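Gadepally's doubling figure compounds quickly. As a back-of-the-envelope illustration (assuming a clean three-month doubling period, which real demand will not follow indefinitely):

```python
# Illustrative only: if power draw doubles every 3 months, project the
# implied growth multiplier over a given horizon (idealized compounding).

def growth_multiplier(years: float, doubling_period_months: float = 3.0) -> float:
    """Return the growth factor after `years` given a fixed doubling period."""
    doublings = (years * 12.0) / doubling_period_months
    return 2.0 ** doublings

# Four doublings per year implies a 16x increase annually under this trend.
print(growth_multiplier(1.0))  # 16.0
print(growth_multiplier(0.5))  # 4.0
```

Even if the true rate is far slower, any fixed doubling period yields exponential growth, which is why flat-demand planning assumptions break down.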
"A single ChatGPT conversation uses as much electricity as charging your phone, and generating an image consumes about a bottle of water for cooling." Facilities requiring 50 to 100 megawatts of power are emerging rapidly across the United States and globally, driven both by casual and institutional research needs relying on large language programs such as ChatGPT and Gemini. Gadepally cited congressional testimony by Sam Altman, CEO of OpenAI, highlighting how fundamental this relationship has become: "The cost of intelligence, the cost of AI, will converge to the cost of energy." "The energy demands of AI are a significant challenge, but we also have an opportunity to harness these vast computational capabilities to contribute to climate change solutions," said Evelyn Wang, MIT vice president for energy and climate and the former director at the Advanced Research Projects Agency-Energy (ARPA-E) at the U.S. Department of Energy. Wang also noted that innovations developed for AI and data centers -- such as efficiency, cooling technologies, and clean-power solutions -- could have broad applications beyond computing facilities themselves. Strategies for clean energy solutions The symposium explored multiple pathways to address the AI-energy challenge. Some panelists presented models suggesting that while artificial intelligence may increase emissions in the short term, its optimization capabilities could enable substantial emissions reductions after 2030 through more efficient power systems and accelerated clean technology development. Research shows regional variations in the cost of powering computing centers with clean electricity, according to Emre Gençer, co-founder and CEO of Sesame Sustainability and former MITEI principal research scientist. Gençer's analysis revealed that the central United States offers considerably lower costs due to complementary solar and wind resources. 
However, achieving zero-emission power would require massive battery deployments -- five to 10 times more than moderate carbon scenarios -- driving costs two to three times higher. "If we want to do zero emissions with reliable power, we need technologies other than renewables and batteries, which will be too expensive," Gençer said. He pointed to "long-duration storage technologies, small modular reactors, geothermal, or hybrid approaches" as necessary complements.

Because of data center energy demand, there is renewed interest in nuclear power, noted Kathryn Biegel, manager of R&D and corporate strategy at Constellation Energy, adding that her company is restarting the reactor at the former Three Mile Island site, now called the "Crane Clean Energy Center," to meet this demand. "The data center space has become a major, major priority for Constellation," she said, emphasizing how data centers' needs for both reliability and carbon-free electricity are reshaping the power industry.

Can AI accelerate the energy transition?

Artificial intelligence could dramatically improve power systems, according to Priya Donti, assistant professor and the Silverman Family Career Development Professor in MIT's Department of Electrical Engineering and Computer Science and the Laboratory for Information and Decision Systems. She showcased how AI can accelerate power grid optimization by embedding physics-based constraints into neural networks, potentially solving complex power flow problems at "10 times, or even greater, speed compared to your traditional models."

AI is already reducing carbon emissions, according to examples shared by Antonia Gawel, global director of sustainability and partnerships at Google. Google Maps' fuel-efficient routing feature has "helped to prevent more than 2.9 million metric tons of GHG [greenhouse gas] emissions since launch, which is the equivalent of taking 650,000 fuel-based cars off the road for a year," she said.
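The physics-embedding idea Donti describes can be illustrated in miniature. The sketch below uses a hypothetical three-bus network and the DC power-flow approximation; real work uses full AC models and actual neural networks, but the core trick is the same: score a candidate solution by how badly it violates the grid's physical balance equations, and add that score to the training loss.

```python
# Toy DC power flow: injection_i = sum_j B[i][j] * (theta_i - theta_j).
# Network values below are invented for illustration.
B = [[0.0, 5.0, 2.5],
     [5.0, 0.0, 5.0],
     [2.5, 5.0, 0.0]]           # line susceptances (per-unit, symmetric)
injections = [1.0, -0.6, -0.4]  # net power at each bus (sums to zero)

def physics_residual(theta):
    """Per-bus violation of DC power balance for candidate bus angles."""
    res = []
    for i in range(len(theta)):
        flow = sum(B[i][j] * (theta[i] - theta[j]) for j in range(len(theta)))
        res.append(flow - injections[i])
    return res

def penalty(theta):
    """Squared physics violation -- the term a physics-constrained model
    would add to its training loss to steer outputs toward feasibility."""
    return sum(r * r for r in physics_residual(theta))

print(penalty([0.0, 0.0, 0.0]))    # large: all-zero angles ignore the injections
print(penalty([0.14, 0.01, 0.0]))  # near zero: these angles satisfy the balance
```

A model whose outputs drive this penalty to zero respects the grid physics by construction, which is what lets learned surrogates replace slow iterative solvers.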
Another Google research project uses artificial intelligence to help pilots avoid creating contrails, which represent about 1 percent of global warming impact.

AI's potential to speed materials discovery for power applications was highlighted by Rafael Gómez-Bombarelli, the Paul M. Cook Career Development Associate Professor in the MIT Department of Materials Science and Engineering. "AI-supervised models can be trained to go from structure to property," he noted, enabling the development of materials crucial for both computing and efficiency.

Securing growth with sustainability

Throughout the symposium, participants grappled with balancing rapid AI deployment against environmental impacts. While AI training receives most attention, Dustin Demetriou, senior technical staff member in sustainability and data center innovation at IBM, quoted a World Economic Forum article suggesting that "80 percent of the environmental footprint is estimated to be due to inferencing." Demetriou emphasized the need for efficiency across all artificial intelligence applications.

Jevons' paradox, where "efficiency gains tend to increase overall resource consumption rather than decrease it," is another factor to consider, cautioned Emma Strubell, the Raj Reddy Assistant Professor in the Language Technologies Institute in the School of Computer Science at Carnegie Mellon University. Strubell advocated for viewing computing center electricity as a limited resource requiring thoughtful allocation across different applications.

Several presenters discussed novel approaches for integrating renewable sources with existing grid infrastructure, including potential hybrid solutions that combine clean installations with existing natural gas plants that already have valuable grid connections in place. These approaches could provide substantial clean capacity across the United States at reasonable costs while minimizing reliability impacts.
Navigating the AI-energy paradox

The symposium highlighted MIT's central role in developing solutions to the AI-electricity challenge. Green spoke of a new MITEI program on computing centers, power, and computation that will operate alongside the comprehensive spread of MIT Climate Project research. "We're going to try to tackle a very complicated problem all the way from the power sources through the actual algorithms that deliver value to the customers -- in a way that's going to be acceptable to all the stakeholders and really meet all the needs," Green said.

Participants in the symposium were polled about priorities for MIT's research by Randall Field, MITEI director of research. The real-time results ranked "data center and grid integration issues" as the top priority, followed by "AI for accelerated discovery of advanced materials for energy." In addition, attendees revealed that most view AI's potential regarding power as a "promise" rather than a "peril," although a considerable portion remain uncertain about the ultimate impact. When asked about priorities in power supply for computing facilities, half of the respondents selected carbon intensity as their top concern, with reliability and cost following.
[2]
OpenAI's gargantuan data center is even bigger than Elon Musk's xAI Colossus -- world's largest 300 MW AI data center in Texas could reach record 1 gigawatt scale by next year
Elon Musk's xAI made quite a splash when it built a data center with 200,000 GPUs that consumes approximately 250 MW of power. However, it appears that OpenAI has an even larger data center in Texas, which consumes 300 MW and houses hundreds of thousands of AI GPUs, details of which were not disclosed. Furthermore, the company is expanding the site, and by mid-2026 it aims to reach gigawatt scale, according to SemiAnalysis. Such gargantuan AI clusters are creating challenges for power companies, not only in power generation but also in power grid safety.

OpenAI appears to operate what is described as the world's largest single data center building, with an IT load capacity of around 300 MW and a maximum power capacity of approximately 500 MW. This facility includes 210 air-cooled substations and a massive on-site electrical substation, which further highlights its immense scale. A second identical building is already under construction on the same site as of January 2025. When completed, this expansion will bring the total capacity of the campus to around a gigawatt, a record. These developments have drawn attention from the Electric Reliability Council of Texas (ERCOT), the organization responsible for overseeing the Texas power grid, because of the unprecedented size and energy demand of such sites.

The power consumption profile of these data centers, combined with their rapid growth, presents serious challenges for energy supply companies for several reasons. First, hundreds of thousands of AI accelerators (such as Nvidia's H100 or B200) and the servers that host them consume an immense amount of power and require a huge, continuous supply of electricity, often equivalent to what a mid-sized city consumes. Supplying this kind of load forces power companies to build or upgrade substations, transmission lines, and generation capacity far faster than usual.
This stretches both financial and physical infrastructure planning, especially in regions that were not prepared for such rapid growth.

Second, the way these data centers use power is unstable. Unlike traditional factories or office buildings that draw power steadily, AI-focused data centers can swing from maximum demand to minimal usage in moments. This behavior places enormous stress on grid management, as even slight imbalances between supply and demand can cause voltage and frequency issues. Specifically, when more electricity is produced than needed, both voltage and frequency rise above their normal levels; if demand outpaces supply, they drop below standard values. Even a 10% deviation in either direction can damage electronics or trigger circuit protection. It is the grid operator's responsibility to keep these parameters within safe limits to ensure system stability. However, if several large data centers (or one giant data center, such as the one used by OpenAI) suddenly reduce their power draw, it could send shockwaves through the rest of the grid, causing other power consumers or generators to shut down and potentially triggering a chain of failures.

Third, integrating these data centers into the grid requires complex coordination with regional planning authorities, which typically conduct studies to understand the effects on transmission stability and to prevent conflicts with other grid users. However, these studies are time-consuming and often lag behind the speed at which data centers are built.

Finally, there is an economic challenge: power companies may need to spend billions to satisfy the demands of large data centers, yet the unpredictable nature of the AI industry means the return on that investment is hard to model. At the same time, if the grid is not upgraded fast enough, there is a risk of blackouts or of turning away industrial customers who cannot compete for limited grid capacity.
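The supply-demand mechanics described above can be caricatured in a few lines. This is a toy quasi-steady model with an invented aggregate stiffness constant, not a real grid simulation, but it captures the sign of the effect: surplus generation pushes frequency up, a supply shortfall pulls it down.

```python
# Toy illustration (not a grid model): frequency drifts when generation
# and load are mismatched. The stiffness constant is hypothetical.

NOMINAL_HZ = 60.0  # US grid nominal frequency

def frequency_after_step(generation_mw, load_mw, stiffness_mw_per_hz=1500.0):
    """Quasi-steady frequency after a sudden imbalance.
    stiffness_mw_per_hz stands in for the combined droop/load response."""
    imbalance_mw = generation_mw - load_mw
    return NOMINAL_HZ + imbalance_mw / stiffness_mw_per_hz

# A 300 MW data center dropping offline looks like a sudden 300 MW surplus,
# so frequency rises above nominal until generation is dialed back:
print(frequency_after_step(10_000, 9_700))
# Demand outpacing supply pulls frequency below nominal:
print(frequency_after_step(10_000, 10_300))
```

Real grids are far stiffer and have automatic controls, but the asymmetry holds: the larger and faster the load swing, the harder the operator must work to keep frequency inside its narrow safe band.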
[3]
AI's power needs could short-circuit US infrastructure
Power required by AI datacenters in the US may be more than 30 times greater in a decade, with 5 GW facilities already in the pipeline.

A Deloitte Insights report, "Can US infrastructure keep up with the AI economy?", paints a picture of ever larger datacenters burning ever more energy, while grid infrastructure and new power sources are stuck in the slow lane due to factors such as bureaucracy and supply chain disruption. Potential solutions to this problem hinge on hopes for tech innovation making AI infrastructure more energy efficient, regulatory changes, and a massive funding injection.

By now, anyone in the tech industry is likely aware that all the hype surrounding AI is driving a boom in infrastructure-building to develop and deploy ever more sophisticated AI models. Deloitte estimates that US datacenters last year drew about 33 GW of power in total, with AI facilities accounting for 4 GW of that (about one-eighth). Looking ahead to 2035, it forecasts that the power required by these facilities will be five times larger, with the AI datacenter power draw increasing more than thirtyfold to 123 GW, accounting for 70 percent of the 176 GW total.

And there won't just be more bit barns; they'll also be bigger. The largest US datacenters operated by the big three hyperscalers currently draw less than 500 MW, according to Deloitte, but some under construction now are likely to top 2 GW. There are also 50,000-acre datacenter campuses now in the early stages of the planning process that could consume as much as 5 GW, or about as much as 5 million homes (yes, we know some people dispute that kind of calculation).

In short, demand for energy is only going to go up. And much of the increased demand from datacenter growth over the past year has been met with more gas-fired electricity generation, Deloitte says, despite the much-vaunted clean energy and net-zero goals of the hyperscale operators.
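The Deloitte multiples are easy to sanity-check against each other (the figures are as quoted above; the arithmetic is ours):

```python
# Cross-checking the Deloitte figures cited in the report.
total_2024_gw, ai_2024_gw = 33.0, 4.0     # last year's US datacenter draw
total_2035_gw, ai_2035_gw = 176.0, 123.0  # 2035 forecast

print(ai_2024_gw / total_2024_gw)     # AI share today: ~0.12, about one-eighth
print(total_2035_gw / total_2024_gw)  # total grows ~5.3x ("five times larger")
print(ai_2035_gw / ai_2024_gw)        # AI draw grows ~31x ("more than thirtyfold")
print(ai_2035_gw / total_2035_gw)     # AI share in 2035: ~0.70 (70 percent)
```

The four headline claims are mutually consistent: the thirtyfold AI growth against only fivefold total growth is exactly what shifts AI from one-eighth of the load to 70 percent of it.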
The report warns there is currently a seven-year wait on some requests for connection to the grid, and power generation development typically takes longer than datacenter buildouts: the latter can be completed in a few years, while gas power plant projects without existing equipment contracts are not expected to come online until the next decade.

Supply chain issues are constraining energy companies and hyperscalers alike, as many critical components are imported and now subject to tariffs. Both industries face potential increases in the cost of steel, aluminum, copper, and cement that could impact the build-out of power infrastructure.

Deloitte surveyed datacenter operators and power companies to ask what strategies could overcome the potential energy gap caused by the bit barn build boom. The three top responses were technological innovation, regulatory changes, and more funding.

The technological innovation approach basically amounts to hoping the industry can develop more power-efficient infrastructure, such as switching to optical data transmission in server rooms or using solid-state transformers within the grid. Regulatory reform might involve changing priorities and scrapping speculative projects to streamline the process, moving grid operators to a "first-ready, first-served" approach. The UK has faced similar issues, which saw energy regulator Ofgem bring in a revised queue management system designed to expunge "zombie projects" and accelerate the connection process for viable ones.

Ultimately, it boils down to a need for more money. "Developing additive infrastructure will require massive funding across all of the industries involved," Deloitte states. It notes that the impact of AI datacenter development is already becoming more apparent in investment discussions, opening the door to new funding opportunities.
The potential risks of failing to come up with a solution are that power and grid capacity constraints could hamstring AI advancement in the US, which would be just awful, and power companies could miss an opportunity to expand and modernize the grid. "These developments could jeopardize US economic and geopolitical leadership. Staking an infrastructural lead in powering AI may now be a matter of competitiveness and even national security," Deloitte concludes. ®
[4]
AI's energy demands are surging - the grid needs to catch up
As AI models grow larger and more capable, the supporting infrastructure must evolve in tandem. AI's insatiable appetite has Big Tech going as far as restarting nuclear power plants to support massive new datacenters, which today account for as much as 2% of global electricity consumption, more than the entire country of Germany. But the humble power grid is where we need to start.

Constructing the computing superstructure to support AI tools will significantly alter the demand curve for energy and put increasing strain on electrical grids. As AI embraces more complex workloads across both training and inference, compute needs -- and thereby power consumption -- are expected to increase exponentially. Some forecasts suggest that datacenter electricity consumption could increase to as much as 12% of the global total by 2030.

Semiconductors form the cornerstone of AI computing infrastructure. The chipmaking industry has focused primarily on expanding renewable energy sources and delivering improvements in energy-efficient computing technologies. These are necessary but not sufficient: they cannot sustainably support the enormous energy requirements demanded by the growth of AI. We need to build a more resilient power grid.

In a new report, we call for a different paradigm -- sustainable energy abundance -- which will be achieved not by sacrificing growth, but by constructing a holistic energy strategy to power the next generation of computing. The report represents the work of major companies across the AI technology stack, from chip design and manufacturing to cloud service providers, as well as thought leaders from the energy and finance sectors.

The foundational pillar of this new strategy is grid decarbonization. Although not a new concept, in the AI era it requires an approach that integrates decarbonization with energy abundance, ensuring AI's productivity gains are not sidelined by grid constraints.
In practical terms, this entails embracing traditional energy sources like oil and gas while gradually transitioning toward cleaner sources such as nuclear, hydro, geothermal, solar and wind. Doing this effectively requires an understanding of the upgrades the electricity grid needs to enable rapid integration of existing and new energy sources.

Electricity consumed from the grid naturally carries the emissions profile of the grid itself. It should come as no surprise that grid-related emissions represent the single biggest component of the emissions bill facing any given company. In the conventional approach to sustainability, companies focused more on offsetting emissions derived from the grid than on sourcing the grid with cleaner (or carbon-free) energy. To support the coming scale-out of AI infrastructure, access to a clean grid will be one of the most important levers for reducing carbon footprint.

Strategically selecting locations for datacenters and semiconductor fabs will be critical. Countries and regions have varying mixes of clean energy in the power grid, which affects their carbon emission profiles. For example, the United States and France generate a similar percentage of their overall electricity from renewable sources, yet the United States has a significantly higher country emission factor, the direct carbon emission per kilowatt-hour of electricity generated. This is because most of the electricity in France is generated through nuclear power, while the United States still gets a significant percentage of its electricity from coal and natural gas. Likewise, there can be significant differences within a country such as the United States, with states like California having a higher mix of renewables than some other states.
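The siting argument reduces to simple arithmetic: the same load multiplied by different grid emission factors. The factors below are rough illustrative values, not figures from the report:

```python
# Why datacenter siting matters: identical load, very different emissions.
# Emission factors are rough, illustrative values (kg CO2 per kWh).
GRID_FACTOR_KG_PER_KWH = {
    "nuclear_heavy_grid (e.g. France)": 0.06,  # illustrative
    "mixed_fossil_grid (e.g. US avg)": 0.37,   # illustrative
}

def annual_emissions_tonnes(load_mw, factor_kg_per_kwh, hours=8760):
    """CO2 in metric tons for a constant load running all year."""
    kwh = load_mw * 1000 * hours       # MW -> kWh over the year
    return kwh * factor_kg_per_kwh / 1000  # kg -> metric tons

# A constant 300 MW facility (roughly the OpenAI building described earlier):
for grid, factor in GRID_FACTOR_KG_PER_KWH.items():
    print(grid, round(annual_emissions_tonnes(300, factor)))
```

Under these illustrative factors, the same 300 MW facility emits roughly six times more CO2 on the fossil-heavy grid, which is the entire case for treating location as a first-order sustainability decision.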
A truly resilient grid strategy must start with expanded capacity for nuclear, wind, solar, and traditional forms of energy, while driving a mix shift to cleaner sources over time. Achieving this enhanced capacity will require investment in disruptive innovations. Transmission infrastructure must be modernized, including upgraded lines, substations and control systems. Likewise, the industry must take advantage of smart distribution technologies, deploying digital sensors and AI-driven load management techniques.

Semiconductors have an important role to play. Continued growth of GPUs and other accelerators will drive corresponding growth in datacenter power semiconductors, along with increasing semiconductor content in other components such as the motherboard and the power supply. We forecast that the datacenter power semiconductor market could reach $9 billion by 2030, driven by an increase in servers as well as the number of accelerators per server. Approximately $7 billion of that opportunity is driven by accelerators, with the rest coming from the power supply and other areas. As the technology matures, we believe gallium nitride will play an important role in this market, given its high efficiency.

As the grid incorporates increasing levels of renewables, more semiconductors will be needed for energy generation. Silicon carbide will be important for solar generation, and potentially wind as well. We estimate that renewable energy generation could grow to more than a $20 billion market for semiconductors by 2030. A similar opportunity exists for smart infrastructure such as meters, sensors and heat pumps.

Restructuring the power grid offers the single biggest opportunity to deliver sustainable, abundant energy for AI. Modernizing the power grid will require complex industry partnerships and buy-in from company leadership.
In the past, sustainability initiatives were largely regarded as a compliance checkbox, with unclear ties to business results. A new playbook is needed to enable the growth of AI while shifting business incentives toward generation, transmission, distribution and storage of clean energy and modernization of the power grid.

To truly harness the transformative productivity and prosperity potential of AI, we need a comprehensive sustainability strategy that expands clean energy capacity, modernizes energy infrastructure, and maintains diverse energy generation sources to ensure stable, abundant power for continued technological innovation. When combined with progress in energy-efficient computing and abatement measures, this holistic approach can realistically accelerate the pursuit of sustainability while mitigating the risk of curtailing growth due to insufficient energy resources.
The rapid expansion of artificial intelligence (AI) is creating an unprecedented surge in electricity demand, posing significant challenges for power grids and climate goals. Experts from industry, academia, and government are grappling with the dual challenge of meeting AI's energy needs while harnessing its potential to revolutionize energy systems [1].
AI's appetite for electricity is growing at an alarming rate. In the United States, computing centers now consume approximately 4% of the nation's electricity, with projections suggesting this could rise to 12-15% by 2030 [1]. The power required to sustain large AI models is doubling almost every three months, and a single ChatGPT conversation consumes as much electricity as charging a phone [1].
The AI boom is driving the construction of increasingly large data centers. OpenAI operates what is described as the world's largest single data center building, with an IT load capacity of around 300 MW and a maximum power capacity of approximately 500 MW [2]. Even more staggering, there are 50,000-acre datacenter campuses in the early planning stages that could consume as much as 5 GW, equivalent to the power consumption of about 5 million homes [3].
The rapid growth of AI data centers is creating significant challenges for power companies and grid operators:
Unprecedented Demand: The sheer scale of power consumption by AI facilities is forcing power companies to build or upgrade infrastructure at an unprecedented pace [2].
Unstable Power Usage: AI data centers can swing from maximum demand to minimal usage in moments, placing enormous stress on grid management [2].
Planning and Coordination Challenges: Integrating these data centers into the grid requires complex coordination with regional planning authorities, which often lags behind the speed of data center construction [2].
To address the AI-energy challenge, experts are exploring multiple pathways:
Regional Variations: Research shows regional variations in the cost of powering computing centers with clean electricity, with the central United States offering lower costs due to complementary solar and wind resources [1].
Nuclear Power: There is renewed interest in nuclear power, with companies like Constellation Energy restarting reactors to meet data center demand [1].
Grid Modernization: Experts call for a holistic energy strategy, including grid decarbonization, a managed transition from traditional energy sources to cleaner alternatives, and upgraded transmission infrastructure [4].
While AI poses challenges for energy systems, it also offers potential solutions:
Grid Optimization: AI can accelerate power grid optimization, potentially solving complex power flow problems significantly faster than traditional models [1].
Emissions Reduction: AI applications such as Google Maps' fuel-efficient routing feature are already contributing to carbon emissions reductions [1].
Semiconductor Advancements: The growth of AI is driving innovations in semiconductor technology that could lead to more energy-efficient computing and power management solutions [4].
As the AI revolution continues to unfold, balancing its energy demands with sustainable practices and grid resilience will be crucial for maintaining technological progress while meeting climate goals.