5 Sources
[1]
AI experts warn that China is miles ahead of the US in electricity generation -- lack of supply and infrastructure threatens the US's long-term AI plans
AI data centers demand gigawatts of power that the U.S. grid is struggling to supply. A U.S. analyst of Chinese technology said that China has already solved its energy problem -- at least in terms of power for its AI infrastructure. Rui Ma, founder of Tech Buzz China, posted on X that the country's massive investments in advanced hydropower and nuclear technologies meant that its "electricity supply is secure and inexpensive." This is in contrast to the U.S., where many AI data centers are straining the electricity grid, resulting in supply shortfalls and price increases for every user. Washington and Beijing are locked in an AI race, with the two powers vying for the lead in the technology. Because of this, both rivals are diving into a massive build-out of AI data centers that require enormous amounts of electricity to run. In the U.S., it has come to the point that tech giants are building their own power plants -- with Elon Musk importing one to power his data centers, and companies like Microsoft, Google, Amazon, Oracle, and Nvidia investing in the research and development of nuclear reactors. However, it seems that this is not a problem for China. According to Fortune, the East Asian country maintains an 80% to 100% power reserve, allowing it to absorb the massive demand brought about by the hundreds of data centers it has built in recent years. More than that, it is also continually expanding its output, with one expert telling the publication that the country "adds more electricity demand than the entire annual consumption of Germany, every single year." Some argue that the power is delivered by heavily polluting coal plants, but China is also investing massively in renewable energy projects. And if power demand outstrips supply, it can easily reactivate coal plants to cover the shortfall. In fact, the new data centers are welcomed, as they help stimulate demand in a market with an excess of power production. Electricity oversupply doesn't seem to be an immediate concern either, as most of China's power plants are state-owned. Beijing also plans its energy production well in advance, allowing it to prepare for prospective demand like the AI data center boom. This still does not address the elephant in the room, though: the fact that many Chinese data centers sit idle or underutilized. Beijing is developing a network to create a marketplace for selling surplus computing capacity, but the effort still faces challenges, especially with latency and differing ecosystems. On the other hand, the U.S. faces major hurdles with its electricity supply. Meta founder Mark Zuckerberg said that power constraints will limit AI growth and that new power plants aren't being built fast enough to satisfy AI's insatiable demand. If the U.S. does not address this issue soon, it risks lagging behind China even if it has more powerful and efficient hardware. That's because China can simply throw vast amounts of power at the problem to gain the upper hand in the AI race through sheer brute force, similar to how Huawei's CloudMatrix cluster beats the performance of Nvidia's GB200.
[2]
The US May Have Already Lost the AI Race to China Due to a Key Weakness
You might have heard of an "AI race" heating up between the US and China, a bitter rivalry between two global adversaries that could shape the direction of world history. At least, that's how some in the US feel. While China has repeatedly tried to establish a geopolitical friendship with the richest nation in the world, officials and pundits in the US have doubled down, reframing artificial intelligence as the 21st century's nuclear bomb. In the meantime, China may have gotten a massive lead -- by actively investing in its power grid, while the United States' grid is quickly running out of capacity to power immensely power-hungry AI models. As Fortune reports, Americans who've had a look at China's technological development firsthand found that the two countries aren't even in the same league, given China's next-level power grid. "Energy is considered a solved problem," wrote Rui Ma, editor of the US publication Tech Buzz China. "Everywhere we went, people treated energy availability as a given," she continued. "This is a stark contrast to the US, where AI growth is increasingly tied to debates over data center power consumption and grid limitations." AI is a notoriously energy-intensive technology. The data centers powering large language models like ChatGPT are immense labyrinths of computer chips, which suck down resources like power and water in order to keep up with demand. As Fortune notes, this effectively makes electricity the key bottleneck for expanding AI infrastructure. That's caused some critical shortages in the US. Short on energy and hopped up on fantasies of an arms race, American companies are resorting to all kinds of bizarre strategies to get their juice. Elon Musk's xAI, for example, is running 35 portable methane gas generators in the parking lot of one of its main datacenters in Memphis, encircling nearby communities in a cloud of noxious smog. China has no such problems. In 2024, China was responsible for nearly 65 percent of the world's renewable energy construction, installing so many solar panels and wind turbines that it caused the country's CO2 emissions to drop for the first time -- despite record-high demand for energy. Whether or not the US can catch up remains to be seen. President Donald Trump previously made an off-the-cuff remark about attaching coal power plants to data centers directly. It's an unfortunate conundrum in an age when energy demand in the US has never been higher. In the meantime, China keeps chugging along, seemingly unperturbed by any energy bottlenecks -- or the Trump administration's posturing.
[3]
AI experts return from China stunned: The U.S. grid is so weak, the race may already be over
"Everywhere we went, people treated energy availability as a given," Rui Ma wrote on X after returning from a recent tour of China's AI hubs. For American AI researchers, that's almost unimaginable. In the U.S., surging AI demand is colliding with a fragile power grid, the kind of extreme bottleneck that Goldman Sachs warns could severely choke the industry's growth. In China, Ma continued, it's considered a "solved problem." Ma, a renowned expert in Chinese technology and founder of the media company Tech Buzz China, took her team on the road to get a firsthand look at the country's AI advancements. She told Fortune that while she isn't an energy export, she attended enough meetings and talked to enough insiders to come away with a conclusion that should send chills down the spine of Silicon Valley: in China, building enough power for data centers is no longer up for debate. "This is a stark contrast to the U.S., where AI growth is increasingly tied to debates over data center power consumption and grid limitations," she wrote on X. The stakes are difficult to overstate. Data center building is the foundation of AI advancement, and spending on new centers now displaces consumer spending in terms of impact to U.S. GDP -- that's concerning since consumer spending is generally two-thirds of the pie. McKinsey projects that between 2025 and 2030, companies worldwide will need to invest $6.7 trillion into new data center capacity to keep up with AI's strain. In a recent research note, Stifel Nicolaus warned of a looming correction to the S&P 500, since it forecasts this data-center capex boom to be a one-off build-out of infrastructure, while consumer spending is clearly on the wane. However, the clear limiting factor to the U.S.'s data center infrastructure development, according to a Deloitte industry survey, is stress on the power grid. Cities' power grids are so weak that some companies are just building their own power plants rather than relying on existing grids. The public is growing increasingly frustrated over increasing energy bills - in Ohio, the electricity bill for a typical household has increased at least $15 this summer from the data centers - while energy companies prepare for a sea-change of surging demand. Goldman Sachs frames the crisis simply: "AI's insatiable power demand is outpacing the grid's decade-long development cycles, creating a critical bottleneck." Meanwhile, David Fishman, a Chinese electricity expert who has spent years tracking their energy development, told Fortune that in China, electricity isn't even a question. On average, China adds more electricity demand than the entire annual consumption of Germany, every single year. Whole rural provinces are blanketed in rooftop solar, with one province matching the entirety of India's electricity supply. "U.S. policymakers should be hoping China stays a competitor and not an aggressor," Fishman said. "Because right now they can't compete effectively on the energy infrastructure front." China's quiet electricity dominance, Fishman explained, is the result of decades of deliberate overbuilding and investment in every layer of the power sector, from generation to transmission to next-generation nuclear. The country's reserve margin has never dipped below 80%-100% nationwide, meaning it has consistently maintained at least twice the capacity it needs, Fishman said. 
They have so much spare capacity that instead of seeing AI data centers as a threat to grid stability, China treats them as a convenient way to "soak up oversupply," he added. That level of cushion is unthinkable in the United States, where regional grids typically operate with a 15% reserve margin and sometimes less, particularly during extreme weather, Fishman said. In places like California or Texas, officials often issue warnings about red-flag conditions when demand is projected to strain the system. This leaves little room to absorb the rapid load increases AI infrastructure requires, Fishman noted. The gap in readiness is stark: while the U.S. is already experiencing political and economic fights over whether the grid can keep up, China is operating from a position of abundance. Even if AI demand in China grows so quickly that renewable projects can't keep pace, Fishman said, the country can tap idle coal plants to bridge the gap while building more sustainable sources. "It's not preferable," he admitted, "but it's doable." By contrast, the U.S. would have to scramble to bring on new generation capacity, often facing years-long permitting delays, local opposition, and fragmented market rules, he said. Underpinning the hardware advantage is a difference in governance. In China, energy planning is coordinated by long-term, technocratic policy that defines the market's rules before investments are made, Fishman said. This model ensures infrastructure buildout happens in anticipation of demand, not in reaction to it. "They're set up to hit grand slams," Fishman noted. "The U.S., at best, can get on base." In the U.S., large-scale infrastructure projects depend heavily on private investment, but most investors expect a return within three to five years: far too short for power projects that can take a decade to build and pay off. "Capital is really biased toward shorter-term returns," he said, noting Silicon Valley has funneled billions into "the nth iteration of software-as-a-service" while energy projects fight for funding. In China, by contrast, the state directs money toward strategic sectors in advance of demand, accepting that not every project will succeed but ensuring the capacity is in place when it's needed. Without public financing to de-risk long-term bets, he argued, the U.S. political and economic system is simply not set up to build the grid of the future. Cultural attitudes reinforce this approach. In China, renewables are framed as a cornerstone of the economy because they make sense economically and strategically, not because they carry moral weight. Coal use isn't cast as a sign of villainy, as it would be in some circles in the U.S. -- it's simply seen as outdated. This pragmatic framing, Fishman argued, allows policymakers to focus on efficiency and results rather than political battles. For Fishman, the takeaway is blunt. Without a dramatic shift in how the U.S. builds and funds its energy infrastructure, China's lead will only widen. "The gap in capability is only going to continue to become more obvious -- and grow in the coming years," he said.
[4]
The U.S. AI Datacenter Boom Is So Massive That the Nation's Power Grid Is in a "Critical Bottleneck" Right Now; Big Tech Is Forced to Build Its Own Power Plants
While the AI hype is advancing with full force, the industry seems set to face a massive problem: America's aging power grid isn't capable of handling data center expansion. Big Tech firms like Meta, Amazon, and Google are rapidly expanding their AI capex to fuel their computing needs and achieve superiority in producing the best AI models. However, in this pursuit of 'AI perfection,' there's a lot at stake, and one of the more worrying aspects is how quickly energy demand is growing in the US. When you factor in power grids that were largely built back in the 1970s, it becomes clear that America needs to think quickly about its energy needs, or the AI boom could prove devastating for it. A report by Goldman Sachs claims that datacenter energy requirements are outpacing US grid upgrade cycles, to the point that the grid is now approaching a 'critical bottleneck,' and if the current administration doesn't come up with a viable action plan, America could face a power problem pretty soon. A single interconnection request can now scale up to a whopping 5 gigawatts, roughly the power usage of over five million homes. When you factor in that Big Tech sees gigawatt-scale data centers as the new norm, energy demand is expected to grow tremendously. Gartner predicts that more than 40% of US datacenters might not have enough power to operate fully, as collective power demand could reach up to 500 terawatt-hours by 2027. The massive electricity demand has also resulted in power prices rising noticeably, and according to Axios, monthly electric bills in high-datacenter regions could increase by $14 to $37 by 2040, not factoring in inflation. Firms like Google and Amazon cannot stop their AI expansion, or else they will fall behind in the race, and the only way they can ensure further expansion in America is to build their own power sources. This is already happening: prominent examples include Meta building natural-gas-powered plants for its 'multi-GW' AI clusters and Microsoft reviving the Three Mile Island nuclear plant for its supercomputers. Tech companies are moving into the energy business to make up for the inadequate US electricity system. This might ease pressure on the US grid in the longer term, but for now, concerns remain, as setting up power plants is a multi-year project while datacenter expansion is happening rapidly. For now, America needs an effective solution, and interestingly, President Trump's 'AI Action Plan' does address these concerns and aims to provide a viable path to shore up the US grid.
[5]
Why China's Energy Strategy Is Giving It The Edge In The AI Race
Given its voracious appetite for energy, it's safe to say that AI innovation is not bottlenecked by silicon; rather, it's gridlocked by power. Two of the biggest players in the sector appear to be on opposite ends of the spectrum regarding the issue. For a long time, the United States has remained entangled in fragmented energy regulation and a power grid increasingly incapable of supporting industrial-scale AI. Meanwhile, China is building the one thing AI truly needs to scale: energy infrastructure. The country has effectively tripled its renewable energy installations, transforming its grid at a speed unmatched anywhere else. In 2022 alone, data from Statista shows it added roughly as much solar capacity as the rest of the world combined. According to the International Energy Agency (IEA), in 2024 China invested more than $625 billion in clean energy. By the end of July that year, its installed wind and solar power capacity had reached 1.206 billion kilowatts, meeting a target it had set for 2030 six years ahead of schedule. At this pace, the economic powerhouse is on track to reach 1,000 GW from solar power alone by 2026. Additionally, the IEA projects that the country will spend $88 billion on grid and storage investments in 2025, plus an additional $54 billion to improve coal generation. This shows the groundwork China is laying for a long-term advantage in AI deployment. The U.S., by contrast, is already hitting the grid ceiling.
Aging power grid holding back the U.S.
With concern rising and China pulling ahead, U.S. President Donald Trump has jumped into the conversation. In a major policy move, the head of state signed a "sweeping executive order" to fast-track federal permitting, streamline reviews, and speed up the construction of all major AI infrastructure projects, including factories, data centers, and power plants. "You're gonna go so fast, you're gonna say, 'wait a minute, this is too fast. I didn't expect it to go this quickly,'" Trump said, adding that the initiative would bring in "tens of billions of dollars" and ensure the U.S. enjoys "total industrial technological supremacy." This is the clearest sign yet that Washington is finally recognizing that the AI race is not just about smarter models but about who can build faster and scale harder. For a long time, the narrative around artificial intelligence was that supremacy could be achieved primarily through model architecture and advanced GPUs. But that narrative has played itself out. Compute capacity -- the ability to train, fine-tune, and deploy large models at scale -- will be key in determining who dominates the sector. However, it relies on access to vast amounts of stable energy, and on this front, the U.S. is woefully underprepared. It's no secret that the country's power grid is plagued by bottlenecks and aging transmission lines, and is facing heavy strain from electrification trends and the proliferation of data centers. Reports indicate that utilities are already overwhelmed by demand from AI-linked facilities. Some locations are even facing multi-year delays for grid access. Things have gotten bad enough that America's largest data center markets, including Virginia and Texas, are now imposing moratoriums or rationing megawatt allocations.
Open-source momentum is shifting east
Since 2017, Beijing has outlined a national roadmap to lead the world in AI.
Startups like DeepSeek and Eshallgo are examples of how this strategy is being operationalized. They have favored lean, fast-deploying models over huge, resource-intensive training runs. This shift reflects a broader ideological divergence: the U.S. is focused on optimizing for closed-source, centralized models with huge capital expenditures, while China is prioritizing efficiency and deployment over perfection. Interestingly, even this centralized Chinese model is beginning to fragment, with open-source developers gaining steam. As American tech entrepreneur Balaji Srinivasan said: "AI is decentralizing to Asia, too. Manus, DeepSeek, Qwen, Kimi. Interestingly, they are also decentralizing out of China. By going open source, and by physically moving out of China." Case in point: just two weeks after the release of Kimi K2, Alibaba's Qwen3-Coder had already surpassed it, despite being half the size and featuring double the context window. This shows that open-source development is rapidly reaching escape velocity, and deployment models are getting leaner and faster. In enterprise environments, speed of integration beats theoretical model superiority. China understands this. Its AI push isn't about building the "best" foundation model; it's about embedding intelligence into economic infrastructure right now. On the other hand, U.S. policymakers are still fixated on semiconductor choke points. While export bans on Nvidia and AMD chips might temporarily delay Chinese training runs, they do nothing to address the U.S.'s domestic infrastructure shortcomings. If anything, such bans will only force China to speed up investments in self-sufficiency, pushing companies toward custom chips optimized for specific use cases and energy-efficient deployment. There's no denying the United States' edge in foundational research, elite talent, and venture capital. Still, this advantage will become increasingly irrelevant if the outputs of that ecosystem can't scale in the physical world. GPU access is not the main constraint. The real pain is the inability to run those GPUs at full capacity without risking grid instability, skyrocketing energy costs, or political backlash.
Infrastructure paralysis is the true bottleneck
U.S. grid limitations are already manifesting in tangible ways. Energy costs are rising at double the rate of inflation, fueled not by generation expenses but by transmission and distribution constraints. According to a recent Lawrence Berkeley Lab study, retail energy revenues have increased by over 20% since 2019 despite flat consumption, a sign that grid strain is inflating costs across the board. China's vertically integrated strategy, linking energy, compute, and enterprise software, gives it an asymmetrical advantage. And because deployment leads to data feedback loops, performance refinement, and long-tail monetization, first-mover status matters. Each AI solution embedded today creates a platform for expansion tomorrow. Trump's executive order may mark a turning point, at least in tone. It acknowledges that building models alone isn't enough. Without fast-tracked permitting for data centers, energy infrastructure, and next-gen factories, the U.S. will remain mired in regulatory inertia. To counter China's momentum, the U.S. must broaden its AI policy lens. Chips matter, but energy infrastructure matters just as much, if not more.
Federal investment in AI-aligned power infrastructure, especially clean, high-density energy sources like nuclear and utility-scale solar, must become a policy priority.
China's robust energy infrastructure gives it a significant edge over the U.S. in the global AI race, as America struggles with power grid limitations and soaring energy demands from AI data centers.
In the ongoing artificial intelligence (AI) race between the United States and China, a critical factor has emerged that could significantly impact the outcome: energy infrastructure. Recent reports from AI experts and analysts suggest that China's robust and expansive power grid may give it a decisive edge over the U.S., whose aging infrastructure is struggling to keep up with the enormous energy demands of AI data centers [1][2].
AI development and deployment require massive amounts of electricity to power data centers and supercomputers. This energy-intensive nature of AI has become a bottleneck for growth, particularly in the United States. According to Rui Ma, founder of Tech Buzz China, China has effectively solved its energy problem, at least in terms of powering its AI infrastructure [1]. The country's substantial investments in advanced hydropower and nuclear technologies have resulted in a secure and inexpensive electricity supply.
China's energy strategy has put it in an enviable position. The country maintains an 80% to 100% power reserve, allowing it to easily absorb the demand from hundreds of new data centers built in recent years [1]. This oversupply is so significant that new AI data centers are welcomed as a way to stimulate demand in a market with excess power production [3].
David Fishman, an expert on China's electricity sector, notes that China adds more electricity demand than Germany's entire annual consumption every single year [3]. This massive capacity is the result of decades of deliberate overbuilding and investment in every layer of the power sector, from generation to transmission to next-generation nuclear technology.
In stark contrast, the United States is facing major hurdles with its electricity supply. Meta founder Mark Zuckerberg has warned that power constraints will limit AI growth, as new power plants aren't being built fast enough to satisfy AI's insatiable demand [1]. The U.S. power grid typically operates with only a 15% reserve margin, leaving little room to absorb the rapid load increases that AI infrastructure requires [3].
The situation has become so critical that some U.S. tech giants are taking matters into their own hands. Companies like Microsoft, Google, Amazon, and Nvidia are investing in research and development of nuclear reactors [1]. Elon Musk's xAI is even resorting to using portable methane gas generators to power its data centers [2].
Recognizing the urgency of the situation, U.S. President Donald Trump has signed an executive order to fast-track federal permitting, streamline reviews, and accelerate the construction of major AI infrastructure projects, including factories, data centers, and power plants [5]. This move aims to bring in "tens of billions of dollars" and ensure U.S. "total industrial technological supremacy."
The energy disparity is influencing the approach to AI development in both countries. While the U.S. focuses on optimizing closed-source, centralized models with huge capital expenditures, China is prioritizing efficiency and deployment over perfection [5]. This strategy aligns with China's ability to rapidly scale AI applications without energy constraints.
If the United States cannot address its energy infrastructure issues soon, it risks falling behind China in the AI race, even if it possesses more powerful and efficient hardware [1]. China's energy advantage allows it to throw vast amounts of power at AI development, potentially gaining the upper hand through sheer computational force.
As the AI race intensifies, the ability to provide stable, abundant, and cost-effective energy for data centers and AI infrastructure may prove to be the deciding factor in determining global AI leadership. China's foresight in energy planning and infrastructure development has positioned it favorably, while the U.S. grapples with the challenges of modernizing its power grid to meet the demands of the AI era.