10 Sources
[1]
The AI-energy apocalypse might be a little overblown
Even if AI turns out not to be as much of an energy hog as people are making it out to be, it could still spell trouble for power grids across the US. Tech companies are already burning through increasing amounts of electricity to train and run new AI models, and they're asking for a lot more as they try to outcompete each other. That rising demand is already starting to reshape the energy system, with utilities scrambling to build out new gas plants and pipelines. But all these plans to reshape the US energy system could be based on an AI bubble.

With overexcited investors pumping money into tech companies that fear missing out but still risk developing AI tools that ultimately flop, utilities also face a wave of speculation over data centers' energy needs. The uncertainty is unnerving considering the costs Americans could wind up paying in higher utility bills and more pollution, a recent report warns. The US has been making slow progress on a transition to cleaner, more affordable energy sources. That progress is in peril unless tech companies and utilities demand more transparency and opt for more renewables like solar and wind energy. "While the AI boom provides exciting opportunities, there are many risks to not approaching energy needs with a deliberate and informed response that takes long term impacts into account," Kelly Poole, lead author of the report published this month by shareholder advocacy group As You Sow and environmental organization Sierra Club, said in a briefing with reporters.

The nation's fleet of gas-fired power plants would grow by nearly a third if all of the new gas projects proposed between January 2023 and January 2025, as the generative AI industry heated up, come to fruition. The amount of new gas capacity that utilities and independent developers proposed jumped by 70 percent during that time frame, driven in large part by rising data center electricity demand.
Prior to the generative AI boom, electricity demand had pretty much flatlined for more than a decade, thanks in part to energy efficiency gains. But new data centers, souped-up for AI, are far more energy-hungry than their predecessors. A rack of computers in a traditional data center might use 6 to 8 kilowatts of power, roughly equivalent to the power used by three homes in the US, Dan Thompson, a principal research analyst at S&P Global, explained in the briefing. AI, however, requires more powerful computer chips to run more complicated tasks. One of those high-density racks can draw upward of 100 kilowatts, about 80 to 100 homes' worth of power, according to Thompson. "Essentially what you're looking at is a small town's worth of power being deployed," he said.

Why does that matter? Power grids basically function as a precarious balancing act. If power supply can't meet demand growth, the result could be higher utility bills and potential outages. On the other hand, overbuilding new capacity risks creating stranded assets that utilities and their customers wind up paying for whether or not they're actually needed in the long term. That's why it's so important to get an accurate forecast of future demand.

And while AI does use a lot of energy, projections for the future get murky. "Speculators are flooding the market," the report says, seeking to build and flip data centers. Trying to get ahead of long wait times to connect to the power grid, some of those speculators are requesting power before they've even got the capital or customers lined up to bring a project to the finish line. There could also be some double or triple counting (or more) in AI energy demand forecasts, because developers approach more than one utility to get several quotes.
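Thompson's rack-to-homes comparison is easy to sanity-check with a quick back-of-envelope sketch. The average household draw used here (about 1.2 kilowatts, roughly 10,500 kWh per year) is an assumption based on typical US residential averages, not a figure from the briefing:

```python
# Back-of-envelope comparison of traditional vs. AI data center racks.
# AVG_HOME_KW is an assumed average US household draw (~1.2 kW),
# not a figure from the article.
AVG_HOME_KW = 1.2

traditional_rack_kw = 7.0  # midpoint of the 6-8 kW range Thompson cites
ai_rack_kw = 100.0         # a high-density AI rack

ratio = ai_rack_kw / traditional_rack_kw
homes = ai_rack_kw / AVG_HOME_KW

print(f"An AI rack draws roughly {ratio:.0f}x a traditional rack")
print(f"A 100 kW rack matches about {homes:.0f} average homes")
```

The roughly 83-home result lands inside Thompson's 80-to-100-home range, suggesting his comparison rests on a similar per-home figure.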
In the Southeast, a major hub for data centers, utilities are projecting as much as four times more demand growth than independent analyses of industry trends suggest, according to a report earlier this year from the Institute for Energy Economics and Financial Analysis (IEEFA). Nationally, utilities are preparing for 50 percent more demand growth than the tech industry is expecting, a separate report from December 2024 states. Utilities themselves have recognized this risk on recent earnings calls. Proposed projects trying to connect to the grid "may be overstated anywhere from three to five times what might actually materialize," Jim Burke, CEO of Texas-based Vistra Energy, said in a Q1 earnings call this year.

Despite the uncertainty, utilities are still building out new gas power plants and pipelines to meet that demand. After all, building new infrastructure is one of the most lucrative ways for a utility to increase profits. And right now, the Trump administration, whose campaign was buoyed by oil and gas contributions, is incentivizing reliance on fossil fuels. In Louisiana, for example, local utility Entergy proposed building three new gas plants to power a giant new Meta data center. The data center is estimated to consume as much electricity as 1.5 million homes and to produce 100 million tons of carbon emissions over 15 years.

It's a stark contrast to the Biden administration's goal of getting the power grid to run on 100 percent carbon pollution-free energy by 2035. The only way to stop climate change in its tracks is to get rid of planet-heating pollution from fossil fuels; a rush of new gas infrastructure obviously moves the nation in the opposite direction. There are solutions to minimize these risks, As You Sow and Sierra Club point out in their report. Utilities can require developers to disclose how many other utilities they've brought their data center proposal to and how far along they are in finalizing a project.
When inking contracts, they can also require long-term service agreements, raise nonrefundable deposits, and increase fees for canceling a project. Tech companies clearly have a big role to play, too, by improving the energy efficiency of their technologies and investing in renewables. For years, tech giants including Amazon, Meta, and Google have been top corporate purchasers of renewable energy. Inking those kinds of long-term agreements to build out new solar and wind farms can have even more impact now, counteracting the Trump administration's rollback of financial incentives for renewables, if companies are willing to prioritize their sustainability goals as much as their AI ambitions.
[2]
Will OpenAI Really Build 60 Football Fields Worth of AI Infrastructure Per Week?
OpenAI CEO Sam Altman has a lofty new vision to "create a factory that can produce a gigawatt of new AI infrastructure every week." His Tuesday blog post is light on details, but it's safe to say he's talking about finding ways to satisfy the company's never-ending need for more computing power for its latest products.

On the one hand, by using the word "factory," Altman may be suggesting he wants to create a slick manufacturing facility where robots assemble ultra-powerful GPUs day and night. OpenAI announced a partnership with Nvidia this week, which the companies are calling the "biggest AI infrastructure deployment in history." Nvidia will provide "millions" of GPUs to help "scale OpenAI's compute with multi-gigawatt data centers." On the other hand, it's also possible that Altman's factory might just be a data center, perhaps many of them. Nvidia CEO Jensen Huang often refers to data centers as AI factories, portraying them as the engine behind the next wave of industrialization. But in reality, they are mostly glorified equipment warehouses.

Altman admits that producing a gigawatt of new AI infrastructure every week is "extremely difficult [and will] take us years" to figure out. Financing is another hurdle to overcome. But for the sake of argument, let's say Altman could achieve what he's laid out. What would it look like to physically manifest a gigawatt of computing power every week? How much space would that take up?

60 Football Fields? The Math Is Not Looking Good

The newest, most advanced data centers are massive. An Amazon data center in Indiana is 1,200 acres, The New York Times reports, enough to fit 10 Malls of America. Meta CEO Mark Zuckerberg wants to build data centers large enough to cover most of Manhattan.
To estimate how much land OpenAI would need to build a one-gigawatt facility with today's technology, we can look at the acreage of its Texas data center, which is currently under development as part of the Stargate project. According to developer Crusoe, that project is "approximately 4 million square feet, and a total power capacity of 1.2 gigawatts (GW)." It will consume as much electricity as an entire city on its own. If OpenAI is getting 1.2 gigawatts out of 4 million square feet, then one gigawatt currently needs about 3.33 million square feet. That's the equivalent of about 60 football fields. And just as every football field needs extra room around it for a snack stand and bleachers, data centers need some padding around the edges for things like parking and equipment storage (probably not for buying hot dogs).

Altman says the extra compute power is necessary to make "amazing things" possible, such as curing cancer or figuring out "how to provide customized tutoring to every student on Earth." If computing power is limited, "we'll have to choose which one to prioritize; no one wants to make that choice, so let's go build." Yet OpenAI published a study last week that found 73% of ChatGPT conversations are not about work, curing cancer, or education. They are mostly people seeking help with decisions in their personal lives.

More Land Than You Can Imagine, But Very Few Jobs

Given their size, and their horizontal rather than vertical design, the facilities are often built in rural areas with vast expanses of open land. But while major companies moving into a town often translates into new jobs, data centers don't need hundreds of humans to man a production line. Amazon's $20 billion data center investment in Northeast Pennsylvania, for example, will employ 1,250 people across dozens of facilities. By comparison, Ford's $5 billion EV factory announced in August will employ 4,000 between two plants in Kentucky and Michigan.
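The Crusoe figures above make the "60 football fields" estimate easy to reproduce. A minimal sketch, assuming a standard US football field of 360 by 160 feet (playing field plus end zones):

```python
# Reproducing the "about 60 football fields" land estimate.
site_sqft = 4_000_000   # Stargate Abilene footprint, per Crusoe
site_gw = 1.2           # its total power capacity

sqft_per_gw = site_sqft / site_gw          # ~3.33 million sq ft per GW

# Assumed field size: 360 ft x 160 ft, including end zones.
field_sqft = 360 * 160                     # 57,600 sq ft

fields = sqft_per_gw / field_sqft
print(f"{fields:.0f} football fields per gigawatt")
```

That works out to about 58 fields, which rounds to the "about 60" in the headline.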
None of this is to say that progress isn't a noble goal. But the reality is that data centers are inefficient, guzzling water and consuming huge amounts of electricity. Local communities across the US are fighting back, as reported by many publications, including Futurism, NPR, CNET, and the Washington Post. There is no shortage of horror stories: a 24/7 hum coming from the buildings keeping Virginia residents up at night, Business Insider reports, or kitchen taps running dry in Georgia after Meta built a facility nearby, The New York Times reports. And because the grid does not have enough power, some areas are seeing enormous spikes in utility bills, up to 20% in the Northeast, as residents compete with data centers. That might be why Altman also refers to speeding up "new energy production" in his blog post.

For Big Tech, it's all worth it. Data centers are giant moneymakers, or "the literal key to increasing revenue," as Altman puts it. That's why there is no shortage of new buildouts announced every week. The Trump administration, meanwhile, is working on slashing red tape to help tech companies build data centers faster.
[3]
Nvidia's and OpenAI's 'monumental' data center plan has an equally massive problem: Where to find the power
Nvidia CEO Jensen Huang told CNBC this week that the chipmaker's AI infrastructure plan with OpenAI is "monumental in size," so big that it will push the boundaries of what is possible. The chipmaker and the AI lab are aiming to build at least 10 gigawatts of data centers. That will sap a massive amount of power at a time when the electric grid is already strained, and attempts to deploy more power have faced economic and political constraints that make a fast fix unlikely.

Ten gigawatts is roughly equivalent to the annual power consumption of 8 million U.S. households, according to a CNBC analysis of data from the Energy Information Administration. It is about the same amount of power as New York City's baseline peak summer demand in 2024, according to the New York Independent System Operator, which runs the state's electric grid. "There's never been an engineering project, a technical project of this complexity and this scale -- ever," Huang told CNBC on Monday.

Nvidia and OpenAI have provided no information on when and where the sites will be built, other than disclosing that the first gigawatt will come online in the second half of 2026. When CNBC reached out for more detail on Tuesday, Nvidia declined to comment. It's unclear where all the electricity that the companies need will come from. The U.S. is forecast to add 63 gigawatts of power to the grid this year, according to EIA data. Nvidia's and OpenAI's 10 gigawatts of data centers are equivalent to a big chunk, 16%, of the new power that will be deployed in 2025.

The Trump administration is pushing for data centers to use fossil fuels, particularly natural gas, but orders for new gas turbines face long wait times, with GE Vernova sold out through 2028. The U.S. is forecast to add just 4.4 gigawatts of new gas generation this year, according to EIA. The tech sector and the White House are working to build new nuclear plants, but it will take years for reactors to connect to the grid.
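CNBC's two comparisons above can be checked in a few lines. The annual per-household figure used here (about 10,800 kWh, in line with EIA residential averages) and the assumption of continuous full-power operation are mine, not the article's:

```python
# Checking 10 GW against 2025 grid additions and household consumption.
project_gw = 10          # Nvidia/OpenAI data center plan
additions_gw = 63        # forecast US grid additions in 2025 (EIA)

share = project_gw / additions_gw
print(f"Share of 2025 additions: {share:.0%}")

# Annual energy if the data centers ran flat-out all year.
hours_per_year = 8760
annual_kwh = project_gw * 1e6 * hours_per_year   # GW -> kW, then kWh

avg_household_kwh = 10_800   # assumed average US household per year
households = annual_kwh / avg_household_kwh
print(f"Equivalent households: about {households / 1e6:.0f} million")
```

Both results match the figures in the piece: roughly 16% of this year's new capacity, and about 8 million households.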
The recent big expansion at Plant Vogtle in Georgia took more than a decade to complete, and the small advanced reactors backed by the tech sector are not expected to reach a commercial stage until the end of the decade at the earliest. This leaves renewable power as the most viable, quickly deployable source of electricity to meet the demand from Nvidia and OpenAI in the near term. More than 90% of the new power that the U.S. is expected to add this year will come from solar, wind or battery storage, according to EIA. "The power requirement is largely going to be coming from the new energy sector or not at all," said Kevin Smith, CEO of Arevon, a solar and battery storage developer headquartered in Scottsdale, Arizona, that's active in 17 states.

But the White House has effectively declared war on renewable power. President Donald Trump said last month that the federal government will not approve any more solar and wind projects. Interior Secretary Doug Burgum's office is now reviewing all permits for solar and wind projects. Even projects on private land could be hampered by the Trump administration, as such efforts often need permits from federal agencies like the U.S. Fish and Wildlife Service. Trump's tariffs, uncertainty over permitting, and the end of key tax credits will lead to a slowdown in renewable deployment in the coming years that could challenge data center buildouts, Smith and executives at other big renewable developers warned CNBC last month.

"The panic in the data center, AI world is probably not going to set in for another 12 months or so, when they start realizing that they can't get the power they need in some of these areas where they're planning to build data centers," Smith told CNBC in August. "Then we'll see what happens," Smith said. "There may be a reversal in policy to try and build whatever we can and get power onto the grid."
[4]
Sam Altman's vision for AI is huge - but there's just one thing standing in his way
Altman suggests that AI may become something we consider a fundamental human right

In a new blog post called Abundance Intelligence, Sam Altman, CEO of OpenAI, lays out the benefits of more computing power for AI and calls for increased investment in AI infrastructure. "If AI stays on the trajectory that we think it will, then amazing things will be possible", he writes. While Altman makes no definite predictions about what these amazing things will be, he is willing to muse on what increased computational power could mean for the future: "Maybe with 10 gigawatts of compute, AI can figure out how to cure cancer. Or with 10 gigawatts of compute, AI can figure out how to provide customized tutoring to every student on earth." He also issues a stark warning: "if we are limited by compute, we'll have to choose which one to prioritize; no one wants to make that choice, so let's go build."

Altman doesn't write blog posts just to pontificate; he generally uses them to outline the direction for the next phase of OpenAI's expansion, and it's clear that it's now all about increasing "compute". Compute is the word Altman uses as shorthand for the raw horsepower necessary to run and train LLMs like ChatGPT. In the real world, that horsepower equates to data centers: vast warehouse-sized facilities full of servers, networking gear, and cooling equipment, and as you can imagine, they need large amounts of electricity to run. Just yesterday on X.com, Altman posted a video showing progress on OpenAI and Oracle's latest massive data center in Abilene, Texas, part of the $500 billion Stargate Project, with five more data centers opening in the US soon. As you can see from the video, the sheer scale of it is impressive. As we reported on Tuesday, Nvidia is investing $100bn in OpenAI, and will start by deploying as much power as 10 nuclear reactors.
In his blog post, Altman lays out exactly what the goal for OpenAI is when it comes to data centers: "Our vision is simple: we want to create a factory that can produce a gigawatt of new AI infrastructure every week. " That's an astonishing ambition, and one he realizes will be challenging: "The execution of this will be extremely difficult; it will take us years to get to this milestone and it will require innovation at every level of the stack, from chips to power to building to robotics. But we have been hard at work on this and believe it is possible." Reflecting the new political desire for homegrown technology in the U.S., Altman writes: "We are particularly excited to build a lot of this in the US; right now, other countries are building things like chips fabs and new energy production much faster than we are, and we want to help turn that tide." Altman's vision of the future is clearly going to require some incredible infrastructure building to achieve, with even more data centers than we currently have in production. The massive consumption of power required has attracted its fair share of criticism as well. Partly, this is due to the sheer environmental impact of building increasingly large data centers, but also because scaling AI compute power has so far failed to produce AGI, and there's no indication that it will. While there is no mention of AGI in his most recent missive, it has been a popular theme of Altman's previous blog posts. He does, however, talk about what is going to happen as AI gets smarter: "access to AI will be a fundamental driver of the economy, and maybe eventually something we consider a fundamental human right. Almost everyone will want more AI working on their behalf." While it seems that achieving AGI remains as elusive as ever, there's no reason to think that OpenAI's plans for the future won't be as innovative as we've come to expect, and Altman is keen to reveal them soon. 
"Over the next couple of months, we'll be talking about some of our plans and the partners we are working with to make this a reality", he writes, before ending on the enigmatic: "we have some interesting new ideas", and I can't wait to see what those will be.
[5]
OpenAI's New Data Centers Will Draw More Power Than the Entirety of New York City, Sam Altman Says
"Ten gigawatts is more than the peak power demand in Switzerland or Portugal."

Earlier this week, OpenAI announced a "strategic partnership" with AI chipmaker Nvidia in which the duo of tech giants will build and deploy upwards of 10 gigawatts of AI data centers. Nvidia will invest up to $100 billion in the effort, an enormous project that could end up requiring an astronomical amount of electricity to run. As Fortune reports, the planned data centers would consume as much power as the entire city of New York -- and the Sam Altman-led company isn't stopping there. Existing projects tied to President Donald Trump's Stargate initiative could add another seven gigawatts, or roughly as much as San Diego used during last year's devastating heat wave.

"Ten gigawatts is more than the peak power demand in Switzerland or Portugal," Cornell University energy-systems engineering professor Fengqi You told Fortune. "Seventeen gigawatts is like powering both countries together." OpenAI and tech giant Oracle already have an enormous Stargate data center in Abilene, Texas, which draws enough electricity to power half a million homes. Five new projects are expected to total seven gigawatts, as part of Trump's half-a-trillion-dollar AI data center initiative. It's an almost unfathomable escalation in the power usage of AI -- and computing as a whole.

"It's scary because... now [computing] could be 10 percent or 12 percent of the world's power by 2030," University of Chicago professor of computer science Andrew Chien told Fortune. "We're coming to some seminal moments for how we think about AI and its impact on society." To the AI industry, it's all part of the plan. "Everything starts with compute," Altman said in a statement accompanying the Nvidia partnership announcement.
"Compute infrastructure will be the basis for the economy of the future, and we will utilize what we're building with NVIDIA to both create new AI breakthroughs and empower people and businesses with them at scale." The industry's doubling down on building out AI infrastructure has been accompanied by major environmental concerns, with tech giants admitting that they're falling far short of their own carbon emission goals. While the tech's exact carbon footprint remains elusive, AI data centers are putting a major strain on local water supplies to keep hardware cool. Additional pressure on power grids will also lead to a rise in carbon dioxide emissions, unless the AI industry finds a way to pivot to renewable energy sources in a meaningful way. You told Fortune that it may eventually become inevitable for companies to switch to nuclear plants, which could "take years to permit and build." "In the short term, they'll have to rely on renewables, natural gas, and maybe retrofitting older plants," he added. As companies continue to pour tens of billions of dollars into infrastructure buildouts, the tech's carbon footprint is expected to grow. It's an unfortunate reality that the industry will have to reckon with one way or the other, especially considering the ongoing human-activity-fueled climate crisis. "They told us these data centers were going to be clean and green," Chien told Fortune. "But in the face of AI growth, I don't think they can be. Now is the time to hold their feet to the fire."
[6]
Altman says concerns over OpenAI's fast growth are 'natural'
OpenAI chief executive Sam Altman says he understands concerns about the speed of the company's expansion -- but insists its building program is necessary to meet demand for artificial intelligence. "People are worried. I totally get that. I think that's a very natural thing," Altman told CNBC. "We are growing faster than any business I've ever heard of before." The company has announced plans in recent days to build a network of massive data centers with Oracle, Nvidia and SoftBank. Together, the sites would require 17 gigawatts of electricity, enough power for more than 13 million U.S. homes. But Altman said the breakneck expansion is designed to meet rising demand for OpenAI's products, pointing to a tenfold increase in ChatGPT use in 18 months. "This is what it takes to deliver AI," he said. "Unlike previous technological revolutions or previous versions of the internet, there's so much infrastructure that's required, and this is a small sample of it." Each site is expected to cost around $50 billion, with overall spending projected at $850 billion. HSBC analysts on Monday estimated that global investment in AI infrastructure could reach $2 trillion. The CEO's comments come amid increasing concerns that the AI industry is a bubble. Yesterday, Bain & Co. warned that companies developing AI are committing vast sums to new data centers but are not on course to generate the income to pay for them. In its annual Global Technology Report, Bain said AI firms will need roughly $2 trillion in yearly revenue by 2030 to cover the computing power needed to meet demand, but will miss that by $800 billion as services such as ChatGPT bring in less money than the infrastructure costs required to support them. OpenAI is losing billions of dollars a year as it focuses on expansion, though the company has said it expects to turn cash-flow positive by 2029. 
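Two of the figures above can be cross-checked with quick arithmetic: Bain's $2 trillion revenue requirement against its $800 billion shortfall, and the per-home consumption implied by "17 gigawatts ... more than 13 million U.S. homes" (the continuous-draw assumption is mine, not the article's):

```python
# Bain's numbers imply projected 2030 AI revenue of:
needed_t = 2.0       # $T of annual revenue needed by 2030
shortfall_t = 0.8    # $T expected shortfall
print(f"Implied projected revenue: ${needed_t - shortfall_t:.1f}T")

# The homes comparison implies a per-home figure, assuming the
# 17 GW is drawn continuously all year:
gw = 17
homes_millions = 13
kwh_per_home = gw * 1e6 * 8760 / (homes_millions * 1e6)
print(f"Implied annual use per home: about {kwh_per_home:,.0f} kWh")
```

The roughly 11,500 kWh per-home result is in line with typical US household consumption, so the "more than 13 million homes" figure hangs together.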
Altman admitted that electricity is the biggest challenge facing the industry, but added that cycles of overbuilding and retrenchment are part of every major technological shift. "People will get burned on overinvesting and people also get burned on underinvesting and not having enough capacity," Altman said Tuesday. "Smart people will get overexcited, and people will lose a lot of money. People will make a lot of money. But I am confident that long term, the value of this technology is going to be gigantic to society."
[7]
Sam Altman's AI empire will devour as much power as New York City and San Diego combined. Experts say it's 'scary' | Fortune
Picture New York City on a sweltering summer night: every air conditioner straining, subway cars humming underground, towers blazing with light. Now add San Diego at the peak of a record-breaking heat wave, when demand shot past 5,000 megawatts and the grid nearly buckled. That's almost the scale of electricity that Sam Altman and his partners say will be devoured by their next wave of AI data centers -- a single corporate project consuming more power, every single day, than two American cities pushed to their breaking point.

The announcement marks a "seminal moment" that Andrew Chien, a professor of computer science at the University of Chicago, says he has long seen coming. "I've been a computer scientist for 40 years, and for most of that time computing was the tiniest piece of our economy's power use," Chien told Fortune. "Now it's becoming a large share of what the whole economy consumes." "It's scary because computing was always the tiniest piece of our economy's power use," he said. "Now it could be 10% or 12% of the world's power by 2030. We're coming to some seminal moments for how we think about AI and its impact on society."

This week, OpenAI announced a plan with Nvidia to build AI data centers consuming up to 10 gigawatts of power, with additional projects totaling 17 gigawatts already in motion. That's roughly equivalent to powering New York City -- which uses 10 gigawatts in the summer -- and San Diego during the intense heat wave of 2024, when more than 5 gigawatts were used. Or, as one expert put it, it's close to the total electricity demand of Switzerland and Portugal combined. "It's pretty amazing," Chien said. "A year-and-a-half ago they were talking about five gigawatts. Now they've upped the ante to 10, 15, even 17. There's an ongoing escalation." Fengqi You, an energy systems professor at Cornell University who also studies AI, agreed.
"Ten gigawatts is more than the peak power demand in Switzerland or Portugal," he told Fortune. "Seventeen gigawatts is like powering both countries together." "So you're talking about an amount of power that's comparable to 20% of the whole Texas grid," Chien said. "That's for all the other industries -- refineries, factories, households. It's a crazy large amount of power."

Altman has framed the build-out as necessary to keep up with AI's runaway demand. "This is what it takes to deliver AI," he said in Texas. Usage of ChatGPT, he noted, has jumped tenfold in the past 18 months. Altman has made no secret of his favorite power source: nuclear. He has backed both fission and fusion startups, betting that only reactors can provide the kind of steady, concentrated output needed to keep AI's insatiable demand fed. "Compute infrastructure will be the basis for the economy of the future," he said, framing nuclear as the backbone of that future.

Chien, however, is blunt about the near-term limits. "As far as I know, the amount of nuclear power that could be brought on the grid before 2030 is less than a gigawatt," he said. "So when you hear 17 gigawatts, the numbers just don't match up." With projects like OpenAI's demanding 10 or 17 gigawatts, nuclear is "a ways off, and a slow ramp, even when you get there." Instead, he expects wind, solar, natural gas, and new storage technologies to dominate.

You struck a middle ground. He said nuclear may be unavoidable in the long run if AI keeps expanding, but cautioned that "in the short term, there's just not that much spare capacity" -- whether fossil, renewable, or nuclear. "How can we expand this capacity in the short term? That's not clear," he said. He also warned that the timeline may be unrealistic. "A typical nuclear plant takes years to permit and build," he said. "In the short term, they'll have to rely on renewables, natural gas, and maybe retrofitting older plants.
Nuclear won't arrive fast enough." The environmental costs loom large for these experts, too. "We have to face the reality that companies promised they'd be clean and net zero, and in the face of AI growth, they probably can't be," Chien said. Ecosystems could come under stress, Cornell's You said. "If data centers consume all the local water or disrupt biodiversity, that creates unintended consequences," he said. The investment figures are staggering. Each OpenAI site is valued at roughly $50 billion, adding up to $850 billion in planned spending. Nvidia alone has pledged up to $100 billion to back the expansion, providing millions of its new Vera Rubin GPUs. Chien added that we need a broader societal conversation about the looming environmental costs of using that much electricity for AI. Beyond carbon emissions, he pointed to hidden strains on water supplies, biodiversity, and local communities near massive data centers. Cooling alone, he noted, can consume vast amounts of fresh water in regions already facing scarcity. And because the hardware churns so quickly -- with new Nvidia processors rolling out every year -- old chips are constantly discarded, creating waste streams laced with toxic chemicals. "They told us these data centers were going to be clean and green," Chien said. "But in the face of AI growth, I don't think they can be. Now is the time to hold their feet to the fire."
[8]
OpenAI CEO Admits 'People Are Worried' About the Company's Massive Infrastructure Projects: 'Growing Faster Than Any Business'
Altman says that the projects are necessary to keep pace with demand for AI. OpenAI CEO Sam Altman announced on Tuesday what he called the "largest infrastructure project of the modern Internet era" -- an expansive AI project with building sites in Texas, Ohio, and New Mexico. OpenAI revealed on its blog that it plans to build a network of five AI data centers over the next few years through Stargate, a collaboration with Oracle, Nvidia, and SoftBank that was announced in January. The new structures will require seven gigawatts of electricity, about three-and-a-half Hoover Dams' worth, or enough power to fuel over five million U.S. homes. The project requires about $400 billion in investment over the next three years. Even Altman acknowledges the scale of the investment.

"People are worried," Altman told CNBC on Wednesday. "I totally get that. I think that's a very natural thing. We are growing faster than any business I've ever heard of before." Altman said that there had been a tenfold increase in the number of ChatGPT users within the past 18 months. According to Bloomberg, 700 million people use ChatGPT every week. Building AI infrastructure is "what it takes to deliver AI," Altman explained to CNBC. Unlike other technological revolutions, there is "so much infrastructure that's required," he noted. "This is a small sample of it," Altman told the outlet.

Critics have cautioned that there could be an AI spending bubble, and Altman himself has warned of it. In August, at a press dinner, Altman said that investors were "overexcited" and that AI startups have achieved "insane" valuations. "Are we in a phase where investors as a whole are overexcited about AI?" he said at the dinner. "My opinion is yes."
Meanwhile, tech giants like Oracle, Nvidia, and Microsoft all depend heavily on OpenAI to help build AI and data center infrastructure. Earlier this week, Nvidia announced plans to invest up to $100 billion in OpenAI to build more data centers to develop and run AI models.
[9]
AI Factory That Builds Itself : Sam Altman's Vision for the Future
What if the future of artificial intelligence didn't just rely on smarter algorithms or faster chips, but on an innovative system that could build itself? Imagine a factory so advanced that it churns out a gigawatt of AI infrastructure every week: a self-sustaining, automated powerhouse capable of scaling humanity's technological ambitions at an unprecedented pace. This isn't science fiction; it's the bold vision of Sam Altman, CEO of OpenAI, who envisions a "machine that builds the machine." In a world where AI's potential is limited only by the availability of compute power and energy, Altman's concept could redefine the very foundation of innovation. But can such a system truly deliver on its promise, or does it raise as many questions as it answers? Wes Roth uncovers how this audacious idea could transform not just the AI industry, but global infrastructure itself. From the intricate challenges of scaling compute resources to new energy solutions like nuclear fusion that could power this vision, every piece of the puzzle reveals a glimpse into the future of technology. Along the way, we'll examine the societal and economic implications of automating AI infrastructure, as well as the regulatory frameworks needed to ensure equitable access. This isn't just a story about machines; it's a story about how humanity might harness them to solve its greatest challenges. Could this be the key to unlocking a sustainable, AI-driven future? The advancement of AI hinges on the availability of compute power, which serves as the backbone for training and deploying sophisticated models. The industry is now setting its sights on unprecedented levels of computational capacity, with goals such as achieving 10 gigawatts or even 100 gigawatts. These targets demand more than just innovative hardware; they require a fundamental rethinking of global energy production and infrastructure. The challenge is significant.
As AI models grow more complex, their computational requirements are outpacing the available supply. This scarcity poses a risk of slowing progress, making it essential to find innovative solutions. Companies like Nvidia are addressing this by developing next-generation GPUs and chips specifically optimized for AI workloads. These advancements are crucial for powering applications such as large language models, robotics, and advanced analytics. However, scaling compute power is not merely a technical challenge. It also involves tackling the immense energy demands that accompany these advancements. Without sustainable energy solutions, the expansion of AI infrastructure could strain global resources and hinder progress. AI infrastructure is a major consumer of energy, requiring vast amounts to power data centers, train AI models, and maintain global networks. The scale of these energy demands has prompted significant investment in advanced energy technologies to ensure sustainability. Two of the most promising solutions are nuclear fusion and solar power. Nuclear fusion, often regarded as the "holy grail" of energy, offers the potential for virtually limitless power with minimal environmental impact. If successfully developed, it could provide the energy needed to support AI infrastructure on a massive scale. Meanwhile, solar power is emerging as a viable decentralized energy solution, particularly in regions with abundant sunlight. Advances in solar technology are making it more efficient and cost-effective, allowing it to play a key role in meeting the energy demands of AI systems. These energy innovations are not just about meeting current needs; they are about creating a sustainable foundation for the future. By integrating renewable energy sources into AI infrastructure, the industry can reduce its environmental footprint while continuing to scale.
Sam Altman, CEO of OpenAI, has proposed a bold approach to addressing the compute scarcity problem: a factory designed to produce a gigawatt of AI infrastructure every week. This "machine that builds the machine" would integrate advancements in robotics, chip manufacturing, and energy production to create a self-sustaining system for scaling AI capabilities. This vision goes beyond simply meeting the immediate demand for compute resources. By automating the production of AI hardware, it could significantly lower costs, accelerate deployment timelines, and establish a new standard for infrastructure development. Altman's concept represents a forward-thinking strategy to make AI technologies more accessible and scalable on a global scale. The potential benefits of this approach are immense. Automating infrastructure production could provide widespread access to AI, allowing smaller organizations and developing nations to use its capabilities. It could also drive innovation by removing bottlenecks in the supply chain, allowing researchers and developers to focus on advancing AI applications rather than grappling with resource limitations. The global expansion of AI infrastructure presents both opportunities and challenges. While countries like the United States and China are leading the charge, other regions are also making significant investments to establish themselves as players in the AI ecosystem. However, the development of this infrastructure cannot occur without a robust regulatory framework. Regulation will play a pivotal role in shaping the future of AI infrastructure. Policymakers must address critical issues such as data privacy, energy consumption, and equitable access to AI technologies. Striking the right balance between fostering innovation and ensuring responsible development is essential.
Without clear guidelines, the rapid growth of AI infrastructure could lead to unintended consequences, such as environmental degradation or unequal access to resources. International collaboration will be crucial in navigating these challenges. By working together, nations can establish standards and best practices that promote sustainable and equitable development. This cooperative approach will be key to creating a global AI ecosystem that benefits all of humanity. The expansion of AI infrastructure has far-reaching implications for the global economy and society. AI is poised to drive economic growth by transforming industries such as healthcare, education, and manufacturing. For example, AI-powered tools could accelerate drug discovery, personalize education to meet individual needs, and optimize supply chains for greater efficiency. Beyond economic benefits, AI has the potential to improve quality of life on a global scale. Some experts argue that access to AI should be considered a fundamental human right, given its ability to address critical challenges such as climate change, food security, and healthcare disparities. However, others caution against overestimating AI's capabilities, emphasizing the importance of setting realistic expectations and ensuring responsible development. The societal impact of AI will depend largely on how its infrastructure is developed and deployed. By prioritizing sustainability, equity, and innovation, the industry can maximize the benefits of AI while minimizing its risks. This approach will be essential to building a future where technology serves humanity as a whole.
[10]
Sam Altman Calls for 'Abundant Intelligence' in Blog Post | PYMNTS.com
"Our vision is simple: we want to create a factory that can produce a gigawatt of new AI infrastructure every week," Altman wrote. "The execution of this will be extremely difficult; it will take us years to get to this milestone and it will require innovation at every level of the stack, from chips to power to building to robotics. But we have been hard at work on this and believe it is possible." The vision comes as OpenAI's business is expanding rapidly. As PYMNTS reported, the company doubled its annual revenue to $12 billion in 2025, reflecting demand from enterprises and consumers for its generative AI products. Looking ahead, OpenAI forecasts that spending will jump to $115 billion through 2029, underscoring the capital-intensive path it has chosen. Altman's blog also follows a series of landmark agreements designed to expand OpenAI's capacity. As PYMNTS reported, Nvidia will invest up to $100 billion in OpenAI beginning in 2026, the largest private-company investment on record. At the same time, Oracle struck a $300 billion cloud partnership with OpenAI, anchoring the company's next generation of infrastructure. Together, these agreements highlight OpenAI's strategy to lock in compute and cloud resources on an unprecedented scale. Altman wrote that the next phase of AI progress depends on building systems capable of delivering intelligence reliably and at scale. "If AI stays on the trajectory that we think it will, then amazing things will be possible," he wrote. "Maybe with 10 gigawatts of compute, AI can figure out how to cure cancer. Or with 10 gigawatts of compute, AI can figure out how to provide customized tutoring to every student on earth. If we are limited by compute, we'll have to choose which one to prioritize; no one wants to make that choice, so let's go build." The moves also position OpenAI less as a research organization and more as an industrial player tasked with creating the foundation for global AI access. 
In Altman's framing, "as AI gets smarter, access to AI will be a fundamental driver of the economy, and maybe eventually something we consider a fundamental human right. Almost everyone will want more AI working on their behalf," he said. This marks a turning point similar to the early days of the internet, when breakthroughs in networking had to be matched with vast investment in physical cables, servers and data centers. Just as those investments eventually made the internet a utility, Altman suggested that building AI infrastructure today will create the foundation for widespread, low-cost access to intelligence tomorrow. Altman's latest post sets a marker for ambition in the AI sector. By linking OpenAI's strategy to industrial-scale infrastructure and record-breaking partnerships, he signaled that the race to expand AI is moving into its next phase. If "abundant intelligence" becomes reality, enterprises could lower costs and accelerate adoption by tapping into reliable AI capacity. Consumers could gain access to tools that make daily life more efficient, while industries such as healthcare, logistics and education could reimagine how they deliver services.
OpenAI and Nvidia's ambitious plan to build massive AI data centers raises concerns about energy consumption, environmental impact, and infrastructure challenges.
OpenAI and Nvidia have announced a groundbreaking partnership to build and deploy over 10 gigawatts of AI data centers, with Nvidia investing up to $100 billion in the project [4]. This massive undertaking, described by Nvidia CEO Jensen Huang as "monumental in size," is set to push the boundaries of what's possible in AI infrastructure [3].
The scale of this project is staggering: the planned data centers are expected to consume as much power as the entire city of New York [5]. To put this into perspective, 10 gigawatts is equivalent to the annual power consumption of approximately 8 million U.S. households [3].
This massive increase in power demand poses significant challenges for the energy sector and raises environmental concerns. The U.S. is forecast to add 63 gigawatts of power to the grid this year, and OpenAI and Nvidia's project alone would account for about 16% of that new capacity [3]. Finding sufficient power sources for these data centers is a major hurdle. While renewable energy seems the most viable quick-deployment option, political constraints under the Trump administration may hinder this approach [3]. The tech sector is also exploring nuclear power, but new plants could take years to come online [3].
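The 16% and 8-million-household figures can be checked with simple arithmetic; the ~10,500 kWh/year average US household consumption used below is an assumed approximation, not a number from the source:

```python
# Sanity check of the 10 GW figures above, assuming average US household
# electricity use of roughly 10,500 kWh per year.
HOURS_PER_YEAR = 8760
KWH_PER_HOUSEHOLD_YEAR = 10_500

project_gw = 10       # planned OpenAI/Nvidia data-center buildout
us_additions_gw = 63  # forecast new US grid capacity this year

share = project_gw / us_additions_gw
households = project_gw * 1e6 * HOURS_PER_YEAR / KWH_PER_HOUSEHOLD_YEAR

print(f"{share:.0%}")                     # ~16% of forecast additions
print(f"{households / 1e6:.1f} million")  # ~8.3 million households
```

Both results are consistent with the figures cited, keeping in mind that comparing a data center's continuous draw against installed grid capacity is itself a rough equivalence.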
OpenAI CEO Sam Altman envisions creating "a factory that can produce a gigawatt of new AI infrastructure every week" [4]. However, this ambitious goal faces numerous challenges, including chip production, power generation, and robotics innovation [4].
The rapid expansion of AI infrastructure is raising alarm bells among environmentalists. Data centers are already straining local water supplies for cooling, and the increased power demand could lead to a rise in carbon dioxide emissions. The tech industry's ability to meet its carbon emission goals is being questioned as the AI sector continues to grow.
Despite these challenges, Altman remains optimistic about the potential of increased compute power. He suggests that with sufficient computational resources, AI could tackle major challenges like curing cancer or providing personalized education globally [4]. However, critics argue that simply scaling up compute power has not yet led to the development of artificial general intelligence (AGI) [4]. As the AI industry continues its rapid expansion, finding a balance between technological advancement and environmental sustainability remains a critical challenge. The coming years will likely see intense debate and innovation as tech giants like OpenAI and Nvidia push the boundaries of AI infrastructure while grappling with its enormous energy demands.
[1]