3 Sources
[1]
What's the deal with space-based data centers for AI?
Starcloud has already begun running and training a large language model in space, so it can speak Shakespearean English

Terrestrial data centers are so 2025. We're taking our large-scale compute infrastructure into orbit, baby! Or at least, that's what Big Tech is yelling from the rooftops at the moment. It's quite a bonkers idea that's hoovering up money and mindspace, so let's unpack what it's all about - and whether it's even grounded in reality.

Let's start with the basics. You might already know that a data center is essentially a large warehouse filled with thousands of servers that run 24/7. AI companies like Anthropic, OpenAI, and Google use data centers in two main ways: to train their models, and to run them for users. They need data centers because these facilities provide the coordinated power of thousands of machines working in tandem on those functions, plus the infrastructure to keep them running reliably around the clock. To that end, these facilities are always online with ultra-fast internet connections, and they have vast cooling systems to keep those servers running at peak performance. All this requires a lot of power, which puts a strain on the grid and squeezes local resources.

So what's this noise about data centers in space? The idea's been bandied about for a while now as a vastly better alternative that can harness abundant solar energy and radiative cooling hundreds of miles above the ground in low Earth orbit. Powerful GPU-equipped servers would be housed in satellites, which would move through space together in constellations, beaming data back and forth as they travel around the Earth from pole to pole in a sun-synchronous orbit. The thinking behind space data centers is that they'll allow operators to scale up compute resources far more easily than on Earth: up there, there are no constraints on available power, real estate, or the fresh water needed for cooling.
There are a number of firms getting in on the action, including big familiar names and plucky upstarts. You've got Google partnering with Earth-monitoring company Planet on Project Suncatcher to launch a couple of prototype satellites by next year. Aetherflux, a startup that was initially all about beaming down solar power from space, now intends to make a data center node in orbit available for commercial use early next year. Nvidia-backed Starcloud, which is focused exclusively on space-based data centers, sent a GPU payload into space last November, and trained and ran a large language model on it.

The latest to join the fold is SpaceX, which is set to merge with Elon Musk's AI company xAI in a purported US$1.25-trillion deal, with a view to ushering in the era of orbital data centers. According to Musk's calculations, it should be possible to scale up both the number of rocket launches and the data center satellites they can carry. "There is a path to launching 1 TW/year (1 terawatt of compute power per year) from Earth," he noted in a memo, adding that AI compute resources will be cheaper to generate in space than on the ground within three years from now.

In an excellent article in The Verge from last December, Elissa Welle laid out the numerous challenges these orbital data centers will have to overcome in order to operate as advertised. For starters, they'd have to safely navigate the 6,600 tons of space debris floating around in orbit, as well as the 14,000-plus active satellites up there - and dodging all of that will require fuel. You've also got to dissipate heat from the space-based data centers, and send astronauts up to maintain them periodically. And that's to say nothing of how these satellites will affect the work of astronomers or potentially increase light pollution.
Ultimately, there's a lot of experimentation and learning to be gleaned from these early efforts to build out compute resources in space before any company or national agency can realistically scale them up. And while it might eventually become possible to do so despite substantial difficulties, it's worth asking ourselves whether AI is actually on track to benefit humanity in all the ways we've been promised, and whether we need to continually build out infrastructure for it - whether on the ground or way up beyond the atmosphere.
[2]
Musk predicts more AI capacity will be in orbit than on earth in 5 years, with SpaceX becoming a 'hyper-hyper' scaler | Fortune
By the beginning of the next decade, AI will primarily become a space-based venture as it becomes much more cost-advantageous to operate in orbit, according to SpaceX CEO Elon Musk. In a lengthy, wide-ranging interview with podcaster Dwarkesh Patel and Stripe cofounder and president John Collison on Thursday, the tech billionaire made some of his signature bold predictions about how the AI revolution will play out.

Given the enormous energy needs of AI and limits on available land for placing massive arrays of solar panels -- not to mention all the red tape -- building new AI data centers will be much cheaper in orbit, where solar panels are five times more effective than on the ground. "In 36 months, but probably closer to 30 months, the most economically compelling place to put AI will be space," Musk said. "It will then get ridiculously better to be in space. The only place you can really scale is space. Once you start thinking in terms of what percentage of the sun's power you are harnessing, you realize you have to go to space. You can't scale very much on earth."

The utility industry currently isn't able to build power plants as rapidly as AI demands, he added. On top of that, limits on how fast gas turbines and wind turbines can be manufactured represent another bottleneck. Meanwhile, solar panels meant to be used in space are less costly than those designed for use on land because they don't need as much glass or hardening to withstand various weather events, Musk explained. In addition, the cooling needed for data centers is less of an issue in space.

Considering the advantage space has over earth, he was asked where AI will be in five years. "If you say five years from now, I think probably AI in space will be launching every year the sum total of all AI on earth," Musk said. "Meaning, five years from now, my prediction is we will launch and be operating every year more AI in space than the cumulative total on earth."
While he is infamous for setting incredibly ambitious targets on aggressive timelines, his next one was a whopper, even by his standards. Musk said getting all that AI and solar capacity into space will require about 10,000 launches a year -- roughly one launch every hour, around the clock. SpaceX is the most prolific rocket company and set a record last year with 165 orbital launches. SpaceX could pull off a 10,000-per-year launch cadence with 20-30 Starship rockets, he added, though the company will make more than that, enabling perhaps 20,000-30,000 launches a year. He pointed out the airline industry has much quicker throughput than that: the number of daily flights around the world tops 100,000.

Patel then asked if SpaceX will become an AI hyperscaler. "Hyper-hyper," Musk replied. "If some of my predictions come true, SpaceX will launch more AI than the cumulative amount on Earth of everything else combined." It's already working toward that goal. SpaceX in November launched a test satellite with an AI server from start-up Starcloud. And last month, SpaceX asked the FCC for permission to launch up to 1 million solar-powered satellites designed as data centers.

Of course, there are other challenges associated with operating in space, such as protecting hardware from the sun's radiation and transmitting astronomical amounts of data from orbit to earth. SpaceX's Starship rocket is also still in development. But Deutsche Bank said in a note last month that the challenges of putting data centers in space are more about engineering than physics. Today's AI hyperscalers see the potential as well and are also looking to go to space. For example, Google's Project Suncatcher looks to pair solar-powered satellites with AI computer chips, and a prototype could launch as soon as next year. OpenAI CEO Sam Altman also considered buying rocket company Stoke Space to put data centers in orbit, the Wall Street Journal reported in December.
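The launch-cadence figures above can be sanity-checked with simple arithmetic. Here's a rough sketch using only the numbers quoted in the article, with a fleet size of 25 assumed as the midpoint of the 20-30 Starships Musk mentioned:

```python
# Rough arithmetic behind the 10,000-launches-per-year target.
# The launch total comes from the article; FLEET_SIZE = 25 is an
# assumed midpoint of the "20-30 Starship rockets" figure.
LAUNCHES_PER_YEAR = 10_000
FLEET_SIZE = 25
DAYS_PER_YEAR = 365

launches_per_day = LAUNCHES_PER_YEAR / DAYS_PER_YEAR       # ~27.4 per day
minutes_between_launches = 24 * 60 / launches_per_day      # ~53 minutes apart
flights_per_rocket = LAUNCHES_PER_YEAR / FLEET_SIZE        # 400 flights/year each

print(f"{launches_per_day:.1f} launches per day")
print(f"one launch every {minutes_between_launches:.0f} minutes")
print(f"{flights_per_rocket:.0f} flights per rocket per year")
```

At that rate, each rocket in a 25-ship fleet would fly more than once a day - still well below the 100,000-plus daily airline flights Musk cites for comparison.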
For its part, SpaceX is an AI company now, after its merger with Musk's xAI. That comes as SpaceX is expected to go public this year, raising tens of billions of dollars. During the podcast interview, Musk said more money is available in public markets than private markets, possibly even 100 times more. "I just repeatedly tackle the limiting factor," he added. "Whatever the limiting factor is on speed, I'm going to tackle that. If capital is the limiting factor, then I'll solve for capital. If it's not the limiting factor, I'll solve for something else."
[3]
Elon Musk says space will be the cheapest place for AI data centres in three years
Billionaire Elon Musk has predicted that within 36 months, and likely sooner, space will be the most economical location for artificial intelligence (AI) data centres, eclipsing Earth-based options due to vastly superior solar efficiency and the absence of a need for batteries. Talking on the Dwarkesh Podcast, co-hosted by Dwarkesh Patel and Stripe president John Collison, Musk said Earth has some insurmountable power bottlenecks in scaling AI. "The availability of energy is the issue," Musk said, noting global electricity output outside China remains "pretty close flat" while chip production surges.

Musk quipped that he had meant to wear a t-shirt that said, "It's always sunny in space". "Because you don't have a day-night cycle, seasonality, clouds, or an atmosphere in space. The atmosphere alone results in about a 30% loss of energy [on Earth]," said Musk, adding, "It will simply not be physically possible to scale power production to the scale needed for AI on Earth." "Any given solar panel can do about five times more power in space than on the ground. You also avoid the cost of having batteries to carry you through the night. It's actually much cheaper to do in space," he said.

The billionaire, who is edging close to trillionaire status, made the remarks days after his aerospace giant SpaceX acquired his AI company xAI to create a $1.25 trillion behemoth. The company said it is acquiring xAI to "form the most ambitious, vertically-integrated innovation engine on (and off) Earth, with AI, rockets, space-based internet, direct-to-mobile device communications and the world's foremost real-time information and free speech platform."
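Musk's "about five times" figure is roughly consistent with back-of-envelope numbers. A minimal sketch, assuming a typical ~25% annual capacity factor for ground solar and near-continuous sunlight in orbit - those two values are my assumptions; only the ~30% atmospheric loss comes from Musk's remarks:

```python
# Why a space solar panel can beat a ground panel by roughly 5x on
# average output. Ground output is measured relative to top-of-atmosphere
# sunlight, so it pays both the atmospheric loss and the day-night/weather
# penalty captured by the capacity factor.
ATMOSPHERIC_TRANSMISSION = 0.70   # ~30% of energy lost to the atmosphere (article)
GROUND_CAPACITY_FACTOR = 0.25     # assumed: average vs. peak output over a year
SPACE_CAPACITY_FACTOR = 0.99      # assumed: near-constant illumination in orbit

ground_output = ATMOSPHERIC_TRANSMISSION * GROUND_CAPACITY_FACTOR
space_advantage = SPACE_CAPACITY_FACTOR / ground_output

print(f"space advantage: ~{space_advantage:.1f}x")
```

With these round numbers the ratio lands between 5x and 6x, in line with the claim - though it says nothing about launch costs, radiation hardening, or heat rejection.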
On the podcast, Musk explained the multitude of issues facing the AI scaling problem. For space to dominate economically, three conditions must come together, according to Musk:

* Earth's power hits a hard ceiling as AI demand explodes
* Chip fabs like Musk's planned "TeraFab" outpace energy scaling
* Starship achieves thousands of launches yearly

If all of this happens, Musk said, his ecosystem wins. SpaceX alone can launch at that cadence, powering xAI with unlimited gigawatts annually while rivals scrap over turbines and grids. "The only place you can really scale is space," he stressed, eyeing hundreds of gigawatts launched yearly via Starship. Five years out, space AI could surpass all terrestrial capacity combined, lapping average US power draw (about 500 GW) repeatedly.

When asked about the chances of technical issues with data centres in space, Musk said, "At this point, we find our GPUs to be quite reliable. There's infant mortality, which you can obviously iron out on the ground. Once they start working and you're past the initial debug cycle, they're quite reliable."
Elon Musk claims space-based data centers will become more economical than terrestrial options within 36 months, leveraging superior solar efficiency and eliminating power bottlenecks. SpaceX merged with xAI in a $1.25-trillion deal to build orbital AI infrastructure, with Musk forecasting that AI capacity in orbit could surpass all Earth-based computing within five years.
Elon Musk has made a bold prediction that space will become the most economical location for AI data centers within 36 months, likely closer to 30 months. Speaking on the Dwarkesh Podcast with co-hosts Dwarkesh Patel and Stripe president John Collison, Musk argued that Earth faces insurmountable power bottlenecks that make scaling AI infrastructure increasingly difficult.[2][3] According to Musk, solar panels in space can generate about five times more power than those on the ground, thanks to the absence of atmospheric interference, day-night cycles, and weather conditions. "It's always sunny in space," Musk quipped, noting that the atmosphere alone causes about a 30% energy loss on Earth.[3]
Source: ET
The timing of Musk's prediction coincides with SpaceX's acquisition of his AI company xAI in a $1.25-trillion deal, creating what the company calls "the most ambitious, vertically-integrated innovation engine on (and off) Earth".[3] When asked if SpaceX will become an AI hyperscaler, Musk replied "Hyper-hyper," suggesting the company aims to launch more AI computing capacity than the cumulative amount on Earth combined.[2] SpaceX has already taken concrete steps toward this vision, launching a test satellite with an AI server from startup Starcloud in November and requesting FCC permission to launch up to 1 million solar-powered satellites designed as orbital data centers.[2]
Source: Fortune
Musk emphasized that the availability of solar energy represents the critical limiting factor for AI scaling on Earth. Global electricity output outside China remains "pretty close flat" while chip production surges, creating an impossible mismatch between AI computing demands and available power.[3] The utility industry cannot build power plants rapidly enough for AI needs, and the inability to manufacture gas turbines and wind turbines fast enough presents another bottleneck.[2] Space eliminates these constraints by providing unlimited access to solar power without the need for batteries to carry systems through the night, making it "actually much cheaper to do in space," according to Musk.[3]

Musk outlined an ambitious roadmap requiring approximately 10,000 launches per year - roughly one launch every hour, around the clock - to achieve his vision of space-based AI dominance.[2] This represents a dramatic increase from SpaceX's record of 165 orbital launches in 2024. Musk believes SpaceX could achieve this launch cadence with 20-30 Starship rockets, though the company plans to build more, potentially enabling 20,000-30,000 launches annually.[2] Looking five years ahead, Musk predicted that AI capacity in orbit will surpass all terrestrial AI infrastructure combined, with SpaceX launching and operating more AI in space every year than the cumulative total on Earth.[2]

SpaceX isn't alone in pursuing space-based data centers. Google has partnered with Earth-monitoring company Planet on Project Suncatcher to launch prototype satellites by next year.[1] OpenAI CEO Sam Altman reportedly considered buying rocket company Stoke Space to put data centers in orbit.[2] Nvidia-backed Starcloud sent a GPU payload into space last November and successfully trained and ran a large language model on it.[1] Startup Aetherflux, initially focused on beaming solar power from space, now intends to make a data center node in orbit available for commercial use early next year.[1]
Source: New Atlas
Despite the enthusiasm, orbital data centers face substantial obstacles. Space debris presents a significant hazard, with 6,600 tons of debris and 14,000-plus active satellites already in orbit that must be avoided, requiring fuel for maneuvering.[1] Heat dissipation from space-based data centers remains a challenge, along with the need for periodic maintenance by astronauts.[1] Additional concerns include protecting hardware from solar radiation and transmitting astronomical amounts of data from orbit to Earth. However, Deutsche Bank noted in a report that the challenges are more about engineering than physics.[2] Musk addressed reliability concerns by noting that GPUs prove quite reliable once past initial debugging cycles, with infant-mortality issues that can be resolved on the ground.[3]

The core appeal of space-based data centers lies in their scalability advantages. In orbit, operators face none of the constraints - limited real estate, fresh water supplies needed for cooling, or available power - that restrict terrestrial compute infrastructure.[1] Solar panels designed for space cost less than those built for Earth because they require less glass and hardening to withstand weather events.[2] Musk outlined three conditions that must converge for space to dominate economically: Earth's power must hit a hard ceiling as AI demand explodes, chip fabrication facilities must outpace energy scaling, and Starship must achieve thousands of launches yearly.[3] If these conditions align, Musk believes his ecosystem wins, with SpaceX powering xAI with unlimited gigawatts annually while competitors struggle with turbines and grids. Within five years, space AI could surpass all terrestrial capacity combined, repeatedly exceeding average US power consumption of roughly 500 GW.[3]

Summarized by Navi
11 Dec 2025•Technology
