SpaceX files plan for 1 million satellites as orbital AI data centers, but experts warn of major hurdles


Elon Musk announced plans to merge SpaceX and xAI to launch a constellation of 1 million satellites operating as orbital data centers. While experts acknowledge the concept isn't fantasy, they warn of devastating environmental effects and technical challenges including heat dissipation, cosmic radiation, and launch logistics that make meaningful scale unlikely before 2030.

SpaceX Files Ambitious Plan for Orbital AI Data Centers

Elon Musk has announced plans to merge two of his companies, SpaceX and xAI, to jointly launch a constellation of 1 million satellites that would function as AI data centers in space [1]. SpaceX filed an eight-page application with the Federal Communications Commission detailing the ambitious AI infrastructure project, proposing to place the satellites at altitudes between 500 km and 2,000 km [1]. The satellites would communicate with one another and with SpaceX's Starlink constellation using laser optical links, with Starlink satellites relaying inference requests to and from Earth [1].
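
A rough sense of why low orbits matter for serving inference: one-way light travel time scales with altitude. The figures below are back-of-envelope estimates, not numbers from the filing, and assume a satellite directly overhead.

```python
# Back-of-envelope sketch (assumptions mine, not from the FCC filing):
# one-way light delay from a ground user to a satellite straight overhead.

C = 299_792_458  # speed of light in vacuum, m/s

for altitude_km in (500, 2_000, 35_786):  # the proposed LEO range, plus GEO for contrast
    delay_ms = altitude_km * 1_000 / C * 1_000
    print(f"{altitude_km:>6} km altitude: ~{delay_ms:.1f} ms one-way")
```

At 500 to 2,000 km the propagation delay is a few milliseconds each way, versus more than a hundred milliseconds from geostationary orbit, one reason a LEO constellation is a plausible relay for interactive AI traffic.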

Source: Fortune

Musk isn't alone in exploring orbital computing. Alphabet CEO Sundar Pichai has said Google is exploring moonshot concepts for AI data centers in space later this decade, while former Google CEO Eric Schmidt has warned the industry is running out of electricity [2]. Even as technology companies are projected to spend more than $5 trillion globally on Earth-based data centers by the end of the decade, these tech leaders argue space-based infrastructure could provide a long-term solution [2].

The Case for AI Compute Power in Space

Musk and others argue that putting AI data centers in space makes practical sense given how much more solar energy panels can collect above Earth's atmosphere [1]. In space there are no clouds or weather events to obscure the sun, and in the right orbit, solar panels can collect sunlight through most of the day [1]. SpaceX has proposed putting the new constellation in sun-synchronous orbit, with the spacecraft flying along the terminator, the dividing line between the day and night sides of the planet [1].
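
To make that argument concrete, a small calculation helps: above the atmosphere a panel sees the full solar constant, and a terminator-hugging orbit keeps it lit almost around the clock. The ground-side figure below is an assumption for illustration, not a number from either source.

```python
# Rough comparison of solar energy per square meter of panel, orbit vs. ground.
# All inputs are illustrative assumptions.

SOLAR_CONSTANT = 1361     # W/m^2 above the atmosphere
SUNLIT_FRACTION = 0.99    # assumed: a terminator-following orbit is lit nearly all day
GROUND_SUN_HOURS = 5.5    # assumed: kWh/m^2/day at a good terrestrial solar site

space_kwh_per_m2_day = SOLAR_CONSTANT * 24 * SUNLIT_FRACTION / 1_000

print(f"orbit : ~{space_kwh_per_m2_day:.0f} kWh per m^2 per day")
print(f"ground: ~{GROUND_SUN_HOURS} kWh per m^2 per day")
print(f"ratio : ~{space_kwh_per_m2_day / GROUND_SUN_HOURS:.0f}x more energy collected in orbit")
```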

Source: Engadget

Musk has said that within three years space will be the cheapest place to generate AI compute power, as declining rocket launch costs meet the rising cost of powering AI data centers on Earth [1]. At the World Economic Forum meeting in Davos this January, he declared: "The lowest-cost place to put AI will be in space, and that will be true within two years, maybe three at the latest" [2].
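
The economics behind that claim can be sketched, very loosely, as a race between launch cost and electricity cost. Every figure below is an assumption made for illustration, and the comparison ignores the hardware itself, radiators, ground stations, and replacement launches.

```python
# Illustrative crossover arithmetic, not a quote from either article:
# spread the launch cost of a kilowatt of orbital capacity over its lifetime
# and compare it to paying for that kilowatt-hour on the ground.

LAUNCH_COST_PER_KG = 200      # assumed future $/kg to LEO
SAT_MASS_PER_KW = 20          # assumed kg of satellite hardware per kW of power delivered
LIFETIME_YEARS = 5            # assumed useful life on orbit
GROUND_PRICE_PER_KWH = 0.08   # assumed industrial electricity price, $/kWh

orbit_launch_cost_per_kwh = (LAUNCH_COST_PER_KG * SAT_MASS_PER_KW) / (LIFETIME_YEARS * 8_760)

print(f"launch cost per delivered kWh in orbit: ~${orbit_launch_cost_per_kwh:.3f}")
print(f"terrestrial electricity per kWh:        ~${GROUND_PRICE_PER_KWH:.2f}")
```

Under these particular assumptions the two land in the same ballpark, which is the shape of the argument; with today's launch prices or heavier satellites, the orbital number grows quickly.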

Jeff Thornburg, CEO of Portal Space Systems and a SpaceX veteran who led development of the company's Raptor engine, explained the urgency: "A lot of smart people really believe that it won't be too many years before we can't generate enough power to satisfy what we're trying to develop with AI. If that is indeed true, we have to find alternate sources of energy" [2].

Heat Dissipation Emerges as Critical Challenge

The plan was immediately greeted with skepticism from experts who question how SpaceX would cool millions of GPUs in space [1]. While the background temperature of deep space is around -450 degrees Fahrenheit, the reality is more complicated: in the near vacuum of space, the only way to shed heat is to radiate it away slowly, and in direct sunlight, objects can easily overheat [1]. As one commenter on Hacker News succinctly put it, "a satellite is, if nothing else, a fantastic thermos" [1].
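
The physics behind that "thermos" problem is the Stefan-Boltzmann law: a surface in vacuum can only shed heat as radiation, in proportion to its area and the fourth power of its temperature. A minimal sketch, with assumed temperatures and emissivity rather than any SpaceX design figures:

```python
# Radiator sizing sketch using the Stefan-Boltzmann law: P = e * sigma * A * (T^4 - T_env^4).
# Temperatures and emissivity below are illustrative assumptions.

SIGMA = 5.670e-8     # Stefan-Boltzmann constant, W/(m^2 K^4)
EMISSIVITY = 0.9     # assumed radiator surface emissivity
T_RADIATOR = 330.0   # assumed radiator temperature, K (warm electronics)
T_SINK = 255.0       # assumed effective sink temperature seen in low Earth orbit, K

def radiator_area_m2(heat_watts: float) -> float:
    """Radiator area needed to reject heat_watts by radiation alone."""
    net_flux = EMISSIVITY * SIGMA * (T_RADIATOR**4 - T_SINK**4)  # W per m^2
    return heat_watts / net_flux

for load_kw in (10, 100, 1_000):  # large satellite bus up to a data-center-rack-scale payload
    print(f"{load_kw:>5} kW of waste heat -> ~{radiator_area_m2(load_kw * 1_000):.0f} m^2 of radiator")
```

The takeaway is that radiator area grows linearly with the compute power on board, so the panels that collect the energy are only half of the surface-area problem.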

Scott Manley, a former software engineer who studied computational physics and astronomy, argues SpaceX has already solved that problem at a smaller scale with Starlink. He points to the company's latest V3 model, which carries about 30 square meters of solar panels and successfully dissipates the heat its electronics generate [1]. However, Kevin Hicks, a former NASA systems engineer who worked on the Curiosity rover mission, is more skeptical: "Satellites with the primary goal of processing large amounts of compute requests would generate more heat than pretty much any other type of satellite. Cooling them is another aspect of the design which is theoretically possible but would require a ton of extra work and complexity" [1].
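
The gap between those two views is mostly one of scale. A quick comparison, using assumed efficiency and rack-power figures rather than anything from either engineer:

```python
# Scale check: electrical power from ~30 m^2 of panels vs. the draw of a dense
# terrestrial AI rack, nearly all of which becomes heat that must be rejected.
# Efficiency and rack power are assumptions for illustration.

SOLAR_CONSTANT = 1361     # W/m^2 in sunlight above the atmosphere
PANEL_EFFICIENCY = 0.30   # assumed cell efficiency
PANEL_AREA_M2 = 30        # Starlink V3 figure cited above
RACK_POWER_KW = 100       # assumed order of magnitude for a dense GPU rack on Earth

sat_power_kw = SOLAR_CONSTANT * PANEL_EFFICIENCY * PANEL_AREA_M2 / 1_000

print(f"~{sat_power_kw:.0f} kW available from {PANEL_AREA_M2} m^2 of panels")
print(f"~{RACK_POWER_KW} kW drawn, and dissipated, by one dense terrestrial GPU rack")
print(f"-> roughly {RACK_POWER_KW / sat_power_kw:.0f} such satellites per rack-equivalent")
```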

Cosmic Radiation and Power Generation Concerns

Cosmic radiation poses another significant obstacle for orbital computing. NASA relies on ancient hardware like the PowerPC 750 CPU found inside the Perseverance rover because older chips feature larger transistors, making them more resilient to bit flips, errors in processing caused most often by cosmic radiation [1]. Benjamin Lee, professor of computer and information science at the University of Pennsylvania, explained: "My concern about radiation is that we don't know how many bit flips will occur when you deploy the most advanced chips and hundreds of gigabytes of memory up there" [1].

Google's Project Suncatcher explored this issue by bombarding one of its Trillium TPUs with a proton beam, finding the silicon was "surprisingly radiation-hard for space applications" [1]. Yet Lee cautions that we simply don't know how resilient GPUs are to radiation at the scale proposed [1].
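
Lee's point is partly one of arithmetic: even a tiny per-bit upset rate multiplies across terabits of memory and a million spacecraft. The rate used below is a placeholder assumption, not a measured figure, which is exactly the unknown he describes.

```python
# Bit-flip scaling sketch. The per-bit upset rate is an assumed placeholder;
# the point is that expected upsets grow linearly with bits and with fleet size.

BITS_PER_SATELLITE = 512e9 * 8   # assumed: 512 GB of memory per satellite
UPSET_RATE_PER_BIT_DAY = 1e-12   # assumed single-event-upset rate, upsets per bit per day
SATELLITES = 1_000_000

per_sat_per_day = BITS_PER_SATELLITE * UPSET_RATE_PER_BIT_DAY
fleet_per_day = per_sat_per_day * SATELLITES

print(f"~{per_sat_per_day:.1f} bit flips per satellite per day (at the assumed rate)")
print(f"~{fleet_per_day:,.0f} bit flips per day across the full constellation")
```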

The power generation challenge is equally daunting. Running AI data centers in orbit would require "ginormous" solar arrays that do not yet exist, according to Thornburg [2]. Boon Ooi, a professor at Rensselaer Polytechnic Institute, put the scale into stark perspective: generating just one gigawatt of power in space would require roughly one square kilometer of solar panels. "That's extremely heavy and very expensive to launch," he said [2].
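
That figure can be sanity-checked against the solar constant. With the efficiency and packing assumptions below, the arithmetic comes out at a few square kilometers of array per gigawatt, the same order of magnitude as Ooi's estimate and enormous either way.

```python
# Array area needed for 1 GW of electrical power above the atmosphere.
# Efficiency and packing factor are assumptions for illustration.

SOLAR_CONSTANT = 1361    # W/m^2
CELL_EFFICIENCY = 0.30   # assumed space-grade cell efficiency
PACKING_FACTOR = 0.9     # assumed fraction of array area that is active cell

watts_per_m2 = SOLAR_CONSTANT * CELL_EFFICIENCY * PACKING_FACTOR
area_km2 = 1e9 / watts_per_m2 / 1e6

print(f"~{watts_per_m2:.0f} W of electricity per m^2 of array")
print(f"~{area_km2:.1f} km^2 of array per gigawatt")
```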

Timeline Skepticism and Environmental Impacts

Many experts believe anything approaching meaningful scale remains decades away, especially as the bulk of AI investment continues to flow into terrestrial infrastructure, including Musk's own Colossus supercomputer in Memphis, which analysts estimate will cost tens of billions of dollars [2]. Kathleen Curlee, a research analyst at Georgetown University's Center for Security and Emerging Technology, stated bluntly: "We're being told the timeline for this is 2030, 2035—and I really don't think that's possible" [2].

Beyond technical feasibility, experts warn that executing this AI infrastructure project at the scale suggested could have devastating effects on the environment and on the sustainability of low Earth orbit [1]. The launch logistics alone raise concerns about space debris and the long-term viability of orbital operations. While the AI boom continues to drive demand for more computing power, constraints around heat dissipation, launch logistics, and cost make space a poor substitute for Earth-based data centers anytime soon [2].
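
The launch-cadence arithmetic illustrates why. With assumed figures for satellites per flight (the filing does not specify this), deploying a million spacecraft implies a launch tempo far beyond anything flown to date:

```python
# Launch-count sketch. Satellites-per-flight and cadence are assumptions for illustration.

TOTAL_SATELLITES = 1_000_000
SATS_PER_LAUNCH = 60       # assumed satellites per Starship flight
LAUNCHES_PER_YEAR = 365    # assumed sustained cadence of one flight per day

launches_needed = TOTAL_SATELLITES / SATS_PER_LAUNCH
years = launches_needed / LAUNCHES_PER_YEAR

print(f"~{launches_needed:,.0f} launches needed")
print(f"~{years:.0f} years of deployment at one launch per day")
```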

Thornburg acknowledged the hurdles are formidable even if the underlying physics is sound: "We know how to launch rockets; we know how to put spacecraft into orbit; and we know how to build solar arrays to generate power. But feasibility does not mean being able to build at speed or scale. I think it's always a question of how long it will take" [2]. The renewed interest in orbital computing reflects mounting pressure on the industry to find ways around the physical limits of Earth-based infrastructure, including strained power grids, rising electricity costs, and environmental concerns. Whether Starship and other advances can overcome those obstacles remains an open question.
