7 Sources
[1]
Four things we'd need to put data centers in space
SpaceX is the latest in a string of high-tech companies extolling the potential of orbital computing infrastructure. Last year, Amazon founder Jeff Bezos said that the tech industry will move toward large-scale computing in space. Google has plans to loft data-crunching satellites, aiming to launch a test constellation of 80 as early as next year. And last November Starcloud, a startup based in Washington State, launched a satellite fitted with a high-performance Nvidia H100 GPU, marking the first orbital test of an advanced AI chip. The company envisions orbiting data centers as large as those on Earth by 2030.
Proponents believe that putting data centers in space makes sense. The current AI boom is straining energy grids and adding to the demand for water, which is needed to cool the computers. Communities in the vicinity of large-scale data centers worry about increasing prices for those resources as a result of the growing demand, among other issues. In space, advocates say, the water and energy problems would be solved. In constantly illuminated sun-synchronous orbits, space-borne data centers would have uninterrupted access to solar power. At the same time, the excess heat they produce would be easily expelled into the cold vacuum of space. And with the cost of space launches decreasing, and mega-rockets such as SpaceX's Starship promising to push prices even lower, there could be a point at which moving the world's data centers into space makes sound business sense.
Detractors, on the other hand, tell a different story and point to a variety of technological hurdles, though some say it's possible they may be surmountable in the not-so-distant future. Here are four of the must-haves we'd need to make space-based data centers a reality.
AI data centers produce a lot of heat. Space might seem like a great place to dispel that heat without using up massive amounts of water. But it's not so simple.
To get the power needed to run 24-7, a space-based data center would have to be in a constantly illuminated orbit, circling the planet from pole to pole, and never hide in Earth's shadow. And in that orbit, the temperature of the equipment would never drop below 80 °C, which is way too hot for electronics to operate safely in the long term. Getting the heat out of such a system is surprisingly challenging. "Thermal management and cooling in space is generally a huge problem," says Lilly Eichinger, CEO of the Austrian space tech startup Satellives.
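The equilibrium problem Eichinger points to is just the Stefan–Boltzmann law: in vacuum, a panel can shed heat only by radiating, at a rate proportional to the fourth power of its temperature. A minimal sizing sketch follows; the emissivity, the double-sided panel, and the neglect of absorbed sunlight and Earth-shine are simplifying assumptions, and real thermal loading makes the problem harder:

```python
# Radiator area needed to reject waste heat purely by radiation
# (Stefan-Boltzmann law). Assumptions: emissivity 0.9, panel radiates
# from both faces, 3 K deep-space sink, solar/Earth IR loading ignored.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiator_area_m2(power_w, temp_c, emissivity=0.9, sides=2):
    """Panel area needed to radiate `power_w` at temperature `temp_c`."""
    t = temp_c + 273.15
    t_sink = 3.0  # cosmic background, effectively negligible
    flux = emissivity * SIGMA * (t**4 - t_sink**4)  # W radiated per m^2 per face
    return power_w / (flux * sides)

# 1 MW of waste heat with electronics held near 20 C:
print(f"{radiator_area_m2(1e6, 20):,.0f} m^2")
# A hotter panel sheds heat far more efficiently (T^4 scaling):
print(f"{radiator_area_m2(1e6, 80):,.0f} m^2")
```

At 20 °C this lands north of a thousand square metres per megawatt, which is why radiator area, not solar-array area, tends to dominate orbital data-center concepts.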
[2]
Musk wants SpaceX IPO to fund AI space data centers. Microsoft's undersea setback sounds warning.
LOS ANGELES, April 1 (Reuters) - SpaceX on Wednesday filed for an IPO that Elon Musk says will bankroll an effort to turn the rocket maker into an AI powerhouse, launching up to 1 million data-center satellites into orbit to bypass power and water limits on Earth.
Microsoft (MSFT.O) had a similar ambition to escape land-based computing constraints in 2015, when it lowered a shipping-container-sized data center onto the seabed off Scotland, aiming to cut energy use through natural seawater cooling and tapping offshore wind and tidal power. Microsoft's "Project Natick," once touted as a potential breakthrough for the data-center industry, successfully met all its technical targets, but underwater data centers were abandoned more than two years ago due to a lack of client demand and unviable economics, two sources with knowledge of the project told Reuters. Asked for comment, a Microsoft spokesperson said: "While we don't currently have datacenters in the water, we will continue to use Project Natick as a research platform to explore, test, and validate new concepts around datacenter reliability and sustainability."
Five data center specialists told Reuters that what went wrong for Microsoft is a cautionary tale for SpaceX because although both projects are a world apart geographically, they share key similarities: they both rely on modular units that are expensive to deploy and cannot be expanded, repaired or upgraded - features considered critical by the AI industry. "These problems are likely to be more severe in space than under the sea," said Roy Chua, founder of industry research firm AvidThink, pointing to unresolved questions over how to cool data centers in orbit, high rocket launch costs and the effects of the harsh space environment on AI chips. SpaceX did not respond to a request for comment.
SpaceX, which acquired Musk's AI startup xAI in February, could raise up to $75 billion when it goes public, making it potentially the largest IPO in history. The holdings of xAI include social media company X, formerly Twitter, and AI chatbot Grok.
MUSK'S SPACE AMBITIONS FACE HURDLES
Although Microsoft proved that undersea data centers could work, customers were not interested in scaling them, instead expanding conventional land-based facilities that allowed cheaper, faster upgrades as AI development accelerated, the two people with knowledge of the project said, asking not to be named due to the sensitivity of the matter. The sealed, "locked-for-life" design - which SpaceX would replicate in orbit - has limited flexibility, since AI chips are rapidly improving every year, while a satellite or undersea data center might be replaced only every five to seven years.
The economics were also a stumbling block, the two people said. Deploying data centers under the sea was more expensive than building on land, and while those costs might have fallen at scale, doing so would have required tens of billions of dollars in investment. Space will be far more expensive. Analysts at MoffettNathanson, an independent U.S. equity research firm, said in a February research note that Musk's plan to put a million AI satellites in space would run into the trillions of dollars. In order for data centers in space to become commercially viable, launch costs would need to fall from today's low thousands of dollars per kilogram to the low hundreds of dollars per kilogram, analysts say. "The problem is not whether something can work, but whether it makes sense economically versus simply building more capacity on the ground," said Tim Farrar, an independent satellite industry analyst at TMF Associates.
Musk says he will overcome the technical and financial hurdles, including radiation exposure, heat management in a vacuum and the need for frequent hardware replacement, by sharply lowering launch costs and developing more resilient AI chips. Demand will not be an issue, Musk says, because Earth's energy resources will quickly be depleted as AI is needed to support a world where robots outnumber humans, all cars drive themselves and space travel becomes routine. "The idea that we just can't solve problems on Earth, like power shortages and environmental issues, strikes me as unrealistically negative about Earth to try and make everything seem better in space," Farrar said.
Musk's case hinges on Starship, SpaceX's next-generation rocket, which is designed to be fully reusable and carry far larger payloads than SpaceX's Falcon rockets. But Starship is years behind schedule and has suffered explosive setbacks in some of its 11 suborbital test flights since 2023. MoffettNathanson estimates that achieving Musk's goal would require 3,000 Starship launches a year, or eight per day.
Jeff Bezos' space company Blue Origin is also backing orbital data centers. The rocket company said in March that its Project Sunrise concept would add AI computing capacity in orbit, tapping clean solar power while preserving terrestrial data-center infrastructure. Blue Origin did not respond to a request for further comment.
SPACE AI COULD BE NICHE BUSINESS
Space data centers do have a future, but it is more likely to complement ground-based data centers, said Claude Rousseau, a research director at Analysys Mason who tracks satellite markets. "I strongly believe that there'll be no way in the foreseeable future that space-based data centers can replace ground data centers," Rousseau said, adding that it would be a more niche industry serving infrastructure in orbit, like military satellite constellations and space stations.
For instance, the International Space Station already hosts experimental systems designed to process data in orbit and reduce reliance on downlink bandwidth. Speaking on the All-In podcast in February, Nvidia Chief Executive Jensen Huang said the economics of space-based AI data centers remain unattractive. "We should definitely work on the ground first because we're already here," Huang said, describing orbital AI infrastructure as a longer-term engineering challenge rather than a near-term solution. Chua said schemes to move data centers under the sea or into space risk trying to escape problems on Earth and creating a whole new set of harder challenges. "There are many problems that we can solve on Earth before space," Chua said, pointing to gains in AI chip efficiency, better water recycling, and expanded use of solar power and modular nuclear power generation.
Reporting by Joe Brock in Los Angeles; Editing by Matthew Lewis
[3]
Space data centres: SpaceX and Blue Origin race to orbit while scientists question the physics
The pitch is seductive in its simplicity: AI needs more power than terrestrial grids can supply, so move the data centres into orbit, where the sun never sets and the electricity is free. SpaceX, Blue Origin, and a growing constellation of startups are now racing to make that vision real. The problem, according to the scientists and engineers who would have to make the physics work, is that the vision skips several chapters of thermodynamics, economics, and orbital mechanics that have not yet been written. SpaceX filed with the Federal Communications Commission on 30 January for permission to launch up to one million satellites into low Earth orbit, each carrying computing hardware that would collectively form what the company described as a constellation with "unprecedented computing capacity to power advanced artificial intelligence models." The satellites would operate at altitudes between 500 and 2,000 kilometres, in orbits designed to maximise time in sunlight, and route traffic through SpaceX's existing Starlink network. SpaceX requested a waiver of the FCC's standard deployment milestones, which typically require half a constellation to be operational within six years. Seven weeks later, Blue Origin filed its own application. Project Sunrise proposes 51,600 satellites in sun-synchronous orbits between 500 and 1,800 kilometres, complemented by the previously announced TeraWave constellation of 5,408 satellites providing ultra-high-speed optical backhaul. Where SpaceX's filing emphasised raw scale, Blue Origin's emphasised architecture: the system would perform computation in orbit and relay results to the ground through TeraWave's mesh network. The startup ecosystem is moving even faster. Starcloud, formerly Lumen Orbit, raised $170 million at a $1.1 billion valuation in March, becoming the fastest unicorn in Y Combinator history just 17 months after completing the programme. 
The company launched its first satellite carrying an Nvidia H100 GPU in November 2025 and filed with the FCC in February for a constellation of up to 88,000 satellites. Aethero, a defence-focused startup building space-grade computers with Nvidia Orin NX chips wrapped in radiation shielding, raised $8.4 million and is testing hardware on orbit this year. The commercial logic rests on a genuine problem. Global data centre electricity consumption reached roughly 415 terawatt-hours in 2024 and the International Energy Agency projects it could exceed 1,000 TWh by 2026, with accelerated AI servers driving 30 per cent annual growth. In Virginia alone, data centres consume 26 per cent of total electricity supply. Ireland's share could reach 32 per cent by year's end. The grid constraints are real, the permitting delays are real, and the political resistance to building more terrestrial capacity is real. What is also real, scientists argue, is the physics that makes orbital computing spectacularly difficult at any meaningful scale. The most fundamental challenge is heat. In space, there is no air to carry heat away from processors, only radiative cooling, which requires vast surface areas. Dissipating just one megawatt of thermal energy while keeping electronics at a stable 20 degrees Celsius demands approximately 1,200 square metres of radiator, roughly four tennis courts. A several-hundred-megawatt data centre, the minimum threshold for commercial relevance, would require radiators thousands of times larger than anything ever deployed on the International Space Station. Radiation presents the second structural problem. Low Earth orbit exposes unshielded chips to cosmic rays and trapped particles that induce bit flips and permanent circuit damage. Radiation hardening adds 30 to 50 per cent to hardware costs and reduces performance by 20 to 30 per cent. 
The alternative, triple modular redundancy, means launching three copies of every chip, three times the cooling, three times the electricity, and three times the mass. Starcloud's approach of flying commercial GPUs with external shielding is an interesting experiment, but no one has demonstrated that it works at scale or over hardware lifetimes measured in years rather than months. Latency is the third constraint. A million satellites spread across orbital shells from 500 to 2,000 kilometres cannot achieve the tight coupling required for frontier model training, where inter-node communication latencies must remain in the microsecond range. Low Earth orbit introduces minimum latencies of several milliseconds for inter-satellite links and 60 to 190 milliseconds for ground-to-orbit round trips, compared to 10 to 50 milliseconds for terrestrial content delivery networks. That makes orbital infrastructure potentially viable for inference workloads, not for training, which is where the overwhelming majority of AI compute demand currently sits. Then there is cost. IEEE Spectrum estimated that a one-gigawatt orbital data centre would cost upwards of $50 billion, roughly three times the cost of an equivalent terrestrial facility including five years of operation. Google has said that launch costs must fall to under $200 per kilogram before space-based computing begins to make economic sense. SpaceX's current Starlink economics operate at roughly $1,000 to $2,000 per kilogram. Some analysts argue the true threshold for competing with terrestrial refresh economics is $20 to $30 per kilogram, a figure no credible projection places within the next two decades. The economics look even less favourable when set against the deep-tech funding landscape on the ground, where terrestrial infrastructure projects can draw on established supply chains and proven unit economics. 
Even OpenAI's Sam Altman, who explored a multibillion-dollar investment in rocket maker Stoke Space as a potential SpaceX competitor for orbital data centres, has publicly called the concept "ridiculous" for the current decade. Altman told journalists that the rough maths of launch costs relative to terrestrial power costs simply does not work yet, and he pointedly asked how anyone plans to fix a broken GPU in space. The astronomical community adds a separate objection entirely. The vast majority of the roughly 1,000 public comments on SpaceX's FCC filing urged the commission not to proceed. If approved, the constellation would place more satellites than visible stars in the sky for large portions of the night throughout the year, further militarising and commercialising an orbital environment that is already straining under the weight of existing megaconstellations. None of this means orbital data centres will never exist. SpaceX's Starship, if it achieves its cost targets, could fundamentally change the mass-to-orbit economics that currently make the concept unworkable. Starcloud's incremental approach of flying small payloads and iterating on radiation performance is the kind of engineering pathway that occasionally produces breakthroughs. And the terrestrial grid constraints driving the interest are not going away. But the gap between filing an FCC application for a million satellites and actually making orbital computation economically competitive with a warehouse full of GPUs in Iowa is not measured in years. It is measured in physics problems that the current pace of AI infrastructure investment cannot shortcut, no matter how many billionaires are willing to try. The question scientists are asking is not whether space data centres are theoretically possible. It is why, given the magnitude of the unsolved engineering, anyone is treating them as a near-term solution to a problem that requires near-term answers. The sky, it turns out, is not the limit. 
The radiator is.
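The per-kilogram thresholds quoted above ($1,000 to $2,000 per kilogram today, Google's $200 bar, and the $20 to $30 refresh-parity estimate) can be turned into a rough launch bill. The specific mass of 10 kg of satellite per kW of IT power is a hypothetical placeholder for illustration, not a figure from any of these filings:

```python
# Back-of-envelope launch bill at the per-kilogram prices in the article.
# The specific mass (kg of satellite per kW of IT power) is a hypothetical
# assumption for illustration, not a figure from any filing.
def launch_bill_usd(power_kw, usd_per_kg, kg_per_kw=10.0):
    """Total launch cost for a plant of `power_kw` at a given $/kg price."""
    return power_kw * kg_per_kw * usd_per_kg

ONE_GW_KW = 1_000_000
for usd_per_kg in (1_500, 200, 25):  # roughly: today, Google's bar, refresh parity
    bill = launch_bill_usd(ONE_GW_KW, usd_per_kg)
    print(f"${usd_per_kg}/kg -> ${bill / 1e9:,.1f}B just to reach orbit")
```

Even at the most optimistic price, launch is only one line item; the hardware, radiators, and optical links sit on top of it, which is how estimates like the $50 billion per gigawatt figure arise.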
[4]
Big tech's next move is to put data centers in space. Can it work?
Standing before a friendly crowd in March, Elon Musk laid out his plan for the future of his companies, and it was literally out of this world. Musk announced that his space-launch company, SpaceX, which had recently merged with his artificial intelligence company, xAI, would put data centers into orbit around the Earth. It all comes down to electricity, he explained. "You're power constrained on Earth," he said. "Space has the advantage that it's always sunny." Musk envisions legions of data-crunching satellites spinning around the planet, powering the AI revolution from above. It's the perfect pitch for taking SpaceX public. This week, Bloomberg reported that the company had filed documents confidentially to the Securities and Exchange Commission with the goal of going public this summer. Musk also claims it makes financial sense. "I actually think that the cost of deploying AI in space will drop below the cost of terrestrial AI much sooner than most people expect," he said. "I think it may be only two or three years." Others are skeptical. Musk's timeline is "an optimistic interpretation," according to Brandon Lucia, a professor of electrical and computer engineering at Carnegie Mellon University who specializes in putting computers on satellites. The napkin math looks appealing, and power is free up there after all -- but it turns out there are a lot of obstacles to building a data center among the stars. Here on Earth, the problem is glaring: AI is gobbling up electricity around the globe. Global data-center power consumption is expected to roughly double to nearly 1,000 terawatt-hours by the end of the decade, according to an estimate by the International Energy Agency. To fill the gap, some companies are building dedicated gas turbines, while others are investing in nuclear technology. It's not enough, according to Philip Johnston, CEO and co-founder of Starcloud, which is seeking to build orbital data centers. 
"We're very quickly running up on constraints on where you can build new energy projects terrestrially," Johnston said. "Within six months, they'll just be leaving chips in warehouses because they don't have power for turning them on." Starcloud launched its first spacecraft last fall with an Nvidia H100 chip on board. The company demonstrated the ability to run a version of Google's Gemini AI from space, and it plans to launch a second spacecraft in October. "That one has 100 times the power generation of the first one," Johnston said, though it's still expected to generate only around 8 kilowatts of power.
Google is also pursuing the idea of building data centers in space through a project known as Suncatcher. It envisions an 81-satellite cluster that it plans to build in partnership with the satellite-imagery company Planet. Two prototype satellites will launch in early 2027, according to the companies. "Orbital data centers are an idea whose time has come," Will Marshall, Planet's CEO, wrote to NPR in an email. "When exactly it will be more cost efficient than terrestrial ones is debatable but now is the time to be working on this."
To go from a handful of prototype satellites to something useful is not so easy. For one thing, the power requirements of the microchips used for artificial intelligence are enormous. To get a sense of just how much power is needed, consider the largest power-producing facility in space right now: the International Space Station (ISS). The solar panels of the ISS are around half the size of a football field and produce around 100 kilowatts of average power, according to Olivier de Weck, a professor of astronautics at the Massachusetts Institute of Technology. "It's basically the amount of power that a single big car engine produces." To replicate a 100-megawatt data center in space would require a facility that's 500 to 1,000 times larger, depending on the orbit. "Is that feasible?
Yeah, I think it's feasible, but not next year and certainly not in three years," he said. And power is not the only requirement; the satellites also have to provide cooling to the microchips. While it's true that space is cold, it's also a vacuum. This means that when a satellite gets hot, there's no easy way to get rid of that heat -- it just builds up. "All of that heat that the computer generates has to be dispelled," said Rebekah Reed, a former NASA official now at Harvard University's Belfer Center for Science and International Affairs. The best solution is radiators, which move liquids out to giant panels where the heat can be dissipated. So in addition to solar panels, an AI satellite would need another set of large radiators. "When you put those massive radiators together with massive solar arrays that are required in order to power and cool, you're actually talking about really large satellites, or very, very large satellite constellations," Reed said. An alternative is to build smaller satellites and fly them in preset formations called constellations. Such constellations allow the heat and power problems to be distributed, but to work, the satellites would need to send huge amounts of data back and forth. That likely means using lasers to beam data between satellites. But even moving at the speed of light, the time it takes to get data from one satellite to another is long enough to slow down computing. Google's Project Suncatcher proposes flying groupings of satellites in extremely tight clusters to reduce that latency. Musk, meanwhile, has proposed launching upward of a million satellites and placing them in orbit around Earth's poles. He recently unveiled the first generation "AI Sat Mini" spacecraft -- with solar arrays spanning roughly 180 meters (about 600 feet) -- during his presentation. Launching all that into space would cost money -- lots of money. At the moment, it can cost around $1,000 per kilogram to launch a satellite into orbit. 
Google believes that cost must drop by at least a factor of five to $200 per kilogram before data centers in space will begin to make sense. Musk thinks he can do it with his new Starship rocket, which is still in development. Starcloud's Johnston says Starship is central to more than just SpaceX's vision. He told investors: "If you don't think Starship's going to work, don't invest in us -- that's totally fine." Even if a company could get a data center into space, running it would involve a lot more than just moving microchips into orbit. Data centers on Earth are not just static buildings full of chips humming away, says Raul Martynek, the CEO of DataBank, a company that maintains 75 data centers, primarily located in the United States. They require constant maintenance and upgrades, all of which is done by workers. Take DataBank's IAD1 data center in Ashburn, Virginia. The facility is 144,000 square feet filled with rows and rows of black computer cabinets, which are filled with microprocessors. It's fairly run-of-the-mill, as these facilities go, but it still consumes around 13 megawatts of power at any given moment (that's 130 times more than the International Space Station). "We have vendors here every single day," says James Mathes, who manages IAD1. Workers are constantly in and out of these data centers, installing new servers, upgrading microchips and fixing things. And to stay competitive, space data centers would need to do much of the same. Some of that could be done through software, and Musk points out that chips can be rigorously tested on the ground before they're sent aloft. But the fact remains that the companies that rent data centers often want to access them physically for one reason or another. Martynek, who has spent decades in telecom, says he's not worried about space data centers taking business from his company. 
"It seems like there's a lot of ifs and a lot of advancements that would have to occur, and I find it kind of hard to believe that all that could happen in two or three years," he said. "No one in data center land is losing any sleep."
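De Weck's 500-to-1,000× comparison can be sanity-checked by scaling the ISS arrays linearly. The array area of roughly 2,500 square metres is an assumed round number for "half a football field"; degradation, pointing losses, and eclipse time are ignored:

```python
# Scaling the ISS solar arrays (the largest power system flown to date)
# to data-centre power levels, following de Weck's comparison.
# The 2,500 m^2 array area is an assumed round number for "half a
# football field"; degradation and eclipse time are ignored.
ISS_POWER_KW = 100        # average ISS electrical power, per the article
ISS_ARRAY_M2 = 2_500      # assumed total array area

def array_area_m2(target_kw):
    """Naive linear scaling of ISS-class arrays to a target power."""
    return ISS_ARRAY_M2 * target_kw / ISS_POWER_KW

print(f"{array_area_m2(100_000):,.0f} m^2 of arrays for a 100 MW plant")
```

A 100 MW target is 1,000 times the ISS's average output, so under linear scaling the arrays alone would cover millions of square metres, before any radiators are added.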
[5]
Google CEO Sundar Pichai says we're just a decade away from a new normal of extraterrestrial data centers | Fortune
Google's "moonshot" aspirations to expand its AI footprint are taking on a more literal meaning. CEO Sundar Pichai said in a Fox News interview in December that Google will soon begin construction of AI data centers in space. The tech giant announced Project Suncatcher late last year, with the goal of finding more efficient ways to power energy-guzzling centers, in this case with solar power. "One of our moonshots is to, how do we one day have data centers in space so that we can better harness the energy from the sun that is 100 trillion times more energy than what we produce on all of Earth today?" Pichai said. Google will take its first steps in constructing extraterrestrial data centers in early 2027 in partnership with satellite imagery firm Planet, launching two pilot satellites to test the hardware in Earth's orbit. According to Pichai, space-based data centers will be the new standard in the near future. "But there's no doubt to me that a decade or so away we'll be viewing it as a more normal way to build data centers," he said.
Google isn't the only company looking to the skies for an answer to improving data center efficiency. Earlier this year, SpaceX sought permission to launch as many as 1 million satellites into Earth's orbit, part of a bigger goal of launching a solar-powered satellite network to "accommodate the explosive growth of data demands driven by AI," according to a filing with the Federal Communications Commission. In December 2025, Y Combinator and Nvidia-backed startup Starcloud sent its first AI-equipped satellite to space. CEO and cofounder Philip Johnston predicted extraterrestrial data centers will produce 10 times lower carbon emissions than their earthbound counterparts, even taking into account the emissions from launch.
While the cost of satellites used to test AI hardware in space has decreased drastically, putting extraterrestrial data center development within reach, the cost of building these solar-powered centers is still an unknown, particularly as earthbound data centers are expected to require more than $5 trillion in capital expenditures by 2030, according to an April 2025 McKinsey report. Google, which catapulted itself back into the AI front-runner conversation with the recent release of Gemini 3, is one of several major hyperscalers pouring money into data centers to expand its computing capabilities. Alphabet, Google's parent company, said in February it would spend $175 billion to $185 billion this year in capital expenditures, primarily to build out AI infrastructure. All the while, speculation of an AI bubble threatens to create an oversupply of data centers, which could render the data center space race a dangerous over-investment. Moreover, with the technology quickly developing, there's a risk data centers under construction now could have out-of-date equipment by the time they are completed. Hyperscalers, including Alphabet, are taking an even greater risk by financing their AI buildouts with debt. In 2025, Alphabet, Amazon, Oracle, Meta and Microsoft issued $121 billion in new debt through bonds. That's compared to $40 billion in new debt in 2020. "The stakes are high," the McKinsey report said. "Overinvesting in data center infrastructure risks stranding assets, while underinvesting means falling behind." Harnessing solar energy to power data centers has become increasingly appealing amid growing concerns about the sustainability of expanding AI compute, which requires an exorbitant amount of power. A December 2024 U.S. Department of Energy report on domestic data center usage found data center load has tripled in the past 10 years and may double or triple again by 2028. 
These data centers consumed more than 4% of the country's electricity in 2023, and are predicted to consume up to 12% of U.S. electricity by 2028, according to the report. Google alone has more than doubled its electricity consumption on data center use in the past five years, using 30.8 million megawatt-hours of electricity last year compared to 14.4 million in 2020, when it began specifically tracking data center energy consumption, according to its latest sustainability report released in June 2025. Google has worked to reduce the energy needed to power its growing data centers, reporting it reduced its data center energy emissions by 12% in 2024, despite an increasing footprint.
However, concerns about the feasibility and timeline of extraterrestrial data center expansion remain. Amazon Web Services CEO Matt Garman poured cold water on data centers in space at a tech conference in San Francisco in February: "I don't know if you've seen a rack of servers lately: They're heavy. And last I checked, humanity has yet to build a permanent structure in space. So ... maybe." Others have warned about future sustainability concerns of an AI buildout expanding beyond Earth, suggesting the AI space race may not happen for decades. "There is still much we don't know about the environmental impact of AI, but some of the data we do have is concerning," Golestan Radwan, United Nations Environment Programme chief digital officer, said in a 2024 statement following the program's note warning of the environmental impact of AI infrastructure expansion. "We need to make sure the net effect of AI on the planet is positive before we deploy the technology at scale."
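The Google figures quoted above imply a steep compound growth rate; a quick check (treating the 14.4-million-to-30.8-million MWh change as exactly five years, which is an approximation):

```python
# Implied annual growth of Google's data-centre electricity use, from the
# figures quoted above (14.4M MWh in 2020 to 30.8M MWh five years later;
# the exact five-year span is an approximation).
def cagr(start, end, years):
    """Compound annual growth rate."""
    return (end / start) ** (1 / years) - 1

print(f"{cagr(14.4, 30.8, 5):.1%} per year")  # roughly 16% annually
```

A growth rate in that range doubles consumption roughly every four and a half years, which is consistent with the DOE projection of a further doubling or tripling by 2028.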
[6]
Datacenters in space - Is this technologically viable? By Investing.com
Investing.com -- As the global race for artificial intelligence computing power collides with a mounting energy crisis on Earth, the concept of "datacenters in space" is moving from science fiction to serious analyst debate. A new report from Bernstein explores whether orbiting servers could solve the massive power and land-use constraints facing terrestrial datacenters, or if the laws of physics and launch economics remain insurmountable hurdles.
The orbital opportunity: abundant power and no NIMBYs
The primary allure of space-based compute is the "easy availability" of solar energy. In Low Earth Orbit (LEO), power is abundant and free from the regulatory and "Not In My Backyard" (NIMBY) hurdles that have slowed datacenter construction in traditional hubs. Terrestrial grids are under strain from the Iran conflict and rising energy costs; hence, the prospect of a "power-independent" AI cluster is increasingly attractive to hyperscalers. Bernstein analysts point out that while power is "free" once you are in orbit, the cost of getting there remains a significant challenge. The viability of the space projects depends heavily on whether the lifetime compute value of a satellite can offset the exorbitant upfront costs of launch and specialized hardware.
The cooling constraint: physics vs. ambition
The most significant technical challenge identified in the report is thermal management. On Earth, datacenters rely on liquid cooling or airflow to dissipate the immense heat generated by AI GPUs. In the vacuum of space, heat can only be removed via radiation, which is a far less efficient process. "The question of viability remains," the report notes, specifically whether a satellite can house enough computing power to be useful while remaining small enough to launch and cool effectively. Space-based datacenters will become "technologically viable", the report suggests, only after scientists achieve massive advancements in radiative cooling and miniaturized power systems.
Without these, the idea risks remaining a "pie in the sky" concept rather than a functional solution to the AI power crunch.
[7]
SpaceX's orbital data centers could face same hurdles as Microsoft's abandoned undersea project
LOS ANGELES, April 1 (Reuters) - SpaceX on Wednesday filed for an IPO that Elon Musk says will bankroll an effort to turn the rocket maker into an AI powerhouse, launching up to 1 million data-center satellites into orbit to bypass power and water limits on Earth.

Microsoft had a similar ambition to escape land-based computing constraints in 2015, when it lowered a shipping-container-sized data center onto the seabed off Scotland, aiming to cut energy use through natural seawater cooling and tapping offshore wind and tidal power. Microsoft's "Project Natick," once touted as a potential breakthrough for the data-center industry, successfully met all its technical targets, but underwater data centers were abandoned more than two years ago due to a lack of client demand and unviable economics, two sources with knowledge of the project told Reuters.

Asked for comment, a Microsoft spokesperson said: "While we don't currently have datacenters in the water, we will continue to use Project Natick as a research platform to explore, test, and validate new concepts around datacenter reliability and sustainability."

Five data center specialists told Reuters that what went wrong for Microsoft is a cautionary tale for SpaceX: although the two projects are a world apart geographically, they share key similarities. Both rely on modular units that are expensive to deploy and cannot be expanded, repaired or upgraded - features considered critical by the AI industry. "These problems are likely to be more severe in space than under the sea," said Roy Chua, founder of industry research firm AvidThink, pointing to unresolved questions over how to cool data centers in orbit, high rocket launch costs and the effects of the harsh space environment on AI chips. SpaceX did not respond to a request for comment.

SpaceX, which acquired Musk's AI startup xAI in February, could raise up to $75 billion when it goes public, making it potentially the largest IPO in history.
The holdings of xAI include social media company X, formerly Twitter, and AI chatbot Grok.

MUSK'S SPACE AMBITIONS FACE HURDLES

Although Microsoft proved that undersea data centers could work, customers were not interested in scaling them, instead expanding conventional land-based facilities that allowed cheaper, faster upgrades as AI development accelerated, the two people with knowledge of the project said, asking not to be named due to the sensitivity of the matter. The sealed, "locked-for-life" design - which SpaceX would replicate in orbit - has limited flexibility, since AI chips are rapidly improving every year, while a satellite or undersea data center might be replaced only every five to seven years.

The economics were also a stumbling block, the two people said. Deploying data centers under the sea was more expensive than building on land, and while those costs might have fallen at scale, doing so would have required tens of billions of dollars in investment. Space will be far more expensive. Analysts at MoffettNathanson, an independent U.S. equity research firm, said in a February research note that Musk's plan to put a million AI satellites in space would run into the trillions of dollars. In order for data centers in space to become commercially viable, launch costs would need to fall from today's low thousands of dollars per kilogram to the low hundreds of dollars per kilogram, analysts say.

"The problem is not whether something can work, but whether it makes sense economically versus simply building more capacity on the ground," said Tim Farrar, an independent satellite industry analyst at TMF Associates. Musk says he will overcome the technical and financial hurdles, including radiation exposure, heat management in a vacuum and the need for frequent hardware replacement, by sharply lowering launch costs and developing more resilient AI chips.
Demand will not be an issue, Musk says, because Earth's energy resources will quickly be depleted as AI is needed to support a world where robots outnumber humans, all cars drive themselves and space travel becomes routine. "The idea that we just can't solve problems on Earth, like power shortages and environmental issues, strikes me as unrealistically negative about Earth to try and make everything seem better in space," Farrar said.

Musk's case hinges on Starship, SpaceX's next-generation rocket, which is designed to be fully reusable and carry far larger payloads than SpaceX's Falcon rockets. But Starship is years behind schedule and has suffered explosive setbacks in some of its 11 suborbital test flights since 2023. MoffettNathanson estimates that achieving Musk's goal would require 3,000 Starship launches a year, or eight per day.

Jeff Bezos' space company Blue Origin is also backing orbital data centers. The rocket company said in March that its Project Sunrise concept would add AI computing capacity in orbit, tapping clean solar power while preserving terrestrial data-center infrastructure. Blue Origin did not respond to a request for further comment.

SPACE AI COULD BE NICHE BUSINESS

Space data centers do have a future, but they are more likely to complement ground-based data centers, said Claude Rousseau, a research director at Analysys Mason who tracks satellite markets. "I strongly believe that there'll be no way in the foreseeable future that space-based data centers can replace ground data centers," Rousseau said, adding that it would be a more niche industry serving infrastructure in orbit, like military satellite constellations and space stations. For instance, the International Space Station already hosts experimental systems designed to process data in orbit and reduce reliance on downlink bandwidth.
Speaking on the All-In podcast in February, Nvidia Chief Executive Jensen Huang said the economics of space-based AI data centers remain unattractive. "We should definitely work on the ground first because we're already here," Huang said, describing orbital AI infrastructure as a longer-term engineering challenge rather than a near-term solution. Chua said schemes to move data centers under the sea or into space risk trying to escape problems on Earth and creating a whole new set of harder challenges. "There are many problems that we can solve on Earth before space," Chua said, pointing to gains in AI chip efficiency, better water recycling, and expanded use of solar power and modular nuclear power generation. (Reporting by Joe Brock in Los Angeles; Editing by Matthew Lewis)
SpaceX filed to launch up to 1 million satellites for orbital computing, while Google plans its Project Suncatcher test in 2027. The companies promise that solar-powered space data centers will meet AI's escalating electricity demands, but scientists warn of unresolved challenges in thermal management, radiation shielding, and economics that could keep the vision unviable for years.
SpaceX filed with the Federal Communications Commission on January 30 for permission to launch up to 1 million satellites into low Earth orbit, each carrying computing hardware that would collectively form what the company described as a constellation with "unprecedented computing capacity to power advanced artificial intelligence models" [3]. The satellites would operate at altitudes between 500 and 2,000 kilometers, in orbits designed to maximize time in sunlight, and route traffic through SpaceX's existing Starlink network [3]. Elon Musk announced in March that SpaceX, which recently merged with his AI company xAI, would pursue placing data centers in space to bypass power constraints on Earth [4]. SpaceX filed for an IPO that could raise up to $75 billion, potentially making it the largest IPO in history, with proceeds intended to bankroll the orbital computing effort [2].
Source: NPR
Blue Origin followed seven weeks later with its own application for Project Sunrise, proposing 51,600 satellites in sun-synchronous orbits between 500 and 1,800 kilometers, complemented by the TeraWave constellation of 5,408 satellites providing ultra-high-speed optical backhaul [3]. Google announced Project Suncatcher in partnership with satellite imagery firm Planet, planning to launch two pilot satellites in early 2027 to test hardware in Earth orbit [5]. CEO Sundar Pichai stated that "a decade or so away we'll be viewing it as a more normal way to build data centers" [5].

The commercial logic for extraterrestrial data centers rests on a genuine problem: AI's escalating electricity demand. Global data center electricity consumption reached roughly 415 terawatt-hours in 2024, and the International Energy Agency projects it could exceed 1,000 TWh by 2026, with accelerated AI servers driving 30 percent annual growth [3]. In Virginia alone, data centers consume 26 percent of total electricity supply, while Ireland's share could reach 32 percent by year's end [3]. Google has more than doubled its data center electricity consumption in the past five years, using 30.8 million megawatt-hours last year compared with 14.4 million in 2020 [5].
Source: MIT Tech Review
Proponents believe that in constantly illuminated sun-synchronous orbits, space-borne data centers would have uninterrupted access to solar power, while excess heat would be easily expelled into the cold vacuum of space [1]. Starcloud CEO Philip Johnston predicted extraterrestrial data centers will produce 10 times lower carbon emissions than their earthbound counterparts [5]. The startup raised $170 million at a $1.1 billion valuation in March, becoming the fastest unicorn in Y Combinator history just 17 months after completing the program [3]. Starcloud launched its first satellite carrying an Nvidia H100 GPU in November 2025, marking the first orbital test of an advanced AI chip [1].
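As a rough sense of scale for that "uninterrupted solar power," the arrays involved can be sized in a few lines. The solar constant is a standard physical value; the 30 percent panel efficiency and the 1 MW node are illustrative assumptions of mine, not figures from any of the companies:

```python
# Back-of-envelope sizing of the solar array for a 1 MW orbital compute
# node. Assumptions: ~1,361 W/m^2 irradiance above the atmosphere,
# 30% conversion efficiency, continuous illumination.

SOLAR_CONSTANT_W_M2 = 1361.0   # solar irradiance above the atmosphere
PANEL_EFFICIENCY = 0.30        # assumed conversion efficiency

def array_area_m2(power_w):
    """Panel area (m^2) needed to generate `power_w` of electrical power."""
    return power_w / (SOLAR_CONSTANT_W_M2 * PANEL_EFFICIENCY)

print(f"Array for 1 MW: {array_area_m2(1_000_000):,.0f} m^2")  # ~2,450 m^2
```

Roughly 2,450 square meters of panels per megawatt, on top of the radiator area needed to reject the same power as waste heat.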
Scientists argue the physics makes orbital computing spectacularly difficult at any meaningful scale. The most fundamental challenge is cooling. In space, there is no air to carry heat away from processors, only radiative cooling, which requires vast surface areas [3]. Dissipating just one megawatt of thermal energy while keeping electronics at a stable 20 degrees Celsius demands approximately 1,200 square meters of radiator, roughly four tennis courts [3]. "Thermal management and cooling in space is generally a huge problem," said Lilly Eichinger, CEO of the Austrian space tech startup Satellives [1].
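The 1,200-square-meter figure can be sanity-checked against the Stefan-Boltzmann law. This sketch assumes an ideal radiator (emissivity 1.0) at 20 degrees Celsius emitting from both faces into deep space, and ignores incoming sunlight and Earth's infrared load:

```python
# Sanity check of the ~1,200 m^2 radiator figure via the Stefan-Boltzmann
# law, P = eps * sigma * A * T^4, assuming emissivity 1.0, a 20 C (293 K)
# radiator emitting from both faces, and no incoming solar or Earth IR.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_area_m2(heat_w, temp_k, emissivity=1.0, sides=2):
    """Radiator area (m^2) needed to reject `heat_w` of heat at `temp_k`."""
    flux_per_face = emissivity * SIGMA * temp_k ** 4   # W/m^2 per face
    return heat_w / (flux_per_face * sides)

print(f"Radiator for 1 MW at 20 C: {radiator_area_m2(1_000_000, 293.0):,.0f} m^2")
# ~1,200 m^2, close to the figure quoted above
```

Real radiators do worse than this ideal case, since they absorb sunlight and Earthshine and run emissivities below 1.0, so the quoted figure is, if anything, optimistic.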
Radiation presents the second structural problem. Low Earth orbit exposes unshielded chips to cosmic rays and trapped particles that induce bit flips and permanent circuit damage [3]. Radiation hardening adds 30 to 50 percent to hardware costs and reduces performance by 20 to 30 percent, while the alternative of triple modular redundancy means launching three copies of every chip [3]. Latency is the third constraint: a million satellites spread across orbital shells cannot achieve the tight coupling required for frontier model training, where inter-node communication latencies must remain in the microsecond range [3].
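The latency point is easy to quantify: even ignoring switching and queuing delays, light-travel time alone rules out microsecond-scale coupling at orbital distances. The node separations below are illustrative, not from the report:

```python
# One-way light-travel time between satellites: even before any switching
# or queuing overhead, distance alone breaks microsecond-scale coupling.

C_KM_PER_S = 299_792.458   # speed of light in vacuum

def one_way_latency_us(distance_km):
    """One-way light-travel time in microseconds over `distance_km`."""
    return distance_km / C_KM_PER_S * 1e6

for d_km in (1, 100, 1_000):
    print(f"{d_km:>5} km -> {one_way_latency_us(d_km):8.1f} us one way")
```

Even 100 km of separation costs more than 300 microseconds one way, hundreds of times the interconnect latencies that frontier training clusters are built around.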
Microsoft's "Project Natick" lowered a shipping-container-sized data center onto the seabed off Scotland in 2015, aiming to cut energy use through natural seawater cooling and tapping offshore wind and tidal power [2]. While the project successfully met all its technical targets, underwater data centers were abandoned more than two years ago due to a lack of client demand and unviable economics [2]. Customers were not interested in scaling them, instead expanding conventional land-based facilities that allowed cheaper, faster upgrades as AI development accelerated [2]. The sealed, "locked-for-life" design has limited flexibility, since AI chips are rapidly improving every year, while a satellite or undersea data center might be replaced only every five to seven years [2].
Analysts at MoffettNathanson estimated that Musk's plan to put a million AI satellites in space would run into the trillions of dollars [2]. For orbital data infrastructure to become commercially viable, launch costs would need to fall from today's low thousands of dollars per kilogram to the low hundreds of dollars per kilogram [2]. Musk's case hinges on Starship, SpaceX's next-generation rocket, which is designed to be fully reusable and carry far larger payloads, but Starship is years behind schedule and has suffered explosive setbacks in some of its 11 suborbital test flights since 2023 [2]. MoffettNathanson estimates achieving Musk's goal would require 3,000 Starship launches a year, or eight per day [2]. IEEE Spectrum estimated that a one-gigawatt orbital data center would cost upwards of $50 billion [3]. Meanwhile, earthbound data centers are expected to require more than $5 trillion in capital expenditures by 2030, with hyperscalers including Alphabet, Amazon, Oracle, Meta and Microsoft issuing $121 billion in new debt through bonds in 2025 alone [5].
Source: Reuters
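The launch arithmetic behind those estimates is straightforward to sketch. The payload capacity and per-satellite mass below are my illustrative assumptions (roughly Starship-class numbers), not the analysts' actual inputs:

```python
# Illustrative launch arithmetic for a million-satellite constellation.
# Assumptions (mine, not MoffettNathanson's): ~100 t of payload per
# launch and ~1 t per data-center satellite.

PAYLOAD_PER_LAUNCH_KG = 100_000   # assumed Starship-class LEO capacity
SATELLITE_MASS_KG = 1_000         # assumed mass per data-center satellite
SATELLITES = 1_000_000            # the constellation size in SpaceX's filing

launches = SATELLITES * SATELLITE_MASS_KG / PAYLOAD_PER_LAUNCH_KG
total_mass_kg = SATELLITES * SATELLITE_MASS_KG
print(f"Launches needed: {launches:,.0f}")   # 10,000 flights

for cost_per_kg in (2_000, 200):  # today's ~$2k/kg vs the hoped-for ~$200/kg
    print(f"At ${cost_per_kg}/kg: ${total_mass_kg * cost_per_kg / 1e9:,.0f}B to orbit")
```

Under these assumptions, launch alone runs to $2 trillion at today's prices versus $200 billion at the hoped-for rate, and the 10,000 flights would take more than three years even at the 3,000-launches-per-year cadence cited above, before any five-to-seven-year replacement cycle begins.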
Summarized by Navi