11 Sources
[1]
Could AI Data Centers Be Moved to Outer Space?
Data centers are being built at a frantic pace all over the world, driven by the AI boom. These facilities consume staggering amounts of electricity. By 2028, AI servers alone may use as much energy as 22 percent of US households. Of course that demand will raise energy prices for everyone, and we'll need more power plants, which means more global warming.

Then there's the water problem. High-density AI chips run so hot that air cooling isn't enough. New facilities are turning to water cooling. The technique of choice is water evaporation. It's more effective and energy-efficient than recirculating water, but a large data center using this method consumes millions of gallons of water a day, draining local water supplies.

So it's no surprise that more and more towns are pushing back on data center projects in their area. But if everyone goes NIMBY, it gets sort of NOMPY -- like "not on my planet, you bastards." What to do? People aren't going to stop using AI. That's why some folks are saying we should build data centers in space. Just think: You could get 24/7 energy from solar panels -- it's always sunny in space -- and the thermal stuff wouldn't be an issue because it's so cold out there. You could do the heavy processing in orbiting data centers and beam the results back to Earth just like satellite internet. That's the claim, anyway.

Could this really work? Or is it about as practical as colonizing Mars? I asked Google's AI Overview, and it said, "Yes, data centers can be built in space." But of course it would say that. I think we'll have to go full renegade and dial up some old-fashioned human intelligence on this. One of the really big ideas in science is called conservation of energy.
This says that for any "system" (any collection of things we pick), the total energy going into the system equals the change in energy of that system plus the energy going out of the system:

E_in = ΔE_system + E_out

Or, rearranging, any change in the amount of energy in a system equals the difference between energy inputs and outputs:

ΔE_system = E_in − E_out

What this says is that energy can't be created or destroyed, only transformed from one form to another -- like solar panels convert light energy to electric energy. Energy is measured in joules, but it's often easier to talk about power instead. Power is the change in energy (ΔE) per unit of time (Δt):

P = ΔE/Δt

so it is measured in joules per second, also known as watts. In terms of power, conservation of energy says the power into a system equals the power out of the system plus the rate of change of internal energy. For example, say the "system" is a desktop PC with a 300-watt power supply. That means the maximum power input is 300 watts. What about the energy changes in the system? Well, it gets hot, so there's an increase in thermal energy. But it soon reaches a stable operating temperature. There are really no other energy changes in the computer, so all 300 watts of power coming in must equal the power going out.
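The desktop-PC example can be run as numbers, a minimal sketch of the steady-state energy balance (the one-hour run time is just an illustrative choice):

```python
# Steady-state energy balance for the 300 W desktop PC example.
P_in = 300.0               # power supply input, watts (joules per second)
dt = 3600.0                # one hour of runtime, in seconds (illustrative)

E_in = P_in * dt           # total energy in over the hour, joules
dE_internal = 0.0          # at a stable operating temperature, no net change
E_out = E_in - dE_internal # conservation of energy: everything leaves as heat

print(E_out)               # 1080000.0 joules -- all 300 W exits as heat
```

The punchline of the rearrangement: once the internal energy stops changing, every joule in must come out, which is exactly why cooling, not electricity, becomes the binding constraint.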
[2]
Sam Altman fires back at Elon Musk's proposal for space-based data centers, says orbiting data centers 'ridiculous' for now -- cites high failure rates and cost as primary limiters
Numerous visionaries, including Elon Musk and Jeff Bezos, are thinking about putting AI data centers into orbit to tap into unlimited amounts of power, fewer physical constraints, and the lack of regulations. But while the idea may be worthwhile in the long term, it is "ridiculous" for now, believes Sam Altman, chief executive and co-founder of OpenAI. "I honestly think the idea with the current landscape of putting data centers in space is ridiculous," said Sam Altman at a press conference hosted by The Indian Express. "It will make sense someday, but if you just do the very rough math of launch costs relative to the cost of power we can do on Earth, to say nothing of how you are going to fix a broken GPU in space, and they do break a lot still, unfortunately. We are not there yet." Indeed, it costs $5.6 million to launch 1,764 pounds (800 kg) into low Earth orbit (LEO) using a SpaceX rocket, though for those who plan to launch tens of tons, the price per kilogram will probably come down. Still, one of Nvidia's NVL72 GB200 rack-scale solutions weighs from 3,000 to 3,245 pounds (1,360 to 1,472 kg), depending on the exact configuration, without data center-scale connectivity, cooling, and power infrastructure. Even with discounts, launching data centers into space is still extremely expensive today, so it is unclear whether it can make economic sense any time soon. "There will come a time -- space is great for a lot of things," Altman added. "Orbital data centers are not something that is going to matter at scale this decade."

Radiation-hardened components required

While launching hardware into space is expensive, there must first be hardware to launch. Leading-edge process technologies -- such as TSMC's N4 (4nm-class) -- used to build leading-edge AI accelerators like Nvidia's B200/B300, advanced CPUs, sophisticated DPUs, and network processors are not radiation-hardened, meaning that they cannot survive in space.
Yet radiation-hardened fabrication technologies tend to be rather outdated (think 90nm), so before space-worthy computational hardware emerges, new process technologies must be developed. In addition to space-worthy microelectronics, the industry must also develop space-worthy cooling systems and power generation technologies that are capable of powering millions of AI accelerators. Companies like Elon Musk's SpaceX and Jeff Bezos's Blue Origin are probably closer to developing such infrastructure than traditional companies specializing in terrestrial data centers, which is perhaps why Musk and Bezos are so vocal about orbital data centers today, even if orbital data centers are not going to become viable for at least a decade.
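The "very rough math" Altman gestures at is easy to reproduce from the figures above ($5.6 million for 800 kg to LEO; a worst-case NVL72 rack mass of 1,472 kg). A sketch that ignores volume discounts and all support infrastructure:

```python
# Rough cost to launch one NVL72-class rack to LEO, from the figures above.
cost_per_launch = 5.6e6          # USD for ~800 kg to low Earth orbit
payload_kg = 800.0

cost_per_kg = cost_per_launch / payload_kg    # $7,000 per kilogram
rack_kg = 1472.0                              # heaviest NVL72 GB200 config
rack_launch_cost = rack_kg * cost_per_kg      # launch cost for one bare rack

print(f"${cost_per_kg:,.0f}/kg -> ${rack_launch_cost / 1e6:.1f}M per rack")
```

That is roughly $10 million just to lift one rack, before adding the mass of cooling, power, and networking the article notes is excluded.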
[3]
Orbital AI data centers could work, but they might ruin Earth in the process
At the start of the month, Elon Musk announced that two of his companies -- SpaceX and xAI -- were merging, and would jointly launch a constellation of 1 million satellites to operate as orbital data centers. Musk's reputation might suggest otherwise, but according to experts, such a plan isn't a complete fantasy. However, if executed at the scale suggested, some of them believe it would have devastating effects on the environment and the sustainability of low Earth orbit. Musk and others argue that putting data centers in space is practical given how much more efficient solar panels are away from Earth's atmosphere. In space, there are no clouds or weather events to obscure the sun, and in the correct orbit, solar panels can collect sunlight through much of the day. Combined with declining rocket launch costs and the price of powering AI data centers on Earth, Musk has said that within three years space will be the cheapest way to generate AI compute power. Ahead of the billionaire's announcement, SpaceX filed an eight-page application with the Federal Communications Commission detailing his plan. The company hopes to deposit the satellites in this massive cluster at altitudes ranging between 500 km and 2,000 km. They would communicate with one another and SpaceX's Starlink constellation using laser "optical links." Those Starlink satellites would then transmit inference requests to and from Earth. To power the entire effort, SpaceX has proposed putting the new constellation in sun-synchronous orbit, meaning the spacecraft would fly along the dividing line that separates the day and night sides of the planet. Almost immediately the plan was greeted with skepticism. How would SpaceX, for instance, cool millions of GPUs in space? At first glance, that might seem like a weird point to get hung up on -- much of space being around -450 degrees Fahrenheit -- but the reality is more complicated.
In the near vacuum of space, the only way to dissipate heat is to slowly radiate it out, and in direct sunlight, objects can easily overheat. As one commenter on Hacker News succinctly put it, "a satellite is, if nothing else, a fantastic thermos." Scott Manley, who, before he created one of the most popular space-focused channels on YouTube, was a software engineer and studied computational physics and astronomy, argues SpaceX has already solved that problem at a smaller scale with Starlink. He points to the company's latest V3 model, which has about 30 square meters of solar panels. "They have a bunch of electronics in the middle, which are taking that power and doing stuff with it. Now, some of that power is being beamed away as radio waves, but there's a lot of thermal power that's being generated and then having to be dissipated. So they already have a platform that's running electronics off of power, and so it's not a massive leap to turn into something doing compute." Kevin Hicks, a former NASA systems engineer who worked on the Curiosity rover mission, is more skeptical. "Satellites with the primary goal of processing large amounts of compute requests would generate more heat than pretty much any other type of satellite," he said. "Cooling them is another aspect of the design which is theoretically possible but would require a ton of extra work and complexity, and I have doubts about the durability of such a cooling system." What about radiation then? There's a reason NASA relies on ancient hardware like the PowerPC 750 CPU found inside the Perseverance rover: Older chips feature larger transistors, making them more resilient to bit flips -- errors in processing caused most often by cosmic radiation -- that might scramble a computation. 
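The radiative-cooling constraint described above can be sized with the Stefan-Boltzmann law, which gives the heat a surface can radiate into space. A minimal sketch, assuming a ~120 kW rack-class heat load and a 300 K radiator (both illustrative assumptions, not figures from the article), and generously ignoring absorbed sunlight and Earth-shine:

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_area_m2(heat_w, temp_k, emissivity=0.9):
    """One-sided radiator area needed to reject heat_w into deep space.
    Ignores absorbed sunlight and Earth-shine, so this is a lower bound."""
    flux = emissivity * SIGMA * temp_k**4   # watts radiated per square meter
    return heat_w / flux

# A terrestrial NVL72-class rack dissipates on the order of 120 kW (assumed).
area = radiator_area_m2(120e3, 300.0)
print(f"{area:.0f} m^2")   # roughly 290 m^2 of radiator for a single rack
```

Even this lower bound is roughly ten times the ~30 square meters of the Starlink V3 platform Manley describes, which is the gap between "satellite that does some compute" and "orbiting data center."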
"Binary ones and zeroes are about the presence or absence of electrons, and the amount of charge required to represent a 'one' goes down as the transistors get smaller and smaller," explains Benjamin Lee, professor of computer and information science at the University of Pennsylvania. Space is full of energized particles traveling at incredible velocities, and the latest GPUs are built on the smallest, most advanced processing nodes to create transistor-dense silicon. Not a great combination. "My concern about radiation is that we don't know how many bit flips will occur when you deploy the most advanced chips and hundreds of gigabytes of memory up there," said Professor Lee, pointing to preliminary research by Google on the subject. As part of Project Suncatcher, its own effort to explore the viability of space-based data centers, the company put one of its Trillium TPUs in front of a proton beam to bombard it with radiation. It found the silicon was "surprisingly radiation-hard for space applications." While those results were promising, Professor Lee points out we just don't know how resilient GPUs are to radiation at this scale. "Even though modern computer architectures can detect and sometimes correct for those errors, having to do that again and again will slow down or add overhead to space-based computation," he said. Space engineer Andrew McCalip, who's done a deep dive on the economics of orbital data centers, is more optimistic, pointing to the natural resilience of AI models. "They don't require 100 percent perfect error-free runs. They're inherently very noisy, very stochastic," he explains, adding that part of the training for modern AI systems involves "injecting random noise into different layers." Even if SpaceX could harden its GPUs against radiation, the company would still lose satellites to GPUs that break down. If you know anything about data centers here on Earth, it's that they require constant maintenance. 
Components like SSDs and GPUs die all the time. Musk has claimed SpaceX's AI satellites would require "little" in the way of operating or maintenance costs. That's only true if you accept the narrowest possible interpretation of what maintaining a fleet of AI satellites would entail. "I think that there's no case in which repair makes sense. It's a fly till you die scenario," says McCalip. From an economic perspective, McCalip argues the projected death rate of GPUs in space represents "one of the biggest uncertainties" of the orbital data center model. McCalip puts that number at nine percent, based on a study Meta published following the release of its Llama 3 model (which, incidentally, measured hardware failures on Earth). But the reality is no one knows what the attrition rate of those chips will be until they're in space. Orbital data centers also likely wouldn't be a direct replacement for their terrestrial counterparts. SpaceX's application specifically mentions inference as the primary use case for its new constellation. Inference is the practical side of running an AI system. It sees a model apply its learning to data it hasn't seen before, like a prompt you write in ChatGPT, to make predictions and generate content. In other words, AI models would still need to be trained on Earth, and it's not clear that the process could be offloaded to a constellation of satellites. "My initial thinking is that computations that require a lot of coordination, like AI training, may end up being tricky to get right at scale up there," says Professor Lee. In 1978, a pair of NASA scientists proposed a scenario where low Earth orbit could become so dense with space junk that collisions between those objects would begin to cascade. That scenario is known as Kessler syndrome. One estimate from satellite tracking website Orbiting Now puts the number of objects in orbit around the planet at approximately 15,600.
Another estimate from NASA suggests there are 45,000 human-made objects orbiting Earth. No matter the number, what's currently in space represents a fraction of the 1 million additional satellites Musk wants to launch. According to Aaron Boley, professor of physics and astronomy at the University of British Columbia and co-director of the Outer Space Institute, forward-looking modeling of Earth's orbit above 700 kilometers -- where part of SpaceX's proposed cluster would live -- suggests that area of space is already showing signs of Kessler syndrome. While it takes less time for debris to clear in low Earth orbit, Professor Boley says there's already enough material in that region of space that there could be a cascading effect from a major collision. Debris could, in a worst case scenario, take a decade to clear up. In turn, that could lead to disruptions in global communications, climate monitoring missions and more. "You could get to the point where you're just launching material in, and you could ask yourself how many satellites can I afford to lose? Can you reconstitute your constellation faster than you're losing parts of it because of debris?" says Boley. "That's a horrible future in terms of the environmental perspective." In particular, it would limit opportunities for humans to fly into low Earth orbit. "Could you operate in it? Yeah, but it would come with higher and higher costs," adds Boley. "The entire world is struggling with the problem of how we safely fly multiple mega constellations," says Richard DalBello, who previously ran the Traffic Coordination System for Space (TraCSS) at the US Department of Commerce. Right now, there is no common global space situational awareness (SSA) system, and government and satellite operators are using uncoordinated national and commercial systems that are likely producing different results.
At the start of the year, SpaceX lowered the orbit of thousands of Starlink satellites after one of them nearly collided with a Chinese satellite. SpaceX has its own in-house SSA system called Stargaze, which it uses to fly its more than 7,000 Starlink satellites. According to DalBello, competing operators can receive SSA data from SpaceX, but to do so they must share their satellite position information. "Assuming data sharing, it is likely Stargaze can make an important contribution to spaceflight safety," says DalBello. "SpaceX is likely to have success with US and other commercial operators, but without the assistance of the federal government, other governments -- particularly China -- will likely be unwilling to share their satellite and SSA data." According to DalBello, the Biden administration was unable to make meaningful progress on the next-generation TraCSS system, in part because Congress was initially reluctant to fund the program. Meanwhile, the current Trump administration hasn't shown interest in advancing the work that began during the president's first term. Even if the regulatory situation suddenly changes and the world's governments agree on an international SSA system, SpaceX launching 1 million satellites along the day-night terminator would see the company effectively monopolize one of the Earth's most valuable and important orbits. Professor Boley argues we should view our planet's orbits as a resource that belongs to everyone. "Every time you put a satellite up, you use part of that resource. Now someone else can't use it." And as Hicks points out, even a single cascade of colliding satellites would prevent that space from being used for scientific endeavors. "You would have to wait years for that debris to slowly come back into the atmosphere and burn up. In the meantime, that debris is taking up space that could be used for climate monitoring missions or any other types of missions that governments want to launch."
Separately, the constant churn of Starship launches and re-entry of dead satellites would have a potentially dire impact on our planet's atmosphere. "We're not prepared for it," Boley flatly says of the latter. "We're not prepared for what's happening now, and what's happening now is already potentially bad." According to Musk's "basic math," SpaceX could add 100 gigawatts of AI compute capacity annually by launching a million tons of satellites per year. McCalip estimates a 100-gigawatt buildout alone would necessitate about 25,000 Starship flights. Many of the metals found in satellites, including aluminum, magnesium and lithium, in combination with the exhaust rockets release into the atmosphere, can have complicated effects on the health of the planet. For instance, they can affect polar cloud formations, which in turn can facilitate ozone layer destruction through the chemical reactions that occur on their surfaces. According to Boley, the problem is we just don't know how severe those environmental factors could become at the scale Musk has proposed, and SpaceX has provided us with precious few details on its mitigation plans. All it has said is that its plan would "achieve transformative cost and energy efficiency while significantly reducing the environmental impact associated with terrestrial data centers." Even if SpaceX could and does go out of its way to mitigate the atmospheric effects of constant rocket flights, those spacecraft still need to be manufactured here on Earth. In one of his previous roles, Hicks studied rocket emissions and found the supply chains needed to build them produce an "order of magnitude" more carbon emissions than the rockets themselves. SpaceX plans to fly its new satellites in a sun-synchronous orbit, meaning for much of the year, they'll be sunlit.
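The scale implied by those launch figures is worth making concrete. Taking Musk's million tons per year and McCalip's 25,000-flight estimate at face value, and assuming (as a simplification) that the buildout happens within a single year:

```python
tons_per_year = 1_000_000     # Musk's figure for satellite mass launched annually
flights = 25_000              # McCalip's estimate for a 100 GW buildout

tons_per_flight = tons_per_year / flights   # payload mass per Starship flight
flights_per_day = flights / 365             # required daily launch cadence

print(f"{tons_per_flight:.0f} t/flight, {flights_per_day:.0f} flights/day")
```

Roughly 68 launches every single day, far beyond any cadence any launch provider has ever achieved.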
Each new Starlink generation has been larger and heavier than the one before it, with SpaceX stating in a recent filing that its upcoming V3 model could weigh up to 2,000 kilograms, up from the 575 kilograms of the V2 Mini Optimized. While we don't know the exact dimensions of the company's still-hypothetical AI satellites, they will almost certainly be bigger than their Starlink counterparts. SpaceX has done more than most space operators to reduce the brightness of its satellites, but Professor Boley says he expects that this new constellation will be "strikingly bright" when moving through the night sky. In aggregate, he estimates they will almost certainly be harmful to scientific research here on Earth, limiting what terrestrial observatories can see. "You're going to see them with the naked eye. You're going to see them with cameras. It's going to be like living near an airport where you see all these things flying over just after sunset and the next couple of hours after sunset," says Manley. "I don't know if I want to have my entire sunset be just a band of satellites constantly shooting overhead." There are good reasons to make some spacecraft capable of doing AI inference. For instance, Professor Lee suggests it would make orbital imaging satellites more useful, as those spacecraft could do on-site analysis, instead of sending high-resolution files over long distances, saving time in the process. But the dose, as they say, makes the poison. "There's a lot of excitement about the many possibilities that can be brought to society and humanity through continued access to space, but the promise of prosperity is not permission to be reckless," he says. "At this moment, we're allowing that excitement to overtake that more measured progression [...] those impacts don't just impact outer space but Earth as well."
[4]
Heated Rivals Musk and Altman Disagree on One More Thing: Data Centers in Space
As SpaceX founder Elon Musk begins laying the groundwork for a Starlink-powered data center in orbit, his AI billionaire nemesis expressed skepticism over the need to establish orbital data centers anytime soon. During a recent interview, OpenAI CEO Sam Altman shut down the idea of launching orbital data centers. "I honestly think the idea with the current landscape of putting data centers in space is ridiculous," Altman is quoted in Business Insider as saying. There are more than 5,000 AI data centers currently in the United States, with that number expected to multiply in the coming years. Tech billionaires like Musk are now looking to the skies as the next frontier for housing these energy-intensive facilities. In late January, SpaceX filed an FCC application to launch an orbital data center constellation of up to one million Starlinks. The move aligns with Musk's plans for a SpaceX IPO later this year, with earlier reports suggesting the decision is in part to raise capital to support an orbital data center venture. Altman himself has toyed with the idea of launching data centers into space. "I do guess a lot of the world gets covered in data centers over time," Altman said during a podcast interview in 2025, Wired reported. "Maybe we put [data centers] in space." There were also reports late last year of Altman potentially investing billions into Stoke Space, a Seattle-based startup that's developing a reusable rocket, to gain a controlling stake in the company. Although no deal was made, Altman reportedly intended either to buy or partner with a rocket company so that he would be able to deploy AI data centers to space. In his recent appearance, however, Altman seems to dismiss the idea of launching data centers into space within the next 10 years, according to Business Insider. He added that the idea of orbital data centers could make sense one day, but the cost of launching them to space remains an obstacle.
Another challenge is figuring out how to fix a computer chip in space, according to Altman. "We are not there yet. There will come a time," Altman is quoted as saying. "Space is great for a lot of things. Orbital data centers are not something that's going to matter at scale this decade." It's not clear whether the change of heart is partly fueled by Musk's plans to launch orbital data centers or if a failed deal with a rocket startup inspired Altman to reconsider the idea. Altman had also previously suggested building a Dyson sphere to harness energy from the Sun and feed the growing demand of AI data centers. So, clearly his thoughts are still a work in progress.
[5]
OpenAI chief dismisses orbital data centers this decade despite growing hype
* Sam Altman describes current proposals for orbiting data centers as entirely unrealistic for this decade
* Modern AI chips cannot survive space radiation, making orbital data centers currently unfeasible
* Radiation-hardened semiconductor nodes lag behind advanced fabrication processes required for AI workloads

Sam Altman has publicly dismissed proposals to place large-scale data centers in orbit, describing the idea as unrealistic under current technological and economic conditions. The OpenAI chief executive argued space-based computing infrastructure will not operate at a meaningful scale within this decade. His comments come as the likes of Elon Musk and Jeff Bezos have spoken about the long-term potential of orbital facilities powered by abundant solar energy and freed from terrestrial constraints.

Hardware not built for space

Altman's remarks directly challenge that optimism and draw attention to the practical limitations facing such projects. "I honestly think the idea with the current landscape of putting data centers in space is ridiculous," said Sam Altman at a press conference hosted by The Indian Express. "It will make sense someday, but if you just do the very rough math of launch costs relative to the cost of power we can produce on Earth, not to mention how you are going to fix a broken GPU in space, and they still break a lot, unfortunately, we are not there yet." Modern AI accelerators and high-performance processors are manufactured using advanced fabrication nodes such as 4nm-class process technologies. These cutting-edge chips are not radiation-hardened and therefore cannot withstand the harsh conditions of space. Radiation-resistant semiconductor technologies do exist, although they rely on much older manufacturing nodes that lack the performance required for today's large AI workloads.
Before orbiting facilities can handle meaningful computational demand, new fabrication approaches would need to combine advanced performance with radiation tolerance. Beyond processing hardware, orbital data centers would require cooling systems and reliable power generation capable of sustaining millions of accelerators. Launch providers such as SpaceX and Blue Origin are developing reusable rockets and space infrastructure, yet the supporting ecosystem for operating massive computing facilities in orbit remains incomplete. Terrestrial data centers already depend on complex arrangements involving power grids, cooling systems, SSD arrays, HDD backups, and cloud storage integration, all of which would require adaptation for space environments. Cost remains a central barrier to orbital deployment. Launching 800kg into low Earth orbit can cost several million dollars using current commercial rockets. A single Nvidia NVL72 GB200 rack-scale solution weighs well over a metric ton without additional cooling or connectivity systems. Scaling such infrastructure into orbit would multiply launch requirements and associated expenses. Even if launch prices decline for larger payloads, the cumulative cost of transporting and assembling full-scale facilities would remain high under current conditions. Altman has acknowledged that space will eventually support certain industries, although he maintains that orbiting data centers do not appear viable at scale this decade. Via Tom's Hardware
[6]
AI data centers in space are having a moment. Experts say: Not so fast | Fortune
Even as technology companies are projected to spend more than $5 trillion globally on earth-based data centers by the end of the decade, Elon Musk is arguing the future of AI computing power lies in space -- powered by solar energy -- and that the economics and engineering to make it work could align within a few years. Over the past three weeks, SpaceX has filed plans with the Federal Communications Commission for what amounts to a million-satellite data-center network. Musk has also said he plans to merge his AI startup, xAI, with SpaceX to pursue orbital data centers. And at an all-hands meeting last week, he told xAI employees the company would ultimately need a factory on the moon to build AI satellites -- along with a massive catapult to launch them into space. "The lowest-cost place to put AI will be in space, and that will be true within two years, maybe three at the latest," Musk said at the World Economic Forum meeting in Davos this January. Musk is not alone in floating the idea. Alphabet CEO Sundar Pichai has said Google is exploring "moonshot" concepts for data centers in space later this decade. Former Google CEO Eric Schmidt has warned that the industry is "running out of electricity" and has discussed space-based infrastructure as a potential long-term solution. And Amazon and Blue Origin founder Jeff Bezos has said orbital data centers could become the next step in space ventures designed to benefit earth. Still, while Musk and some other bulls argue that space-based AI could become cost-effective within a few years, many experts say anything approaching meaningful scale remains decades away -- especially as the bulk of AI investment continues to flow into terrestrial infrastructure. That includes Musk's own Colossus supercomputer in Memphis, which analysts estimate will cost tens of billions of dollars. 
They emphasize that while limited orbital computing is feasible, constraints around power generation, heat dissipation, launch logistics, and cost make space a poor substitute for earth-based data centers anytime soon. The renewed interest reflects mounting pressure on the industry to find ways around the physical limits of earth-based infrastructure, including strained power grids, rising electricity costs, and environmental concerns. Talk of orbital data centers has circulated for years, largely as a speculative or long-term concept; but now, experts say, there is additional urgency as the AI boom is increasingly dependent on ever more power to support the training and running of energy-hungry AI models. "A lot of smart people really believe that it won't be too many years before we can't generate enough power to satisfy what we're trying to develop with AI," said Jeff Thornburg, CEO of Portal Space Systems and a SpaceX veteran who led development of SpaceX's Raptor engine. "If that is indeed true, we have to find alternate sources of energy. That's why this has become so attractive to Elon and others." However, while the concept of data centers in space has moved beyond science fiction, it is unlikely to displace the massive AI facilities now being built on earth anytime soon. "This is something people are cynical about because it's just technologically not feasible at the moment," said Kathleen Curlee, a research analyst at Georgetown University's Center for Security and Emerging Technology who studies the U.S. space economy. "We're being told the timeline for this is 2030, 2035 -- and I really don't think that's possible." Thornburg agreed that the hurdles are formidable, even if the underlying physics are sound. "We know how to launch rockets; we know how to put spacecraft into orbit; and we know how to build solar arrays to generate power," he said. "And companies like SpaceX are showing we can mass-produce space vehicles at lower cost. 
With vehicles like Starship, you can carry a lot of equipment to orbit." As far as it being the right thing to try to move data centers off the ground to take advantage of the solar energy in orbit, he added, "it's a no-brainer." But feasibility, Thornburg cautioned, does not mean being able to build at speed or scale. "I think it's always a question of how long it will take," he said. The first -- and most fundamental -- challenge is power. Running AI data centers in orbit would require "ginormous" solar arrays that do not yet exist, Thornburg said. Today's AI chips, including Nvidia's most powerful GPUs, demand far more electricity than current solar-powered satellites can reliably provide. Boon Ooi, a professor at Rensselaer Polytechnic Institute who studies long-term semiconductor challenges, put the scale into stark perspective: Generating just one gigawatt of power in space would require roughly one square kilometer of solar panels. "That's extremely heavy and very expensive to launch," he said. While the cost of transporting materials to orbit has come down in recent years, it still costs thousands of dollars per kilogram, raising the question of how to lower costs so space-based data centers could compete economically with those on earth. Even in orbit, solar power is not constant. Satellites regularly pass through earth's shadow, and solar panels cannot always remain optimally aligned with the sun. At the same time, AI chips require steady, uninterrupted power, even as their demand spikes during intensive computation. As a result, orbital data centers would also need large onboard batteries to smooth out power fluctuations, said Josep Miquel Jornet, a professor of electrical and computer engineering at Northeastern University. So far, he noted, only one startup -- Lumen -- has successfully flown even a single Nvidia H100 GPU on a satellite. Cooling presents another unresolved challenge. 
While space itself is cold, the methods used to cool data centers on earth -- airflow, liquid cooling, and fans -- do not work in a vacuum. "There's nothing that can take heat away," Jornet said. "Researchers are still exploring ways to dissipate that heat." Other obstacles include space traffic jams and communication delays. With growing amounts of space debris in low earth orbit, managing and maneuvering large numbers of satellites would require autonomous collision-avoidance systems, Curlee said. And for many AI workloads, communicating with data centers via satellites would be slower and less energy-efficient than using fiber-connected facilities on the ground. "If you have data centers on earth, fiber connections will always be faster and more efficient than sending every prompt to orbit," Jornet said. The consensus among experts is that small pilot projects may emerge by the end of the decade -- but not anything approaching the scale of today's terrestrial data centers. "What you'll see between now and 2030 is design iteration," Thornburg said, pointing to work on solar arrays, heat rejection systems, and orbital positioning. "Will it be on schedule? No. Will it cost what we think it will? Probably not." Even SpaceX, he added, is still several years away from routinely flying its Starship launch vehicle at the cadence required to support such infrastructure. "They're in the lead, but they still have development to finish," he said. "I think it's a minimum of three to five years before you see something that's actually working properly, and you're beyond 2030 for mass production." Jornet echoed that view. "Two to three years is not realistic at the scale being promised," he said. "You might see three or four or five satellites that together look like a tiny data center. But that would be orders of magnitude smaller than what we build on earth." Still, Thornburg cautioned against dismissing the idea of orbital data centers outright. 
"You shouldn't bet against Elon," he said, pointing to SpaceX's long history of defying skepticism. In the long run, he added, the energy pressures driving interest in orbital data centers are unlikely to disappear. "Engineers will find ways to make this work," he said. "Long term, it's just a matter of how long is it going to take us."
[7]
Sam Altman defends AI's resource consumption and ridicules Musk's plan to put data centers in space - SiliconANGLE
Sam Altman of OpenAI Group PBC was in India this week, where he fielded questions about the environmental impact of artificial intelligence at a special event hosted by The Indian Express. While conceding that AI's total energy consumption is a legitimate concern, Altman contended that ChatGPT doesn't really consume any more resources over its lifetime than the average human being. The executive also flatly rejected sensationalist claims about AI's water usage. When asked whether it's accurate that a single ChatGPT query "consumes 17 gallons of water" and the equivalent of 1.5 iPhone battery charges, he replied that such claims are "completely untrue, totally insane and have no connection to reality." According to Altman, AI's water consumption was once a legitimate worry with older data centers that use evaporative cooling systems. But modern data centers use much more efficient methods to cool their servers, and so the issue has evaporated. It's no longer a concern, he insisted. With regard to AI's energy consumption, Altman conceded that the total amount of power used globally is troublesome, but said it should really just encourage us to accelerate the shift to nuclear, wind and solar energy sources. Some of the AI industry's biggest critics like to make what Altman called an unfair "apples-to-oranges comparison" about AI's energy usage, contrasting the massive amounts of electricity used to train models with the tiny amount of energy a human brain uses for inference tasks. "But it also takes a lot of energy to train a human," Altman contended. "It takes like 20 years of life and all of the food you can eat during that time before you get smart. And not only that, it took the very widespread evolution of the 100 billion-odd people that have ever lived to build our cumulative knowledge in survival, science, mathematics and more."
Altman said a fairer comparison would be to look at the total energy used to train an AI model and respond to one question, and contrast this with the lifetime of energy used by humans to get them to where they need to be to perform the same task. "Most probably, AI has already caught up on an energy efficiency basis," he insisted. Earlier in the discussion, Altman was asked about his thoughts on Elon Musk's ambitions to send data centers into low-Earth orbit. Earlier this month, Musk cited orbital data centers as one of the main reasons for merging his rocket company SpaceX Corp. with his AI firm xAI Corp. Google LLC has also explored the concept. However, Altman dismissed the idea as "ridiculous" at the present time. He said the launch costs involved in getting a data center into orbit are likely to be extremely prohibitive compared to terrestrial power generation. He also cited the near-impossible challenge of fixing things like broken processors or storage arrays. "If you just do the rough math of launch costs relative to the cost of power we can do on Earth, we are not there yet," he said. Altman did say that space could one day be a useful environment for certain AI applications, but insisted that "orbital data centers are not something that's going to matter at scale this decade."
[8]
Data centres in space a 'ridiculous' idea for now: Sam Altman
OpenAI CEO Sam Altman termed the idea of putting data centres in space "ridiculous" for now, saying that the economics and logistics just don't make sense yet. The comment comes just weeks after Elon Musk announced a merger of SpaceX and xAI to build orbital computing facilities for his artificial intelligence (AI) models. Speaking to The Indian Express, Altman said that while space will play a critical role in the future, "orbital data centres are not something that's going to matter at scale this decade." He pointed to the high launch costs and the difficulty of maintaining cloud infrastructure in orbit: "If you just do the rough math of launch costs relative to the cost of power we can do on Earth, just say nothing of how you're gonna fix a broken GPU in space, we are not there yet." Musk, on the other hand, has said that space-based data centres could soon offer advantages in cooling, energy efficiency, and latency, suggesting the first prototypes could arrive in the near future. His move to tie SpaceX's launch capabilities to xAI's training needs underscores his confidence that space is the answer to the environmental problems emerging as AI develops.
Altman also reflected on the global power balance in AI and robotics, acknowledging that China is "clearly ahead" in areas like manufacturing physical robots, electric motors, and magnets, while the West maintains strengths in advanced software and model design. "It's hard to be ahead on everything or behind on everything," he said, adding that global competition in AI remains both collaborative and adversarial. Looking beyond competition, Altman said the next phase of AI's evolution will require governments and companies to co-create the infrastructure that underpins it. "Given the level of impact this is going to have on society, and the need to truly democratise this technology, governments are going to have to be involved," he said. He noted that the technology sector's relationship with government has shifted over the decades, from libertarian independence to deeper public-private interdependence.

Google's edge

Altman also praised Google for its renewed momentum in AI, saying the company has made a strong comeback after lagging behind OpenAI following ChatGPT's launch. "Ten years ago, Google was the only serious AI effort. Maybe three years ago, when we launched ChatGPT, they were way out of it," he said. "But I admire how Demis (Hassabis) and the Google team started working on AI before anyone else in the modern era, with a lot of conviction. Without their inspiration, we certainly wouldn't be here." He added that Google's recent "relentless focus and execution" and ability to scale its models have been "quite impressive."
[9]
Elon Musk Is Betting Big On Data Centers In Space, But OpenAI's Sam Altman Just Gave His 'Ridiculous' Idea A Reality Check: 'How Hard It Is To...'
Elon Musk's ambitious plan to deploy massive data centers in orbit is drawing sharp criticism from OpenAI CEO Sam Altman, who dismissed the idea as "ridiculous" -- at least for now -- citing steep costs and logistical nightmares.

Musk's Million-Unit Ambition

Each data center would be 31 miles long and operate more than 310 miles above the Earth's surface, according to SpaceX's application filed with the Federal Communications Commission. The pitch: orbital facilities could tap uninterrupted solar power and bypass Earth's grid constraints, a growing concern as AI workloads surge and energy demand from terrestrial data centers spikes.

How To Fix A Broken GPU In Space?

But speaking in remarks reported by The Indian Express on Saturday, Altman questioned the economic and practical viability of the plan. He pointed to launch expenses that far exceed current power costs on Earth and raised a more fundamental issue -- maintenance. "How hard it is to fix a broken GPU in space," Altman said, underscoring the complexity of repairing or upgrading hardware in orbit. Meanwhile, OpenAI is reportedly tempering some of its most aggressive infrastructure plans, telling investors it now expects to spend roughly $600 billion on computing power through 2030, compared to the $1.4 trillion projected earlier.
[10]
Altman calls Musk's space data center plans 'ridiculous' for current...
OpenAI CEO Sam Altman dismissed the idea of data centers in space being a viable option in the next few years as SpaceX CEO Elon Musk pursues their deployment. "I honestly think the idea with the current landscape of putting data centers in space is ridiculous," Altman said in an interview with Indian Express. "It will make sense someday." Altman said that space-based artificial intelligence (AI) data center projects would have to deal with high launch costs as well as operational and maintenance challenges, like how to fix a broken or damaged component while the data center is in orbit. "We are not there yet. There will come a time. Orbital data centers are not something that's going to matter at scale this decade," Altman said in the interview. SpaceX's Musk said earlier this month at an event announcing SpaceX's acquisition of xAI that the energy demands of AI will require moving data centers to space because of the strain they put on the environment. "In the long term, space-based AI is obviously the only way to scale," Musk said. "My estimate is that within 2 to 3 years, the lowest cost way to generate AI compute will be in space." SpaceX's merger with xAI, the AI company Musk founded that went on to acquire the X social media platform, aims to create a more than $1 trillion company ahead of a planned initial public offering that will enable them to raise capital and speed up plans to deploy data centers in space. SpaceX recently filed a document with the Federal Communications Commission requesting to launch up to 1 million satellites that would function as data centers in Earth's orbit. Musk said in a memo outlining his plans that SpaceX aims to put a million tons of satellites into orbit per year with 100 kilowatts of compute power per ton, adding 100 gigawatts of AI computing capacity per year.
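The memo's arithmetic is internally consistent, and it also implies a striking launch cadence. A rough sketch, where the ~100-ton-per-launch Starship payload is an assumption based on SpaceX's published targets, not a figure from the article:

```python
# Back-of-envelope check on the figures in Musk's memo: a million tons
# of satellites per year at 100 kW of compute per ton. The Starship
# payload figure is an assumption, not from the memo itself.

TONS_PER_YEAR = 1_000_000      # claimed annual mass to orbit
KW_PER_TON = 100               # claimed compute power per ton
STARSHIP_PAYLOAD_TONS = 100    # assumed payload per launch (approx.)

added_capacity_gw = TONS_PER_YEAR * KW_PER_TON / 1e6   # kW -> GW
launches_per_year = TONS_PER_YEAR / STARSHIP_PAYLOAD_TONS

print(f"{added_capacity_gw:.0f} GW of compute added per year")
print(f"{launches_per_year:.0f} launches per year")  # ~27 per day
```

The capacity claim checks out at 100 GW per year, but delivering a million tons annually at roughly 100 tons per flight would mean on the order of 10,000 launches a year, far beyond any cadence demonstrated to date.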
Other tech companies pursuing space-based data centers include Google, as CEO Sundar Pichai told "Fox News Sunday" that the company could put solar-powered data centers in space as soon as next year as part of what's known as Project Suncatcher. Amazon Web Services CEO Matt Garman said at the Cisco AI Summit earlier this month, "there are not enough rockets to launch a million satellites yet, so we're, like, pretty far from that." "If you think about the cost of getting a payload in space today, it's massive," Garman added. "It is just not economical."
[11]
Space Data Centers: Future Reality or Sci-Fi Dream?
Space data centres, once limited to science fiction, are now being tested by private companies, space agencies, and even startups. The question is no longer whether they are possible, but whether they are practical. The AI boom has turned data centres into energy-intensive industrial complexes. Training a single frontier model can consume as much electricity as a small town. Tech companies are scrambling to secure clean, continuous power, often facing public backlash over water use for cooling and the strain on local grids. Space offers a compelling alternative. Satellites in low-Earth orbit receive near-constant solar energy without atmospheric losses. The vacuum of space allows heat to dissipate through radiation, reducing the need for water-based cooling systems. In theory, this creates a self-sustaining, highly efficient facility powered almost entirely by solar energy. Earth observation satellites, communications constellations, and defence platforms generate massive amounts of data that must be transmitted to ground locations for processing. Orbital data centres could enable real-time data analysis, helping reduce latency, bandwidth costs, and the need for ground-based systems.
OpenAI's Sam Altman has publicly rejected proposals for space-based AI data centers, calling them ridiculous given current technology and economics. His comments directly counter Elon Musk's ambitious SpaceX plans to launch up to one million satellites as orbital data centers. While Musk argues space offers unlimited solar power and fewer constraints, Altman points to prohibitive launch costs, frequent GPU failures, and the challenge of repairs in orbit as critical barriers that won't be overcome this decade.
Sam Altman has thrown cold water on the growing enthusiasm for orbital data centers, describing current proposals as entirely unrealistic for meaningful deployment within this decade [2]. The OpenAI chief executive's remarks directly challenge Elon Musk's ambitious vision, creating a new front in the ongoing rivalry between these tech titans. Speaking at a press conference hosted by The Indian Express, Altman stated: "I honestly think the idea with the current landscape of putting data centers in space is ridiculous" [2]. His skepticism focuses on practical barriers that proponents have yet to address adequately.
The economics of transporting AI data centers to space remain prohibitive under current conditions. It costs $5.6 million to launch just 1,764 pounds into low Earth orbit using a SpaceX rocket [2]. A single Nvidia NVL72 GB200 rack-scale solution weighs between 3,000 and 3,245 pounds, without accounting for data center-scale connectivity, cooling, and power infrastructure [2]. Even with volume discounts for those planning to launch tens of tons, the cumulative expense of building orbital facilities at scale remains staggering. Altman emphasized that "if you just do the very rough math of launch costs relative to the cost of power we can do on Earth," the numbers simply don't work yet [5].

Despite Altman's objections, Elon Musk continues advancing his vision for space-based infrastructure. In late January, SpaceX filed an eight-page application with the Federal Communications Commission detailing plans to launch a constellation of up to one million satellites to operate as orbital data centers [3]. The satellites would be deposited at altitudes ranging between 500 km and 2,000 km, communicating with one another and with SpaceX's Starlink constellation using laser optical links [3]. Those Starlink satellites would then transmit inference requests to and from Earth. Musk argues that within three years, space will be the cheapest way to generate AI compute power, driven by the efficiency of solar panels above Earth's atmosphere and declining rocket launch costs [3].
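The quoted launch figures make the per-rack bill easy to estimate. A back-of-envelope sketch; the 3,100 lb rack mass is an assumed midpoint of the quoted 3,000-3,245 lb range:

```python
# Rough launch economics from the figures above: $5.6M per 1,764 lb to
# low Earth orbit, and an NVL72 rack at ~3,000-3,245 lb. Connectivity,
# cooling, and power hardware beyond the bare rack are excluded.

LAUNCH_COST_USD = 5_600_000
LAUNCH_MASS_LB = 1_764
RACK_MASS_LB = 3_100             # assumed midpoint of the quoted range

cost_per_lb = LAUNCH_COST_USD / LAUNCH_MASS_LB
rack_launch_cost = cost_per_lb * RACK_MASS_LB

print(f"${cost_per_lb:,.0f} per lb to orbit")    # ~$3,175/lb (~$7,000/kg)
print(f"${rack_launch_cost / 1e6:.1f}M per rack")  # ~$9.8M just to launch
```

Nearly ten million dollars per rack before any power, cooling, or networking hardware leaves the ground is the "very rough math" Altman is referring to.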
Altman highlighted a critical operational challenge that space enthusiasts often overlook: hardware reliability. "To say nothing of how you are going to fix a broken GPU in space, and they do break a lot still, unfortunately," he noted [2]. Modern GPUs experience frequent failures even in controlled terrestrial environments with sophisticated monitoring and maintenance systems. In orbit, physical repairs become essentially impossible, meaning any hardware failure results in the permanent loss of that computational capacity. This reliability gap poses fundamental questions about the economic viability of orbital facilities, where every component must be launched at enormous expense yet cannot be serviced once deployed.

The harsh radiation environment of space presents another formidable technical barrier. Leading-edge process technologies such as TSMC's N4 (4nm-class), used to build leading-edge AI accelerators like Nvidia's B200/B300, advanced CPUs, sophisticated DPUs, and network processors, are not radiation-hardened, meaning they cannot survive in space [2]. Radiation-hardened fabrication technologies tend to be outdated, typically around 90 nm processes, lacking the performance required for today's AI workloads [2]. Space is full of energized particles traveling at incredible velocities, causing bit flips that scramble computations [3]. Before space-worthy computational hardware emerges, entirely new process technologies must be developed that combine radiation hardening with advanced performance characteristics.
While space might seem ideal for cooling given its frigid temperatures, the reality proves far more complex. In the near vacuum of space, the only way to dissipate heat is to slowly radiate it out, and in direct sunlight objects can easily overheat [3]. As one engineer noted, "a satellite is, if nothing else, a fantastic thermos" [3]. Kevin Hicks, a former NASA systems engineer who worked on the Curiosity rover mission, expressed skepticism: "Satellites with the primary goal of processing large amounts of compute requests would generate more heat than pretty much any other type of satellite. Cooling them is another aspect of the design which is theoretically possible but would require a ton of extra work and complexity, and I have doubts about the durability of such a cooling system" [3].

The push toward orbital solutions stems from mounting challenges facing terrestrial facilities. AI data centers are being built at a frantic pace worldwide, driven by the AI boom. By 2028, AI servers alone may use as much energy as 22 percent of US households, raising energy prices for everyone and requiring more power plants, which contributes to global warming [1]. High-density AI chips run so hot that air cooling isn't enough, forcing facilities to turn to evaporative water cooling. A large data center using this method consumes millions of gallons of water daily, draining local water supplies [1]. More and more towns are pushing back on data center projects in their areas, creating a "not on my planet" dilemma [1]. There are currently more than 5,000 AI data centers in the United States, and that number is expected to multiply in the coming years [4].
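The radiative constraint behind the "fantastic thermos" remark can be made concrete with the Stefan-Boltzmann law. A minimal sketch; the 300 K radiator temperature and 0.9 emissivity are assumed values, and absorbed sunlight is ignored, so real radiators would need to be larger still:

```python
# Radiator sizing via the Stefan-Boltzmann law: P = emissivity * sigma * A * T^4.
# Radiation is the only heat-rejection path in vacuum; this sketch ignores
# absorbed sunlight and Earth-shine, so it underestimates the required area.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiator_area_m2(heat_w: float, temp_k: float, emissivity: float = 0.9) -> float:
    """One-sided radiator area (m^2) needed to reject heat_w watts at temp_k."""
    return heat_w / (emissivity * SIGMA * temp_k ** 4)

# Rejecting 1 MW of server heat at an assumed 300 K radiator temperature
# takes on the order of a few thousand square meters of radiator surface.
print(f"{radiator_area_m2(1e6, 300):.0f} m^2 of radiator for 1 MW")
```

Because rejected power scales with T^4, running radiators hotter shrinks them dramatically, but chips must then operate against a hotter heat sink, which is one of the design trade-offs researchers are still exploring.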
Altman's current stance represents a shift from his earlier musings on the topic. During a 2025 podcast interview, he speculated: "I do guess a lot of the world gets covered in data centers over time. Maybe we put [data centers] in space" [4]. Reports emerged late last year of Altman potentially investing billions into Stoke Space, a Seattle-based startup developing a reusable rocket, to gain a controlling stake [4]. Although no deal materialized, Altman reportedly intended to buy or partner with a rocket company to deploy AI data centers to space. His current dismissal of near-term viability suggests either that the failed deal influenced his thinking or that deeper analysis revealed insurmountable obstacles for this decade. Altman acknowledged that "it will make sense someday" and that "space is great for a lot of things," but maintains that "orbital data centers are not something that's going to matter at scale this decade" [4].

Summarized by Navi
06 Feb 2026•Technology