10 Sources
[1]
Space-Based Data Centers Could Power AI with Solar Energy -- at a Cost
Space-based computing offers easy access to solar power, but presents its own environmental challenges. To hear Silicon Valley tell it, artificial intelligence is outgrowing the planet that gave birth to it. Data centers will account for nearly half of U.S. electricity demand growth between now and 2030, and their global power requirements could double by the end of this decade as companies train larger AI models. Local officials have begun to balk at approving new server farms that swallow land, strain power grids and gulp cooling water. Some tech executives now talk about putting servers in space as a way to escape those permitting fights. Orbital data centers could run on practically unlimited solar energy without interruption from cloudy skies or nighttime darkness. If it is getting harder to keep building bigger server farms on Earth, the idea goes, maybe the solution is to loft some of the most power-hungry computing into space. But such orbital data centers will not become cost-effective unless rocket launch costs decline substantially -- and independent experts warn they could end up with even bigger environmental and climate effects than their earthly counterparts. In early November Google announced Project Suncatcher, which aims to launch solar-powered satellite constellations carrying its specialty AI chips, with a demonstration mission planned for 2027. Around the same time, the start-up Starcloud celebrated the launch of a 60-kilogram satellite with an NVIDIA H100 GPU as a prelude to an orbital data center that is expected to require five gigawatts of electric power by 2035. Those two efforts are part of a broader wave of concepts that move some computing off-planet. 
China has begun launching spacecraft for a Xingshidai "space data center" constellation, and the European Union is studying similar ideas under a project known as ASCEND. "Orbital data centers would benefit from continuous solar energy, generated by arrays of photovoltaic cells," says Benjamin Lee, a computer architect and engineer at the University of Pennsylvania. "This could resolve long-standing challenges around powering data center computation in a carbon-efficient manner." Most proposals envision orbital data centers that would be in a dawn-to-dusk, sun-synchronous orbit aligned with the boundary between day and night on Earth so that their solar panels would receive almost constant sunlight and gain an efficiency advantage outside Earth's atmosphere. But the same physics that make orbital data centers appealing also impose new engineering headaches, Lee says. Their computing hardware must be protected from high radiation, through either shielding or error-correcting software. To cool off, orbital platforms need large radiators that can dump heat into the vacuum of space, adding significant mass that has to be launched on rockets. All these plans ultimately collide with one stubborn constraint: getting hardware into space. Rocket launch costs alone pose a significant challenge to building large orbital data centers, not to mention the need to replace onboard chips every five to six years. "Launch costs are dropping with reusable rockets, but we would still require a very large number of launches to build orbital data centers that are competitive with those on Earth," Lee says. Google's Suncatcher team estimates that liftoff costs would need to fall to under $200 per kilogram by 2035 for their vision to make sense. Even if they become economically feasible, orbital data centers may impose additional sustainability costs on the world. 
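The launch-cost constraint described above can be roughed out with a simple sketch. The satellite mass and chip cost below are illustrative assumptions, not figures from Google's Suncatcher paper (which states only the ~$200/kg target); the point is how quickly the rocket ride stops dominating the per-satellite budget as launch prices fall.

```python
# Back-of-envelope: what share of an orbital data center satellite's cost
# is the launch itself? SAT_MASS_KG and CHIP_COST_USD are hypothetical
# placeholder values chosen only to show the shape of the trade-off.

SAT_MASS_KG = 200            # assumed mass per compute satellite
CHIP_COST_USD = 250_000      # assumed cost of the onboard accelerators
REPLACEMENT_YEARS = 5.5      # midpoint of the 5-6 year chip lifetime cited above

def launch_share(cost_per_kg: float) -> float:
    """Fraction of per-satellite cost that goes to the rocket ride."""
    launch = cost_per_kg * SAT_MASS_KG
    return launch / (launch + CHIP_COST_USD)

for price in (3000, 1000, 200):   # $/kg: roughly today's range down to the 2035 target
    print(f"${price}/kg -> launch is {launch_share(price):.0%} of satellite cost")
```

Under these assumptions, launch dominates at today's prices but falls to a minority share near the $200/kg target, which is why every proposal leans so heavily on cheaper rockets.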
Starcloud estimates that a solar-powered space data center could achieve 10 times lower carbon emissions compared with a land-based data center powered by natural gas generators. But researchers at Saarland University in Germany, who published a paper titled "Dirty Bits in Low-Earth Orbit," calculated that an orbital data center powered by solar energy could still create an order of magnitude greater emissions than a data center on Earth, taking into account the emissions from rocket launches and reentry of spacecraft components through the atmosphere. Most of those extra emissions come from burning rocket stages and hardware on reentry, says Andreas Schmidt, a Saarland University computer scientist and a co-author of the paper. The process forms pollutants that can further deplete Earth's protective ozone layer. Astronomers have their own worries. Starcloud CEO Philip Johnston says the ideal sun-synchronous orbit would only make orbital data centers visible in the night sky at dawn or dusk. But Samantha Lawler, an astronomer at the University of Regina in Saskatchewan, notes that some observers rely on twilight to hunt for near-Earth asteroids, and she is wary of any orbital data center with a multikilometer solar-panel array. She also fears that such projects could worsen the growing problem of space junk, as more hardware is launched and more debris and fragments fall back through the atmosphere. "There's so much pollution from reentries already and pieces hitting the ground," she says. For now, orbital data centers are mostly an idea, a handful of small prototypes and a stack of ambitious slide decks. The basic physics of near-constant sunlight in orbit are real, and launch costs are moving in the needed direction. But the environmental, astronomical and regulatory questions are pressing. The world will have to decide whether sending hardware into space is a clever way to power AI -- or just a way to push its side effects out of sight.
[2]
Google CEO: Data Centers in Space Could Be the Norm in About a Decade
Running data centers in space might sound like science fiction, but Google's CEO is betting it'll become a practical way to handle AI's soaring energy demands, possibly within a "decade or so." "We want to put these data centers in space, closer to the Sun," allowing them to harness the solar energy in Earth's orbit, Sundar Pichai said in an interview with Fox News. "There's no doubt to me that in a decade or so...we'll be viewing it as a more normal way to build data centers." His statement signals that Google sees real promise in the concept. The tech giant debuted a prototype "research moonshot" called Project Suncatcher last month, and it plans on sending up a pair of test satellites in early 2027, each with custom AI server chips. From there, Google hopes to gradually scale up the technology, Pichai told Fox News. The concept might also spark a new space race, as Amazon founder Jeff Bezos and SpaceX CEO Elon Musk have also been discussing the possibility of operating data centers in Earth's orbit. In addition, a startup called Starcloud successfully sent up its own test satellite carrying an Nvidia H100 GPU into space. Starcloud CEO Philip Johnston told PCMag last month that the satellite is operational and has been undergoing the "commissioning" phase to verify that all functions are working properly. Still, a major challenge facing orbiting data centers is cooling the AI chips and protecting them from cosmic radiation. The vacuum of space means there's no air to carry the heat away. So, the AI chips will need a built-in cooling system that uses stored air or liquid. Starcloud says it's also developing a "lightweight deployable radiator design with a very large area -- by far the largest radiators deployed in space," with the goal of radiating the heat away toward deep space.
[3]
Google's proposed data center in orbit will face issues with space debris in an already crowded orbit
University of Michigan provides funding as a founding partner of The Conversation US. The rapid expansion of artificial intelligence and cloud services has led to a massive demand for computing power. The surge has strained data infrastructure, which requires lots of electricity to operate. A single, medium-sized data center here on Earth can consume enough electricity to power about 16,500 homes, with even larger facilities using as much as a small city. Over the past few years, tech leaders have increasingly advocated for space-based AI infrastructure as a way to address the power requirements of data centers. In space, sunshine - which solar panels can convert into electricity - is abundant and reliable. On Nov. 4, 2025, Google unveiled Project Suncatcher, a bold proposal to launch an 81-satellite constellation into low Earth orbit. It plans to use the constellation to harvest sunlight to power the next generation of AI data centers in space. So, instead of beaming power back to Earth, the constellation would beam data back to Earth. For example, if you asked a chatbot how to bake sourdough bread, instead of firing up a data center in Virginia to craft a response, your query would be beamed up to the constellation in space, processed by chips running purely on solar energy, and the recipe sent back down to your device. Doing so would mean leaving the substantial heat generated behind in the cold vacuum of space. As a technology entrepreneur, I applaud Google's ambitious plan. But as a space scientist, I predict that the company will soon have to reckon with a growing problem: space debris.
The mathematics of disaster
Space debris - the collection of defunct human-made objects in Earth's orbit - is already affecting space agencies, companies and astronauts. This debris includes large pieces, such as spent rocket stages and dead satellites, as well as tiny flecks of paint and other fragments from discontinued satellites. 
Space debris travels at hypersonic speeds of approximately 17,500 miles per hour (28,000 km/h) in low Earth orbit. At this speed, colliding with a piece of debris the size of a blueberry would feel like being hit by a falling anvil. Satellite breakups and anti-satellite tests have created an alarming amount of debris, a crisis now exacerbated by the rapid expansion of commercial constellations such as SpaceX's Starlink. The Starlink network has more than 7,500 satellites, which provide global high-speed internet. The U.S. Space Force actively tracks over 40,000 objects larger than a softball using ground-based radar and optical telescopes. However, this number represents less than 1% of the lethal objects in orbit. The majority are too small for these telescopes to reliably identify and track. In November 2025, three Chinese astronauts aboard the Tiangong space station were forced to delay their return to Earth because their capsule had been struck by a piece of space debris. Back in 2018, a similar incident on the International Space Station challenged relations between the United States and Russia, as Russian media speculated that a NASA astronaut may have deliberately sabotaged the station. The orbital shell Google's project targets - a Sun-synchronous orbit approximately 400 miles (650 kilometers) above Earth - is a prime location for uninterrupted solar energy. At this orbit, the spacecraft's solar arrays will always be in direct sunshine, where they can generate electricity to power the onboard AI payload. But for this reason, Sun-synchronous orbit is also the single most congested highway in low Earth orbit, and objects in this orbit are the most likely to collide with other satellites or debris. As new objects arrive and existing objects break apart, low Earth orbit could approach Kessler syndrome. In this theory, once the number of objects in low Earth orbit exceeds a critical threshold, collisions between objects generate a cascade of new debris. 
Eventually, this cascade of collisions could render certain orbits entirely unusable.
Implications for Project Suncatcher
Project Suncatcher proposes a cluster of satellites carrying large solar panels. They would fly in a formation with a radius of just one kilometer, with neighboring nodes spaced less than 200 meters apart. To put that in perspective, imagine a racetrack roughly the size of the Daytona International Speedway, where 81 cars race at 17,500 miles per hour - while separated by gaps about the distance you need to safely brake on the highway. This ultradense formation is necessary for the satellites to transmit data to each other. The constellation splits complex AI workloads across all its 81 units, enabling them to "think" and process data simultaneously as a single, massive, distributed brain. Google is partnering with a space company to launch two prototype satellites by early 2027 to validate the hardware. But in the vacuum of space, flying in formation is a constant battle against physics. While the atmosphere in low Earth orbit is incredibly thin, it is not empty. Sparse air particles create orbital drag on satellites - this force pushes against the spacecraft, slowing it down and forcing it to drop in altitude. Satellites with large surface areas have more issues with drag, as they can act like a sail catching the wind. To add to this complexity, streams of particles and magnetic fields from the Sun - known as space weather - can cause the density of air particles in low Earth orbit to fluctuate in unpredictable ways. These fluctuations directly affect orbital drag. When satellites are spaced less than 200 meters apart, the margin for error evaporates. A single impact could not only destroy one satellite but send its fragments blasting into its neighbors, triggering a cascade that could wipe out the entire cluster and randomly scatter millions of new pieces of debris into an orbit that is already a minefield. 
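Two of the numbers above can be sanity-checked with simple physics. The blueberry (~1.5 g) and anvil (~45 kg) masses are assumptions on my part, and the single-ring arrangement of the 81 satellites is a simplification of whatever formation Google actually flies; the orbital speed matches the ~17,500 mph figure cited.

```python
import math

# 1. Kinetic energy of blueberry-sized debris at low-Earth-orbit speed.
BLUEBERRY_KG = 0.0015                     # assumed mass of a blueberry
ORBITAL_V = 7_800.0                       # m/s, roughly 17,500 mph
ke = 0.5 * BLUEBERRY_KG * ORBITAL_V**2    # kinetic energy in joules
drop_height = ke / (45.0 * 9.81)          # height giving a 45 kg anvil the same energy
print(f"Blueberry at orbital speed: ~{ke/1000:.0f} kJ, "
      f"like a 45 kg anvil dropped from ~{drop_height:.0f} m")

# 2. Nearest-neighbor spacing if all 81 satellites shared a 1 km-radius ring.
spacing = 2 * math.pi * 1_000.0 / 81
print(f"Ring spacing: ~{spacing:.0f} m (consistent with 'less than 200 meters apart')")
```

Both checks come out consistent with the article's comparisons: tens of kilojoules for a blueberry-sized fragment, and well under 200 meters between ring neighbors.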
The importance of active avoidance
To prevent crashes and cascades, satellite companies could adopt a "leave no trace" standard, which means designing satellites that do not fragment, release debris or endanger their neighbors, and that can be safely removed from orbit. For a constellation as dense and intricate as Suncatcher, meeting this standard might require equipping the satellites with "reflexes" that autonomously detect and dance through a debris field. Suncatcher's current design doesn't include these active avoidance capabilities. In the first six months of 2025 alone, SpaceX's Starlink constellation performed a staggering 144,404 collision-avoidance maneuvers to dodge debris and other spacecraft. Similarly, Suncatcher would likely encounter debris larger than a grain of sand every five seconds. Today's object-tracking infrastructure is generally limited to debris larger than a softball, leaving millions of smaller debris pieces effectively invisible to satellite operators. Future constellations will need an onboard detection system that can actively spot these smaller threats and maneuver the satellite autonomously in real time. Equipping Suncatcher with active collision avoidance capabilities would be an engineering feat. Because of the tight spacing, the constellation would need to respond as a single entity. Satellites would need to reposition in concert, similar to a synchronized flock of birds. Each satellite would need to react to the slightest shift of its neighbor.
Paying rent for the orbit
Technological solutions, however, can go only so far. In September 2022, the Federal Communications Commission created a rule requiring satellite operators to remove their spacecraft from orbit within five years of the mission's completion. This typically involves a controlled de-orbit maneuver. 
Operators must now reserve enough fuel to fire the thrusters at the end of the mission to lower the satellite's altitude, until atmospheric drag takes over and the spacecraft burns up in the atmosphere. However, the rule does not address the debris already in space, nor any future debris, from accidents or mishaps. To tackle these issues, some policymakers have proposed a use-tax for space debris removal. A use-tax or orbital-use fee would charge satellite operators a levy based on the orbital stress their constellation imposes, much like larger or heavier vehicles paying greater fees to use public roads. These funds would finance active debris removal missions, which capture and remove the most dangerous pieces of junk. Avoiding collisions is a temporary technical fix, not a long-term solution to the space debris problem. As some companies look to space as a new home for data centers, and others continue to send satellite constellations into orbit, new policies and active debris removal programs can help keep low Earth orbit open for business.
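The Starlink maneuver count cited earlier is constellation-wide; it is worth converting into a per-satellite rate. The ~7,500 active spacecraft and 181-day half-year below are assumptions used only for scale.

```python
# Per-satellite implication of Starlink's 144,404 collision-avoidance
# maneuvers in the first half of 2025, assuming ~7,500 active satellites.

MANEUVERS = 144_404
SATELLITES = 7_500     # assumed active fleet size over the period
DAYS = 181             # days in the first half of 2025

per_sat_per_day = MANEUVERS / SATELLITES / DAYS
print(f"~{per_sat_per_day:.2f} avoidance maneuvers per satellite per day")
print(f"i.e. roughly one dodge every {1 / per_sat_per_day:.0f} days per satellite")
```

Roughly one maneuver per satellite every week and a half may sound manageable, but in a formation spaced under 200 meters, every such dodge has to be coordinated across the whole cluster.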
[4]
Startup announces 'Galactic Brain' project to put AI data centers in orbit
California-based company Aetherflux wants to make space-based solar power a reality. (Image credit: Aetherflux) Aetherflux, a space-based solar technology company hoping to beam power down to Earth, is now throwing its hat into the orbital data center ring. The company's new "Galactic Brain" project aims to speed up artificial intelligence data center production processes that are hampered on Earth by energy requirements and construction timelines by launching a constellation of solar-powered satellites capable of the same computing power without the bulky infrastructure. "The race for artificial general intelligence is fundamentally a race for compute capacity, and by extension, energy," Aetherflux founder and CEO Baiju Bhatt, who also co-founded the financial services company Robinhood, said in a press release today (Dec. 9). The company hopes to launch the first node of its Galactic Brain constellation sometime in the first quarter of 2027. The announcement comes as AI computing power needs are rapidly increasing and major companies with skin already in the game such as OpenAI, Google and Amazon have begun seriously considering orbital solutions for their own computing needs. "Satellites with localized AI compute, where just the results are beamed back from low-latency, sun-synchronous orbit, will be the lowest cost way to generate AI bitstreams in <3 years," SpaceX CEO Elon Musk said about the topic in a Dec. 7 post on X. Since its founding in 2024, California-based Aetherflux has focused its efforts on space-based solar power, with the goal of "building an American power grid in space," and the company's new Galactic Brain initiative fits neatly into that vision. 
"Continuous solar power and advanced thermal systems remove the limits faced by Earth-based data centers," the Aetherflux release said, describing the company's planned power-beaming capabilities as "foundational" to its orbital data center initiative, "enabling energy collected in space to support not only compute in orbit but also power delivery on Earth." According to Aetherflux's design, many small satellites will transmit energy through infrared lasers to ground stations, where the power and data can be subsequently distributed. "We anticipate powerbeaming to be dramatically more reliable than current solar power generation on the ground," the release said.
[5]
Data centres in space: will 2027 really be the year AI goes to orbit?
Anglia Ruskin University (ARU) provides funding as a member of The Conversation UK. Google recently unveiled Project Suncatcher, a research "moonshot" aiming to build a data centre in space. The tech giant plans to use a constellation of solar-powered satellites which would run on its own TPU chips and transmit data to one another via lasers. Google's TPU chips (tensor processing units), which are specially designed for machine learning, are already powering Google's latest AI model, Gemini 3. Project Suncatcher will explore whether they can be adapted to survive radiation and temperature extremes and operate reliably in orbit. It aims to deploy two prototype satellites into low Earth orbit, some 400 miles above the Earth, in early 2027. Google's rivals are also exploring space-based computing. Elon Musk has said that SpaceX "will be doing data centres in space", suggesting that the next generation of Starlink satellites could be scaled up to host such processing. Several smaller firms, including a US startup called Starcloud, have also announced plans to launch satellites equipped with the GPU chips (graphics processing units) that are used in most AI systems. The logic of data centres in space is that they avoid many of the issues with their Earth-based equivalents, particularly around power and cooling. Space systems have a much lower environmental footprint and it's potentially easier to make them bigger. As Google CEO Sundar Pichai has said: "We will send tiny, tiny racks of machines and have them in satellites, test them out, and then start scaling from there ... There is no doubt to me that, a decade or so away, we will be viewing it as a more normal way to build data centres." Assuming Google does manage to launch a prototype in 2027, will it simply be a high-stakes technical experiment - or the dawning of a new era? 
The scale of the challenge
I wrote an article for The Conversation at the start of 2025 laying out the challenges of putting data centres into space, in which I was cautious about them happening soon. Now, of course, Project Suncatcher represents a concrete programme rather than just an idea. This clarity, with a defined goal, launch date and hardware, marks a significant shift. The satellites' orbits will be "sun-synchronous", meaning they'll always be flying over places at sunset or sunrise so that they can capture sunlight nearly continuously. According to Google, solar arrays in such orbits can generate significantly more energy per panel than typical installations on Earth because they avoid losing sunlight to clouds, the atmosphere and nighttime. The TPU tests will be fascinating. Whereas hardware designed for space normally needs to be heavily shielded against radiation and extreme temperatures, Google is using the same chips used in its Earth data centres. The company has already done laboratory tests exposing the chips to radiation from a proton beam that suggest they can tolerate almost three times the dose they'll receive in space. This is very promising, but maintaining reliable performance for years amid solar storms, debris and temperature swings is a far harder test. Another challenge lies in thermal management. On Earth, servers are cooled with air or water. In space, there is no air and no straightforward way to dissipate heat. All heat must be removed through radiators, which often become among the largest and heaviest parts of a spacecraft. Nasa studies show that radiators can account for more than 40% of total power system mass at high power levels. Designing a compact system that can keep dense AI hardware within safe temperatures is one of the most difficult aspects of the Suncatcher concept. A space-based data centre must also replicate the high bandwidth, low latency network fabric of terrestrial data centres. 
If Google's proposed laser communication system (optical networking) is going to work at the multi-terabit capacity required, there are major engineering hurdles involved. These include maintaining the necessary alignment between fast-moving satellites and coping with orbital drift, where satellites move out of their intended orbit. The satellites will also have to sustain reliable ground links back on Earth and overcome weather disruptions. If a space data centre is to be viable for the long term, it will be vital that it avoids early failures. Maintenance is another unresolved issue. Terrestrial data centres rely on continual hardware servicing and upgrades. In orbit, repairs would require robotic servicing or additional missions, both of which are costly and complex. Then there is the uncertainty around economics. Space-based computing becomes viable only at scale, and only if launch costs fall significantly. Google's Project Suncatcher paper suggests that launch costs could drop below US$200 (£151) per kilogram by the mid-2030s, seven or eight times cheaper than today. That would put construction costs on par with some equivalent facilities on Earth. But if satellites require early replacement or if radiation shortens their lifespan, the numbers could look quite different. In short, a two-satellite test mission by 2027 sounds plausible. It could validate whether TPUs survive radiation and thermal stress, whether solar power is stable and whether the laser communication system performs as expected. However, even a successful demonstration would only be the first step. It would not show that large-scale orbital data centres are feasible. Full-scale systems would require solving all the challenges outlined above. If adoption occurs at all, it is likely to unfold over decades. 
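Google's claim that orbital solar arrays generate significantly more energy per panel than ground installations can be roughed out under simple assumptions: a solar constant of ~1361 W/m² above the atmosphere, near-continuous illumination in a dawn-dusk sun-synchronous orbit, and a typical terrestrial panel seeing ~1000 W/m² at peak with a ~20% annual capacity factor. The illumination fraction and capacity factor here are my assumptions, not Google's figures.

```python
# Rough orbit-vs-ground comparison of average solar power per panel area.
# Panel efficiency cancels out if assumed equal on both sides.

SOLAR_CONSTANT = 1361.0        # W/m^2 above the atmosphere
ORBIT_ILLUMINATION = 0.99      # assumed sunlit fraction in a dawn-dusk orbit
GROUND_PEAK = 1000.0           # W/m^2 at the surface at clear noon
GROUND_CAPACITY_FACTOR = 0.20  # assumed annual average for a good site

orbit_avg = SOLAR_CONSTANT * ORBIT_ILLUMINATION
ground_avg = GROUND_PEAK * GROUND_CAPACITY_FACTOR
ratio = orbit_avg / ground_avg
print(f"Orbit/ground average-power ratio per panel area: ~{ratio:.1f}x")
```

A several-fold advantage per square meter of panel is the physical basis of the whole concept; the open question is whether it survives the launch, radiator and replacement costs discussed above.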
For now, space-based computing remains what Google itself calls it, a moonshot: ambitious and technically demanding, but one that could reshape the future of AI infrastructure, not to mention our relationship with the cosmos around us.
[6]
Sundar Pichai says Google will deploy solar-powered data centers in space by 2027
Serving tech enthusiasts for over 25 years. TechSpot means tech analysis and advice you can trust. Looking ahead: Google may not be taking as many audacious moonshots as it did in its early years, but it remains at the forefront of global technological transformation, shaping how people work, communicate, and live. The company is now planning to revolutionize cloud computing services by deploying solar-powered data centers in space by 2027. In an interview with Fox News over the weekend, Google CEO Sundar Pichai discussed the recently announced Project Suncatcher, which aims to find more efficient ways to power energy-hungry data centers by harnessing solar energy and potentially making them more sustainable than traditional facilities. The space-based data centers are expected to go online in a limited capacity in 2027, with Google planning to send "tiny racks of machines" into orbit on two prototype satellites through a tie-up with Planet. Pichai added that he expects extraterrestrial data centers to become fairly common within ten years, with companies building giant gigawatt-scale facilities in space to power the AI boom. Google announced Project Suncatcher last month, describing it as the best solution to the enormous power requirements of AI data centers. According to the company, the initiative is a "research moonshot" that will leverage solar energy to run satellite swarms powered by Tensor Processing Units (TPUs) that communicate with one another over laser links instead of fiber. A 2024 Lawrence Berkeley National Laboratory report found that US data centers already use more than 4% of the country's electricity - putting a massive strain on power grids, driving up prices, and fueling protests in rural communities. Alarmingly, that figure is expected to rise to 12% by 2028 amid the continuing AI boom. 
Google believes that scaling machine-learning compute in space will not only help meet AI data centers' insatiable demand for power, but also do so sustainably by substituting thermal power with solar. The company is highly optimistic about the project and has published a preprint paper outlining plans to launch an interconnected network of solar-powered satellites running its TPU-based AI chips. Google is not the only company bullish on data centers in space. Amazon founder Jeff Bezos recently claimed that space-based solar data centers will become common within the next 20 years and that they will also be more cost-effective than traditional data centers on Earth.
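The Berkeley Lab projection cited above implies a striking growth rate: going from ~4% of US electricity in 2024 to ~12% by 2028 means the data-center share triples in four years. A quick check of the compound rate (share only; absolute consumption grows faster still if total grid demand also rises):

```python
# Implied compound annual growth of the US data-center electricity share,
# from ~4% in 2024 to a projected ~12% in 2028.

share_2024, share_2028, years = 0.04, 0.12, 4
cagr = (share_2028 / share_2024) ** (1 / years) - 1
print(f"Implied growth of the data-center share: ~{cagr:.0%} per year")
```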
[7]
Google, Amazon, and xAI want to launch AI into space
Having AI overhead could enhance connectivity for everything from remote internet access to disaster response. In the span of just a few months, the push to put artificial intelligence in space has progressed from a long-term dream to an immediate, very real strategic priority. Google's Project Suncatcher, Amazon's Leo satellite internet constellation, and Elon Musk's xAI exploration into space-based compute environments all point toward the same thing: the next great leap for AI might not happen on land, but in low Earth orbit. Outrageous as it may seem, there's a lot of real engineering underneath the glossy press releases and visionary quotes. The efforts are spurred on by the very real infrastructure crunch faced by AI developers as models expand and demand skyrockets. It's intense enough that the data centers, fiber networks, and power grids supporting the world's digital spine are starting to show strain. New energy sources struggle to keep up. And that's before factoring in additional motivations like latency, climate risk, and political barriers. Google's play, Project Suncatcher, is aiming to build orbital compute nodes powered by near-constant solar exposure and cooled by radiating heat into the vacuum of space. The idea is that these sun-drenched satellites full of Google's Tensor Processing Units could eventually run machine learning models more efficiently than ground-based data centers, especially for tasks that don't require real-time human interaction. Solar panels work better in orbit, waste heat can be radiated to deep space, and there's no storm or blackout to knock them offline. With Amazon Leo, the company is building out a global broadband network of thousands of low Earth orbit satellites that will eventually link to cloud and AI infrastructure. Some of those satellites may one day support edge computing for AI tasks in places with limited or no access to the cloud. 
Meanwhile, Elon Musk is sketching out concepts for orbital compute farms for xAI and SpaceX to tackle. They wouldn't just run models, but train them. That's a much harder technical challenge, but one that might make sense for resource-intensive tasks that benefit from uninterrupted energy and physical isolation. If you're trying to train a multi-trillion-parameter model without bumping into terrestrial bandwidth caps or infrastructure bottlenecks, space starts looking pretty good. These projects could make a huge difference for a lot of people. Rural school systems could access fast cloud tools, and weather monitoring systems could use real-time orbital AI to predict flash floods and help reroute aid. And with solar-powered nodes running in space, companies might rely less on carbon-heavy terrestrial grids. Space-based energy providers have been discussed since before there was a space program. It might be that AI demand is the tipping point to invest in such a project. Of course, space is far from forgiving or cheap to operate from. Launching hardware is expensive, and radiation shielding is hard. Coordinating thousands of satellites can cause orbital traffic jams. There's also the question of who owns the infrastructure, who gets to use it, and whether it becomes yet another layer of centralized control in the tech ecosystem. Governments, naturally, are watching closely. From a user perspective, though, the shift may be mostly invisible at first. You won't log into a 'space version' of your favorite app, but you might notice things loading faster, and you might start seeing services in previously unconnected parts of the world. Orbital AI won't replace Earth-based systems any time soon, but it might become a floating scaffold of intelligence designed to supplement and stabilize the digital terrain, even if it's hundreds of miles above any actual terrain.
[8]
Google's plan to put data centers in the sky faces thousands of (little) problems: space junk | Fortune
The rapid expansion of artificial intelligence and cloud services has led to a massive demand for computing power. The surge has strained data infrastructure, which requires lots of electricity to operate. A single, medium-sized data center here on Earth can consume enough electricity to power about 16,500 homes, with even larger facilities using as much as a small city. Over the past few years, tech leaders have increasingly advocated for space-based AI infrastructure as a way to address the power requirements of data centers. In space, sunshine - which solar panels can convert into electricity - is abundant and reliable. On Nov. 4, 2025, Google unveiled Project Suncatcher, a bold proposal to launch an 81-satellite constellation into low Earth orbit. It plans to use the constellation to harvest sunlight to power the next generation of AI data centers in space. So, instead of beaming power back to Earth, the constellation would beam data back to Earth. For example, if you asked a chatbot how to bake sourdough bread, instead of firing up a data center in Virginia to craft a response, your query would be beamed up to the constellation in space, processed by chips running purely on solar energy, and the recipe sent back down to your device. Doing so would mean leaving the substantial heat generated behind in the cold vacuum of space. As a technology entrepreneur, I applaud Google's ambitious plan. But as a space scientist, I predict that the company will soon have to reckon with a growing problem: space debris. Space debris - the collection of defunct human-made objects in Earth's orbit - is already affecting space agencies, companies and astronauts. This debris includes large pieces, such as spent rocket stages and dead satellites, as well as tiny flecks of paint and other fragments from discontinued satellites. Space debris travels at hypersonic speeds of approximately 17,500 miles per hour (28,000 km/h) in low Earth orbit. 
At this speed, colliding with a piece of debris the size of a blueberry would feel like being hit by a falling anvil. Satellite breakups and anti-satellite tests have created an alarming amount of debris, a crisis now exacerbated by the rapid expansion of commercial constellations such as SpaceX's Starlink. The Starlink network has more than 7,500 satellites, which provide global high-speed internet. The U.S. Space Force actively tracks over 40,000 objects larger than a softball using ground-based radar and optical telescopes. However, this number represents less than 1% of the lethal objects in orbit. The majority are too small for these telescopes to reliably identify and track. In November 2025, three Chinese astronauts aboard the Tiangong space station were forced to delay their return to Earth because their capsule had been struck by a piece of space debris. Back in 2018, a similar incident on the International Space Station challenged relations between the United States and Russia, as Russian media speculated that a NASA astronaut may have deliberately sabotaged the station. The orbital shell Google's project targets - a Sun-synchronous orbit approximately 400 miles (650 kilometers) above Earth - is a prime location for uninterrupted solar energy. At this orbit, the spacecraft's solar arrays will always be in direct sunshine, where they can generate electricity to power the onboard AI payload. But for this reason, Sun-synchronous orbit is also the single most congested highway in low Earth orbit, and objects in this orbit are the most likely to collide with other satellites or debris. As new objects arrive and existing objects break apart, low Earth orbit could approach Kessler syndrome. In this theory, once the number of objects in low Earth orbit exceeds a critical threshold, collisions between objects generate a cascade of new debris. Eventually, this cascade of collisions could render certain orbits entirely unusable. 
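The blueberry-and-anvil comparison can be sanity-checked with the kinetic-energy formula E = ½mv². A rough sketch, assuming a ~5 g blueberry-sized fragment and a 20 kg anvil (neither mass appears in the article):

```python
# Back-of-envelope check of the blueberry/anvil comparison.
# The masses below are illustrative assumptions, not article figures.

debris_mass_kg = 0.005           # ~5 g, blueberry-sized fragment (assumption)
orbital_speed_m_s = 7800         # ~17,500 mph in low Earth orbit

kinetic_energy_j = 0.5 * debris_mass_kg * orbital_speed_m_s**2

# Height from which a 20 kg anvil must fall to carry the same energy:
anvil_mass_kg = 20.0             # assumption
g = 9.81                         # m/s^2
equivalent_drop_m = kinetic_energy_j / (anvil_mass_kg * g)

print(f"debris kinetic energy: {kinetic_energy_j / 1000:.0f} kJ")
print(f"equivalent anvil drop height: {equivalent_drop_m:.0f} m")
```

Under these assumptions the fragment carries roughly 150 kJ, the energy of an anvil dropped from several hundred meters, so the analogy is, if anything, conservative.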
Project Suncatcher proposes a cluster of satellites carrying large solar panels. The cluster would fly within a radius of just one kilometer, with each node spaced less than 200 meters from its neighbors. To put that in perspective, imagine a racetrack roughly the size of the Daytona International Speedway, where 81 cars race at 17,500 miles per hour while separated by gaps about the distance you need to safely brake on the highway. This ultradense formation is necessary for the satellites to transmit data to each other. The constellation splits complex AI workloads across all 81 units, enabling them to "think" and process data simultaneously as a single, massive, distributed brain. Google is partnering with a space company to launch two prototype satellites by early 2027 to validate the hardware.

But in the vacuum of space, flying in formation is a constant battle against physics. While the atmosphere in low Earth orbit is incredibly thin, it is not empty. Sparse air particles create orbital drag on satellites: this force pushes against the spacecraft, slowing it down and forcing it to drop in altitude. Satellites with large surface areas suffer more from drag, as they can act like a sail catching the wind. To add to this complexity, streams of particles and magnetic fields from the Sun, known as space weather, can cause the density of air particles in low Earth orbit to fluctuate in unpredictable ways. These fluctuations directly affect orbital drag. When satellites are spaced less than 200 meters apart, the margin for error evaporates. A single impact could not only destroy one satellite but send debris blasting into its neighbors, triggering a cascade that could wipe out the entire cluster and scatter millions of new fragments into an orbit that is already a minefield.
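The drag described here follows the standard flat-plate drag law, F = ½ρv²C_dA. A minimal sketch of the daily velocity loss for a large-panel satellite at this altitude, where the density, panel area, and mass are all illustrative assumptions rather than Suncatcher specifications (density at 650 km can swing by an order of magnitude with space weather, which is exactly why station-keeping is hard):

```python
import math

# Drag deceleration sketch for a large-panel satellite at ~650 km.
# rho, drag_coeff, panel_area_m2, and sat_mass_kg are assumptions.

MU_EARTH = 3.986e14              # m^3/s^2, Earth's gravitational parameter
R_EARTH = 6.371e6                # m, mean Earth radius
altitude_m = 650e3

rho = 1e-13                      # kg/m^3, rough quiet-Sun density (assumption)
drag_coeff = 2.2                 # typical flat-plate satellite value
panel_area_m2 = 100.0            # large solar array (assumption)
sat_mass_kg = 500.0              # assumption

v = math.sqrt(MU_EARTH / (R_EARTH + altitude_m))   # circular orbital speed
drag_force = 0.5 * rho * v**2 * drag_coeff * panel_area_m2
decel = drag_force / sat_mass_kg

print(f"orbital speed: {v:.0f} m/s")
print(f"drag force: {drag_force * 1e3:.3f} mN")
print(f"velocity lost per day: {decel * 86400:.3f} m/s")
```

Even a sub-millinewton force removes on the order of a tenth of a meter per second of velocity each day, and since the force scales with the fluctuating density, each satellite in the cluster decays at a slightly different, unpredictable rate.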
To prevent crashes and cascades, satellite companies could adopt a leave no trace standard, which means designing satellites that do not fragment, release debris or endanger their neighbors, and that can be safely removed from orbit. For a constellation as dense and intricate as Suncatcher, meeting this standard might require equipping the satellites with "reflexes" that autonomously detect and dance through a debris field. Suncatcher's current design doesn't include these active avoidance capabilities. In the first six months of 2025 alone, SpaceX's Starlink constellation performed a staggering 144,404 collision-avoidance maneuvers to dodge debris and other spacecraft. Similarly, Suncatcher would likely encounter debris larger than a grain of sand every five seconds. Today's object-tracking infrastructure is generally limited to debris larger than a softball, leaving millions of smaller debris pieces effectively invisible to satellite operators. Future constellations will need an onboard detection system that can actively spot these smaller threats and maneuver the satellite autonomously in real time. Equipping Suncatcher with active collision avoidance capabilities would be an engineering feat. Because of the tight spacing, the constellation would need to respond as a single entity. Satellites would need to reposition in concert, similar to a synchronized flock of birds. Each satellite would need to react to the slightest shift of its neighbor. Technological solutions, however, can go only so far. In September 2022, the Federal Communications Commission created a rule requiring satellite operators to remove their spacecraft from orbit within five years of the mission's completion. This typically involves a controlled de-orbit maneuver. Operators must now reserve enough fuel to fire the thrusters at the end of the mission to lower the satellite's altitude, until atmospheric drag takes over and the spacecraft burns up in the atmosphere. 
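The Starlink figure implies a strikingly high per-satellite maneuver rate; dividing it out, using the fleet size and time window quoted above:

```python
# Per-satellite maneuver rate implied by the figures in the article:
# 144,404 collision-avoidance maneuvers in six months across ~7,500 satellites.

total_maneuvers = 144_404
fleet_size = 7_500               # approximate Starlink count from the article
period_days = 182.5              # six months

per_satellite = total_maneuvers / fleet_size
days_between = period_days / per_satellite

print(f"maneuvers per satellite in six months: {per_satellite:.1f}")
print(f"average days between maneuvers: {days_between:.1f}")
```

Roughly one dodge per satellite every ten days; in a formation where every satellite sits within 200 meters of a neighbor, each of those dodges would have to be choreographed across the whole cluster.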
However, the rule does not address the debris already in space, nor any future debris, from accidents or mishaps. To tackle these issues, some policymakers have proposed a use-tax for space debris removal. A use-tax or orbital-use fee would charge satellite operators a levy based on the orbital stress their constellation imposes, much like larger or heavier vehicles paying greater fees to use public roads. These funds would finance active debris removal missions, which capture and remove the most dangerous pieces of junk. Avoiding collisions is a temporary technical fix, not a long-term solution to the space debris problem. As some companies look to space as a new home for data centers, and others continue to send satellite constellations into orbit, new policies and active debris removal programs can help keep low Earth orbit open for business.
[9]
Google CEO Sundar Pichai says data centers in space are coming
"We are taking our first step in '27," Pichai told Fox News. "We'll send tiny, tiny racks of machines, and have them in satellites, test them out, and then start scaling from there." The space pitch arrives when Earth is starting to look like a bad long-term landlord for the AI build-out. A 2024 Lawrence Berkeley National Laboratory report found that U.S. data centers already chew through about 4.4% of the country's electricity, and that share could climb to as much as 12% by 2028 as GPU farms multiply. McKinsey puts a price tag on the race to scale data centers: roughly $6.7 trillion in global data center capex by 2030, about $5 trillion of that aimed at AI-ready infrastructure. At some point, "just build another region" stops being a strategy and starts being an electrical engineering problem.
[10]
Google's Project Suncatcher could make the space debris problem a lot worse
Google unveiled Project Suncatcher, planning to launch solar-powered satellites carrying AI chips by 2027 to address artificial intelligence's soaring energy demands. The tech giant joins a growing wave of companies exploring orbital data centers, but experts warn of significant challenges including space debris risks, cooling systems in vacuum, and launch costs that need to drop below $200 per kilogram to become viable.
Google CEO Sundar Pichai believes space data centers could become "a more normal way to build data centers" within a decade, signaling a major shift in how tech companies approach artificial intelligence infrastructure [2]. The company's Project Suncatcher represents the most concrete step yet toward this vision, with plans to launch two prototype satellites carrying custom AI server chips into low Earth orbit by early 2027 [1]. The initiative aims to tackle a pressing problem: data centers will account for nearly half of U.S. electricity demand growth between now and 2030, with global power requirements potentially doubling by decade's end as companies train larger AI models [1].
Source: Fast Company
The appeal of AI data centers in orbit lies in their access to practically unlimited solar energy without interruption from cloudy skies or nighttime darkness [1]. Google's orbital data center concept envisions an 81-satellite constellation in Sun-synchronous orbit, positioned approximately 400 miles above Earth, where spacecraft always fly over places at sunset or sunrise [3]. This positioning means solar arrays receive nearly continuous sunlight and gain efficiency advantages outside Earth's atmosphere [5]. When someone asks a chatbot how to bake sourdough bread, instead of firing up a data center in Virginia, the query would be beamed to the constellation in space, processed by chips running purely on solar energy, and the recipe sent back down [3].

Source: TechSpot
Google isn't alone in pursuing this vision. The startup Starcloud successfully launched a 60-kilogram satellite carrying an NVIDIA H100 GPU as a prelude to an orbital data center expected to require five gigawatts of electric power by 2035 [1]. The satellite is operational and undergoing commissioning to verify all functions work properly [2]. Meanwhile, Aetherflux announced its "Galactic Brain" project, aiming to launch the first node of its constellation in the first quarter of 2027 [4]. China has begun launching spacecraft for a Xingshidai "space data center" constellation, and the European Union is studying similar ideas under a project known as ASCEND [1]. SpaceX CEO Elon Musk stated that "satellites with localized AI compute, where just the results are beamed back from low-latency, sun-synchronous orbit, will be the lowest cost way to generate AI bitstreams in <3 years" [4].
The increasing energy demands of AI push companies toward space, but significant engineering challenges remain. Computing hardware must be protected from cosmic radiation through either shielding or error-correcting software [1]. Google's approach uses the same TPU chips deployed in its Earth data centers rather than heavily shielded space-grade hardware. Laboratory tests exposing these chips to proton-beam radiation suggest they can tolerate almost three times the dose they will receive in space [5]. However, maintaining reliable performance for years amid solar storms, debris, and temperature swings presents a far harder test.

Cooling in a vacuum poses another major obstacle. On Earth, servers are cooled with air or water, but in space there is no air to carry heat away [2]. Orbital platforms need large radiators that dump heat into the vacuum through thermal radiation, adding significant mass that must be launched on rockets [1]. NASA studies show radiators can account for more than 40% of total power-system mass at high power levels [5]. Starcloud says it is developing "a lightweight deployable radiator design with a very large area -- by far the largest radiators deployed in space" [2].
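The radiator-mass problem follows directly from the Stefan-Boltzmann law: in vacuum, a surface rejects heat only by radiating A·εσT⁴. A minimal sizing sketch, assuming a 1 MW heat load, an emissivity of 0.9, and a 300 K radiating temperature (all illustrative values, and ignoring sunlight and Earthshine absorbed by the panel):

```python
# Radiator area needed to reject waste heat in vacuum, from the
# Stefan-Boltzmann law: A = P / (eps * sigma * T^4).
# Heat load, emissivity, and temperature below are assumptions.

SIGMA = 5.670e-8                 # W/m^2/K^4, Stefan-Boltzmann constant

heat_load_w = 1e6                # 1 MW of waste heat (assumption)
emissivity = 0.9                 # typical high-emissivity coating
radiator_temp_k = 300.0          # assumed radiating temperature

area_m2 = heat_load_w / (emissivity * SIGMA * radiator_temp_k**4)
print(f"required radiating area: {area_m2:.0f} m^2")
```

Under these assumptions, each megawatt of computing needs a couple of thousand square meters of radiator, which is why radiator mass dominates high-power orbital designs.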
Rocket launch costs alone pose a significant challenge to building large orbital data centers, not to mention the need to replace onboard chips every five to six years [1]. "Launch costs are dropping with reusable rockets, but we would still require a very large number of launches to build orbital data centers that are competitive with those on Earth," says Benjamin Lee, a computer architect at the University of Pennsylvania [1]. Google's Suncatcher team estimates that liftoff costs would need to fall to under $200 per kilogram by 2035 for their vision to make economic sense, seven or eight times cheaper than today [1][5].
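Working backward from the quoted target, a quick arithmetic check of what "seven or eight times cheaper" implies about today's launch prices:

```python
# Implied launch prices behind the "$200/kg by 2035" target quoted above.

target_cost_per_kg = 200.0
improvement_low, improvement_high = 7, 8   # "seven or eight times cheaper"

today_low = target_cost_per_kg * improvement_low    # implied current floor
today_high = target_cost_per_kg * improvement_high  # implied current ceiling

# Cost to orbit one 60 kg satellite (the Starcloud-class mass from the
# article) at the target price:
sat_mass_kg = 60
print(f"implied current price: ${today_low:,.0f}-${today_high:,.0f} per kg")
print(f"60 kg satellite at target price: ${sat_mass_kg * target_cost_per_kg:,.0f}")
```

The estimate thus pegs today's effective price at roughly $1,400 to $1,600 per kilogram, consistent with the gap the Suncatcher team says reusable rockets must still close.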
Project Suncatcher targets Sun-synchronous orbit, the single most congested highway in low Earth orbit, where objects are most likely to collide with other satellites or debris [3]. Space debris travels at hypersonic speeds of approximately 17,500 miles per hour, meaning a collision with a piece the size of a blueberry would feel like being hit by a falling anvil [3]. The U.S. Space Force actively tracks over 40,000 objects larger than a softball, representing less than 1% of lethal objects in orbit [3]. Google's proposed constellation would fly 81 satellites within a one-kilometer radius, each spaced less than 200 meters apart, like cars racing at 17,500 miles per hour separated by highway braking distances [3]. As new objects arrive and existing ones break apart, low Earth orbit could approach Kessler syndrome, in which collisions generate cascading debris that renders certain orbits unusable [3].
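The braking-distance analogy understates how little margin a 200-meter gap provides. A quick sketch of crossing times at two assumed relative speeds (slow drift between co-orbiting cluster members versus debris on a crossing orbit):

```python
# Time for an object to cross the formation's gaps at two assumed
# relative speeds. The speeds are illustrative: cluster members drift
# slowly relative to one another, while crossing-orbit debris can
# close at roughly 10 km/s.

gap_m = 200.0                    # node spacing from the article
cluster_diameter_m = 2000.0      # one-kilometer radius from the article

for label, rel_speed_m_s in [("co-orbital drift (1 m/s)", 1.0),
                             ("crossing debris (10 km/s)", 10_000.0)]:
    print(f"{label}: gap crossed in {gap_m / rel_speed_m_s:.3f} s, "
          f"cluster crossed in {cluster_diameter_m / rel_speed_m_s:.3f} s")
```

At debris closing speeds, the entire cluster is traversed in a fifth of a second, far too fast for ground-commanded avoidance, which is why onboard, autonomous reflexes come up repeatedly in these discussions.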
Source: The Conversation
While Starcloud estimates that a solar-powered space data center could achieve 10 times lower carbon emissions than a land-based data center powered by natural gas generators, researchers at Saarland University calculated that an orbital data center could still create an order of magnitude greater emissions than one on Earth [1]. Most of the extra emissions come from burning rocket stages and hardware on reentry, says Andreas Schmidt, a computer scientist and co-author of the "Dirty Bits in Low-Earth Orbit" paper [1]. The process forms pollutants that can further deplete Earth's protective ozone layer [1]. Aetherflux's power-beaming capabilities aim to enable energy collected in space to support not only compute in orbit but also power delivery on Earth through infrared lasers transmitted to ground stations [4].