8 Sources
[1]
Google CEO: Data Centers in Space Could Be the Norm in About a Decade
Running data centers in space might sound like science fiction, but Google's CEO is betting it'll become a practical way to handle AI's soaring energy demands, possibly within a "decade or so." "We want to put these data centers in space, closer to the Sun," allowing them to harness the solar energy in Earth's orbit, Sundar Pichai said in an interview with Fox News. "There's no doubt to me that in a decade or so...we'll be viewing it as a more normal way to build data centers." His statement signals that Google sees real promise in the concept. The tech giant debuted a prototype "research moonshot" called Project Suncatcher last month, and it plans on sending up a pair of test satellites in early 2027, each with custom AI server chips. From there, Google hopes to gradually scale up the technology, Pichai told Fox News. The concept might also spark a new space race, as Amazon founder Jeff Bezos and SpaceX CEO Elon Musk have also been discussing the possibility of operating data centers in Earth's orbit. In addition, a startup called Starcloud successfully sent up its own test satellite carrying an Nvidia H100 GPU into space. Starcloud CEO Philip Johnston told PCMag last month that the satellite is operational and has been undergoing the "commissioning" phase to verify that all functions are working properly. Still, a major challenge facing orbiting data centers is cooling the AI chips and protecting them from cosmic radiation. The vacuum of space means there's no air to carry the heat away. So, the AI chips will need a built-in cooling system that uses stored air or liquid. Starcloud says it's also developing a "lightweight deployable radiator design with a very large area -- by far the largest radiators deployed in space," with the goal of radiating the heat away toward deep space.
[2]
Google's proposed data center in orbit will face issues with space debris in an already crowded orbit
University of Michigan provides funding as a founding partner of The Conversation US. The rapid expansion of artificial intelligence and cloud services has led to a massive demand for computing power. The surge has strained data infrastructure, which requires lots of electricity to operate. A single, medium-sized data center here on Earth can consume enough electricity to power about 16,500 homes, with even larger facilities using as much as a small city. Over the past few years, tech leaders have increasingly advocated for space-based AI infrastructure as a way to address the power requirements of data centers. In space, sunshine - which solar panels can convert into electricity - is abundant and reliable. On Nov. 4, 2025, Google unveiled Project Suncatcher, a bold proposal to launch an 81-satellite constellation into low Earth orbit. It plans to use the constellation to harvest sunlight to power the next generation of AI data centers in space. So, instead of beaming power back to Earth, the constellation would beam data back to Earth. For example, if you asked a chatbot how to bake sourdough bread, instead of firing up a data center in Virginia to craft a response, your query would be beamed up to the constellation in space, processed by chips running purely on solar energy, and the recipe sent back down to your device. Doing so would mean leaving the substantial heat generated behind in the cold vacuum of space. As a technology entrepreneur, I applaud Google's ambitious plan. But as a space scientist, I predict that the company will soon have to reckon with a growing problem: space debris.

The mathematics of disaster

Space debris - the collection of defunct human-made objects in Earth's orbit - is already affecting space agencies, companies and astronauts. This debris includes large pieces, such as spent rocket stages and dead satellites, as well as tiny flecks of paint and other fragments from discontinued satellites.
Space debris travels at hypersonic speeds of approximately 17,500 miles per hour (28,000 km/h) in low Earth orbit. At this speed, colliding with a piece of debris the size of a blueberry would feel like being hit by a falling anvil. Satellite breakups and anti-satellite tests have created an alarming amount of debris, a crisis now exacerbated by the rapid expansion of commercial constellations such as SpaceX's Starlink. The Starlink network has more than 7,500 satellites, which provide global high-speed internet. The U.S. Space Force actively tracks over 40,000 objects larger than a softball using ground-based radar and optical telescopes. However, this number represents less than 1% of the lethal objects in orbit. The majority are too small for these telescopes to reliably identify and track. In November 2025, three Chinese astronauts aboard the Tiangong space station were forced to delay their return to Earth because their capsule had been struck by a piece of space debris. Back in 2018, a similar incident on the International Space Station challenged relations between the United States and Russia, as Russian media speculated that a NASA astronaut may have deliberately sabotaged the station. The orbital shell Google's project targets - a Sun-synchronous orbit approximately 400 miles (650 kilometers) above Earth - is a prime location for uninterrupted solar energy. At this orbit, the spacecraft's solar arrays will always be in direct sunshine, where they can generate electricity to power the onboard AI payload. But for this reason, Sun-synchronous orbit is also the single most congested highway in low Earth orbit, and objects in this orbit are the most likely to collide with other satellites or debris. As new objects arrive and existing objects break apart, low Earth orbit could approach Kessler syndrome. In this theory, once the number of objects in low Earth orbit exceeds a critical threshold, collisions between objects generate a cascade of new debris. 
Eventually, this cascade of collisions could render certain orbits entirely unusable.

Implications for Project Suncatcher

Project Suncatcher proposes a cluster of satellites carrying large solar panels. They would fly in a formation with a radius of just one kilometer, each node spaced less than 200 meters apart. To put that in perspective, imagine a racetrack roughly the size of the Daytona International Speedway, where 81 cars race at 17,500 miles per hour - while separated by gaps about the distance you need to safely brake on the highway. This ultradense formation is necessary for the satellites to transmit data to each other. The constellation splits complex AI workloads across all its 81 units, enabling them to "think" and process data simultaneously as a single, massive, distributed brain. Google is partnering with a space company to launch two prototype satellites by early 2027 to validate the hardware. But in the vacuum of space, flying in formation is a constant battle against physics. While the atmosphere in low Earth orbit is incredibly thin, it is not empty. Sparse air particles create orbital drag on satellites - this force pushes against the spacecraft, slowing it down and forcing it to drop in altitude. Satellites with large surface areas have more issues with drag, as they can act like a sail catching the wind. To add to this complexity, streams of particles and magnetic fields from the Sun - known as space weather - can cause the density of air particles in low Earth orbit to fluctuate in unpredictable ways. These fluctuations directly affect orbital drag. When satellites are spaced less than 200 meters apart, the margin for error evaporates. A single impact could not only destroy one satellite but send it blasting into its neighbors, triggering a cascade that could wipe out the entire cluster and randomly scatter millions of new pieces of debris into an orbit that is already a minefield.
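The "blueberry vs. falling anvil" comparison earlier in the article can be sanity-checked with a back-of-envelope kinetic-energy calculation. This is a rough sketch: the 10-gram fragment mass and the 30 kg anvil are illustrative assumptions, not figures from the article.

```python
# Back-of-envelope: kinetic energy of a small debris fragment at orbital speed.
# Assumed figures (not from the article): a ~10 g, blueberry-sized fragment.
speed_m_s = 28_000 * 1000 / 3600       # 28,000 km/h -> ~7,778 m/s
mass_kg = 0.010                        # assumed fragment mass

ke_joules = 0.5 * mass_kg * speed_m_s ** 2
print(f"Debris kinetic energy: {ke_joules / 1000:.0f} kJ")

# For comparison: a 30 kg anvil dropped from height h carries energy m * g * h.
anvil_kg, g = 30.0, 9.81
equivalent_drop_m = ke_joules / (anvil_kg * g)
print(f"Equivalent to a 30 kg anvil dropped from ~{equivalent_drop_m:.0f} m")
```

Even with these loose assumptions, a sub-ounce fragment at orbital speed carries roughly the energy of an anvil falling from a kilometer up, which is why untrackable small debris is treated as lethal.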
The importance of active avoidance

To prevent crashes and cascades, satellite companies could adopt a leave no trace standard, which means designing satellites that do not fragment, release debris or endanger their neighbors, and that can be safely removed from orbit. For a constellation as dense and intricate as Suncatcher, meeting this standard might require equipping the satellites with "reflexes" that autonomously detect and dance through a debris field. Suncatcher's current design doesn't include these active avoidance capabilities. In the first six months of 2025 alone, SpaceX's Starlink constellation performed a staggering 144,404 collision-avoidance maneuvers to dodge debris and other spacecraft. Similarly, Suncatcher would likely encounter debris larger than a grain of sand every five seconds. Today's object-tracking infrastructure is generally limited to debris larger than a softball, leaving millions of smaller debris pieces effectively invisible to satellite operators. Future constellations will need an onboard detection system that can actively spot these smaller threats and maneuver the satellite autonomously in real time. Equipping Suncatcher with active collision avoidance capabilities would be an engineering feat. Because of the tight spacing, the constellation would need to respond as a single entity. Satellites would need to reposition in concert, similar to a synchronized flock of birds. Each satellite would need to react to the slightest shift of its neighbor.

Paying rent for the orbit

Technological solutions, however, can go only so far. In September 2022, the Federal Communications Commission created a rule requiring satellite operators to remove their spacecraft from orbit within five years of the mission's completion. This typically involves a controlled de-orbit maneuver.
Operators must now reserve enough fuel to fire the thrusters at the end of the mission to lower the satellite's altitude, until atmospheric drag takes over and the spacecraft burns up in the atmosphere. However, the rule does not address the debris already in space, nor any future debris, from accidents or mishaps. To tackle these issues, some policymakers have proposed a use-tax for space debris removal. A use-tax or orbital-use fee would charge satellite operators a levy based on the orbital stress their constellation imposes, much like larger or heavier vehicles paying greater fees to use public roads. These funds would finance active debris removal missions, which capture and remove the most dangerous pieces of junk. Avoiding collisions is a temporary technical fix, not a long-term solution to the space debris problem. As some companies look to space as a new home for data centers, and others continue to send satellite constellations into orbit, new policies and active debris removal programs can help keep low Earth orbit open for business.
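The Starlink statistic quoted above (144,404 avoidance maneuvers in six months across roughly 7,500 satellites) implies a surprisingly routine per-satellite rate, which quick arithmetic makes concrete. The 181-day half-year is an assumption for the calculation:

```python
# Rate implied by the article's figures: 144,404 avoidance maneuvers across
# ~7,500 Starlink satellites in the first half of 2025 (~181 days, assumed).
maneuvers = 144_404
satellites = 7_500
days = 181

per_sat_per_day = maneuvers / satellites / days
print(f"~{per_sat_per_day:.3f} maneuvers per satellite per day")
print(f"i.e. roughly one maneuver per satellite every {1 / per_sat_per_day:.0f} days")
```

About one dodge per satellite every nine to ten days, for a constellation with far looser spacing than Suncatcher's proposed sub-200-meter gaps.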
[3]
Data centres in space: will 2027 really be the year AI goes to orbit?
Anglia Ruskin University (ARU) provides funding as a member of The Conversation UK. Google recently unveiled Project Suncatcher, a research "moonshot" aiming to build a data centre in space. The tech giant plans to use a constellation of solar-powered satellites which would run on its own TPU chips and transmit data to one another via lasers. Google's TPU chips (tensor processing units), which are specially designed for machine learning, are already powering Google's latest AI model, Gemini 3. Project Suncatcher will explore whether they can be adapted to survive radiation and temperature extremes and operate reliably in orbit. It aims to deploy two prototype satellites into low Earth orbit, some 400 miles above the Earth, in early 2027. Google's rivals are also exploring space-based computing. Elon Musk has said that SpaceX "will be doing data centres in space", suggesting that the next generation of Starlink satellites could be scaled up to host such processing. Several smaller firms, including a US startup called Starcloud, have also announced plans to launch satellites equipped with the GPU chips (graphics processing units) that are used in most AI systems. The logic of data centres in space is that they avoid many of the issues with their Earth-based equivalents, particularly around power and cooling. Space systems have a much lower environmental footprint and it's potentially easier to make them bigger. As Google CEO Sundar Pichai has said: "We will send tiny, tiny racks of machines and have them in satellites, test them out, and then start scaling from there ... There is no doubt to me that, a decade or so away, we will be viewing it as a more normal way to build data centres." Assuming Google does manage to launch a prototype in 2027, will it simply be a high-stakes technical experiment - or the dawning of a new era? 
The scale of the challenge

I wrote an article for The Conversation at the start of 2025 laying out the challenges of putting data centres into space, in which I was cautious about them happening soon. Now, of course, Project Suncatcher represents a concrete programme rather than just an idea. This clarity, with a defined goal, launch date and hardware, marks a significant shift. The satellites' orbits will be "sun synchronous", meaning they'll always be flying over places at sunset or sunrise so that they can capture sunlight nearly continuously. According to Google, solar arrays in such orbits can generate significantly more energy per panel than typical installations on Earth because they avoid losing sunlight to clouds, the atmosphere and night time. The TPU tests will be fascinating. Whereas hardware designed for space normally needs to be heavily shielded against radiation and extreme temperatures, Google is using the same chips used in its Earth data centres. The company has already done laboratory tests exposing the chips to radiation from a proton beam, which suggest they can tolerate almost three times the dose they'll receive in space. This is very promising, but maintaining reliable performance for years, amid solar storms, debris and temperature swings, is a far harder test. Another challenge lies in thermal management. On Earth, servers are cooled with air or water. In space, there is no air and no straightforward way to dissipate heat. All heat must be removed through radiators, which often become among the largest and heaviest parts of a spacecraft. Nasa studies show that radiators can account for more than 40% of total power system mass at high power levels. Designing a compact system that can keep dense AI hardware within safe temperatures is one of the most difficult aspects of the Suncatcher concept. A space-based data centre must also replicate the high-bandwidth, low-latency network fabric of terrestrial data centres.
If Google's proposed laser communication system (optical networking) is going to work at the multi-terabit capacity required, there are major engineering hurdles involved. These include maintaining the necessary alignment between fast-moving satellites and coping with orbital drift, where satellites move out of their intended orbit. The satellites will also have to sustain reliable ground links back on Earth and overcome weather disruptions. If a space data centre is to be viable for the long term, it will be vital that it avoids early failures. Maintenance is another unresolved issue. Terrestrial data centres rely on continual hardware servicing and upgrades. In orbit, repairs would require robotic servicing or additional missions, both of which are costly and complex. Then there is the uncertainty around economics. Space-based computing becomes viable only at scale, and only if launch costs fall significantly. Google's Project Suncatcher paper suggests that launch costs could drop below US$200 (£151) per kilogram by the mid-2030s, seven or eight times cheaper than today. That would put construction costs on par with some equivalent facilities on Earth. But if satellites require early replacement or if radiation shortens their lifespan, the numbers could look quite different. In short, a two-satellite test mission by 2027 sounds plausible. It could validate whether TPUs survive radiation and thermal stress, whether solar power is stable and whether the laser communication system performs as expected. However, even a successful demonstration would only be the first step. It would not show that large-scale orbital data centres are feasible. Full-scale systems would require solving all the challenges outlined above. If adoption occurs at all, it is likely to unfold over decades.
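The launch-cost claim above reduces to simple arithmetic: if US$200 per kilogram is seven to eight times cheaper than today, current costs sit somewhere around US$1,400 to US$1,600 per kilogram. A minimal sketch (the one-tonne satellite mass is an assumption for illustration, not a Suncatcher figure):

```python
# Arithmetic behind the launch-cost claim: a target of ~$200/kg by the
# mid-2030s, described as seven to eight times cheaper than today.
target_per_kg = 200.0
implied_today_low = target_per_kg * 7    # implied current cost, low end
implied_today_high = target_per_kg * 8   # implied current cost, high end
print(f"Implied current cost: ${implied_today_low:.0f}-${implied_today_high:.0f} per kg")

# Assumed illustration: launching a 1,000 kg satellite at the target price.
sat_mass_kg = 1_000
print(f"1,000 kg satellite at target price: ${target_per_kg * sat_mass_kg:,.0f}")
```

The economics hinge entirely on that sevenfold-to-eightfold drop materializing; at today's implied prices the same satellite costs well over a million dollars to launch.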
For now, space-based computing remains what Google itself calls it, a moonshot: ambitious and technically demanding, but one that could reshape the future of AI infrastructure, not to mention our relationship with the cosmos around us.
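The radiator problem described in the passage above can be illustrated with the Stefan-Boltzmann law, P = εσAT⁴: a spacecraft can only shed heat by radiation, and the achievable flux at moderate temperatures forces large panel areas. The numbers below (100 kW of waste heat, a 300 K panel, emissivity 0.9, radiation from both faces) are illustrative assumptions, and the sketch ignores absorbed sunlight and Earth albedo:

```python
# Simplified radiator sizing via the Stefan-Boltzmann law: P = eps * sigma * A * T^4.
# All input values are assumed for illustration; sunlight/albedo loads ignored.
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W / (m^2 K^4)
waste_heat_w = 100_000.0  # 100 kW of chip waste heat (assumed)
temp_k = 300.0            # radiator surface temperature (assumed)
emissivity = 0.9
faces = 2                 # a flat panel radiates from both sides

flux_per_m2 = emissivity * SIGMA * temp_k ** 4      # W per m^2 per face
area_m2 = waste_heat_w / (faces * flux_per_m2)
print(f"Radiating flux: {flux_per_m2:.0f} W/m^2 per face")
print(f"Panel area needed for 100 kW: ~{area_m2:.0f} m^2")
```

Roughly 400 W/m² per face at room temperature means on the order of a hundred square meters of panel per 100 kW, which is why radiators dominate spacecraft power-system mass and why Starcloud is pursuing "by far the largest radiators deployed in space."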
[4]
Sundar Pichai says Google will deploy solar-powered data centers in space by 2027
Serving tech enthusiasts for over 25 years. TechSpot means tech analysis and advice you can trust. Looking ahead: Google may not be taking as many audacious moonshots as it did in its early years, but it remains at the forefront of global technological transformation, shaping how people work, communicate, and live. The company is now planning to revolutionize cloud computing services by deploying solar-powered data centers in space by 2027. In an interview with Fox News over the weekend, Google CEO Sundar Pichai discussed the recently announced Project Suncatcher, which aims to find more efficient ways to power energy-hungry data centers by harnessing solar energy and potentially making them more sustainable than traditional facilities. The space-based data centers are expected to go online in a limited capacity in 2027, with Google planning to send "tiny racks of machines" into orbit on two prototype satellites through a tie-up with Planet. Pichai added that he expects extraterrestrial data centers to become fairly common within ten years, with companies building giant gigawatt-scale facilities in space to power the AI boom. Google announced Project Suncatcher last month, describing it as the best solution to the enormous power requirements of AI data centers. According to the company, the initiative is a "research moonshot" that will leverage solar energy to run satellite swarms powered by Tensor Processing Units (TPUs) that communicate with one another over laser links instead of fiber. A 2024 Lawrence Berkeley National Laboratory report found that US data centers already use more than 4% of the country's electricity - putting a massive strain on power grids, driving up prices, and fueling protests in rural communities. Alarmingly, that figure is expected to rise to 12% by 2028 amid the continuing AI boom. 
Google believes that scaling machine-learning compute in space will not only help meet AI data centers' insatiable demand for power, but also do so sustainably by substituting thermal power with solar. The company is highly optimistic about the project and has published a preprint paper outlining plans to launch an interconnected network of solar-powered satellites running its TPU-based AI chips. Google is not the only company bullish on data centers in space. Amazon founder Jeff Bezos recently claimed that space-based solar data centers will become common within the next 20 years and that they will also be more cost-effective than traditional data centers on Earth.
[5]
Google, Amazon, and xAI want to launch AI into space
Having AI overhead could enhance connectivity for everything from remote internet access to disaster response. In the span of just a few months, the push to put artificial intelligence in space has progressed from a long-term dream to an immediate, very real strategic priority. Google's Project Suncatcher, Amazon's Leo satellite internet constellation, and Elon Musk's xAI exploration into space-based compute environments all point toward the same thing: the next great leap for AI might not happen on land, but in low Earth orbit. Outrageous as it may seem, there's a lot of real engineering underneath the glossy press releases and visionary quotes. The efforts are spurred on by the very real infrastructure crunch faced by AI developers as models expand and demand skyrockets. It's intense enough that the data centers, fiber networks, and power grids supporting the world's digital spine are starting to show strain. New energy sources struggle to keep up. And that's before factoring in additional motivations like latency, climate risk, and political barriers. Google's play, Project Suncatcher, aims to build orbital compute nodes powered by near-constant solar exposure and cooled by radiating heat into the vacuum of space. The idea is that these sun-drenched satellites full of Google's Tensor Processing Units could eventually run machine learning models more efficiently than ground-based data centers, especially for tasks that don't require real-time human interaction. Solar panels work better in orbit. Cooling is easier. And there's no storm or blackout to knock them offline. With Amazon Leo, the company is building out a global broadband network of thousands of low Earth orbit satellites that will eventually link to cloud and AI infrastructure. Some of those satellites may one day support edge computing for AI tasks in places with limited or no access to the cloud.
Meanwhile, Elon Musk is sketching out concepts for orbital compute farms for xAI and SpaceX to tackle. They wouldn't just run models, but train them. That's a much harder technical challenge, but one that might make sense for resource-intensive tasks that benefit from uninterrupted energy and physical isolation. If you're trying to train a multi-trillion parameter model without bumping into terrestrial bandwidth caps or infrastructure bottlenecks, space starts looking pretty good. These projects could make a huge difference for a lot of people. Rural school systems could access fast cloud tools, and weather monitoring systems could extrapolate using real-time orbital AI to predict flash floods and reroute aid. And with solar-powered nodes running in space, companies might rely less on carbon-heavy terrestrial grids. Space-based energy providers have been discussed since before there was a space program. It might be that AI demand is the tipping point to invest in such a project. Of course, space is far from forgiving or cheap to operate from. Launching hardware is expensive, and radiation-shielding is hard. Coordination of thousands of satellites can cause orbital traffic jams. There's also the question of who owns the infrastructure, who gets to use it, and whether it becomes yet another layer of centralized control in the tech ecosystem. Governments, naturally, are watching closely. From a user perspective, though, the shift may be mostly invisible at first. You won't log into a 'space version' of your favorite app, but you might notice things loading faster, and you might start seeing services in previously unconnected parts of the world. Orbital AI won't replace Earth-based systems any time soon, but it might become a floating scaffold of intelligence designed to supplement and stabilize the digital terrain, even if it's hundreds of miles above any actual terrain.
[6]
Google's plan to put data centers in the sky faces thousands of (little) problems: space junk | Fortune
[7]
Google CEO Sundar Pichai says data centers in space are coming
"We are taking our first step in '27," Pichai told Fox News. "We'll send tiny, tiny racks of machines, and have them in satellites, test them out, and then start scaling from there." The space pitch arrives when Earth is starting to look like a bad long-term landlord for the AI build-out. A 2024 Lawrence Berkeley National Laboratory report found that U.S. data centers already chew through about 4.4% of the country's electricity, and that share could climb to as much as 12% by 2028 as GPU farms multiply. McKinsey puts a price tag on the race to scale data centers: roughly $6.7 trillion in global data center capex by 2030, about $5 trillion of that aimed at AI-ready infrastructure. At some point, "just build another region" stops being a strategy and starts being an electrical engineering problem.
[8]
Google's Project Suncatcher could make the space debris problem a lot worse
Over the past few years, tech leaders have increasingly advocated for space-based AI infrastructure as a way to address the power requirements of data centers. In space, sunshine, which solar panels can convert into electricity, is abundant and reliable. On November 4, 2025, Google unveiled Project Suncatcher, a bold proposal to launch an 81-satellite constellation into low Earth orbit and use it to harvest sunlight to power the next generation of AI data centers in space. Instead of beaming power back to Earth, the constellation would beam data back. For example, if you asked a chatbot how to bake sourdough bread, instead of firing up a data center in Virginia to craft a response, your query would be beamed up to the constellation, processed by chips running purely on solar energy, and the recipe sent back down to your device, with the substantial heat generated left behind in the cold vacuum of space.
Google CEO Sundar Pichai announced that data centers in space could become routine within a decade as the company prepares to launch Project Suncatcher prototype satellites in 2027. The initiative aims to harness solar energy in orbit to power artificial intelligence workloads, addressing the massive energy consumption of AI infrastructure. Amazon's Jeff Bezos and Elon Musk are also exploring similar concepts.
Google has revealed plans to deploy solar-powered data centers in space, with CEO Sundar Pichai predicting that orbital computing facilities will become a standard approach within roughly a decade. The company's Project Suncatcher initiative represents a bold response to artificial intelligence power requirements that are straining terrestrial infrastructure [1]. Pichai told Fox News that Google wants to position these facilities "closer to the Sun" to harness abundant solar energy in Earth's orbit, with prototype satellites scheduled for launch in early 2027 [4].
The scale of the energy challenge is staggering. A single medium-sized data center on Earth consumes enough electricity to power approximately 16,500 homes, with larger facilities using as much as a small city [2]. According to a 2024 Lawrence Berkeley National Laboratory report, US data centers already consume more than 4% of the country's electricity, a figure expected to rise to 12% by 2028 amid the continuing AI boom [4].

Google plans to launch an 81-satellite constellation into low Earth orbit, approximately 400 miles above Earth, where the satellites will fly in sun-synchronous orbits [2]. This positioning keeps the solar arrays in near-continuous direct sunlight, avoiding the losses from clouds, atmosphere, and nighttime that affect terrestrial installations [3]. The constellation will use Tensor Processing Units, Google's custom machine-learning chips that already power the Gemini 3 AI model, connected via laser communication links rather than traditional fiber [4].
The proposed formation is remarkably dense, with 81 satellites flying within a one-kilometer radius and each node spaced less than 200 meters apart. This tight clustering lets the satellites split complex AI workloads across all units, processing data simultaneously as a single distributed system [2]. Google is partnering with space company Planet to launch two prototype satellites by early 2027 to validate the hardware before scaling up [4].
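Google has not published how Suncatcher would schedule work across the constellation, so the following is a purely illustrative sketch of splitting a batch of requests across nodes. The node count is the only number taken from the article; the round-robin scheme and batch size are assumptions:

```python
# Illustrative only: round-robin sharding of a request batch across 81 nodes.
# The scheduling scheme is an assumption, not Google's published design.
NUM_NODES = 81  # satellites in the proposed constellation

def shard(requests, num_nodes=NUM_NODES):
    """Assign each request to a node index round-robin."""
    shards = [[] for _ in range(num_nodes)]
    for i, req in enumerate(requests):
        shards[i % num_nodes].append(req)
    return shards

batch = [f"query-{i}" for i in range(1000)]
shards = shard(batch)
# With 1,000 requests over 81 nodes, each shard holds 12 or 13 requests.
sizes = {len(s) for s in shards}
print(sizes)
```

In a real system the hard part is not the partitioning but keeping the shards in sync over inter-satellite laser links, which is why the sub-200-meter spacing matters.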
Google isn't alone in pursuing space-based infrastructure. Amazon founder Jeff Bezos recently claimed that space-based solar data centers will become common within 20 years and prove more cost-effective than traditional facilities [4]. Amazon's Leo project aims to build a global satellite constellation that will eventually support edge computing for artificial intelligence tasks in areas with limited cloud access [5].

Elon Musk has indicated that SpaceX "will be doing data centres in space," suggesting next-generation Starlink satellites could be scaled up to host processing capabilities [3]. His xAI venture is exploring orbital compute farms not just for running models but for training them, a significantly harder technical challenge that could benefit from uninterrupted energy and physical isolation [5]. Startup Starcloud has already sent a test satellite carrying an Nvidia H100 GPU into space, and CEO Philip Johnston has confirmed the satellite is operational and undergoing commissioning [1].
Major engineering obstacles remain before orbital data centers become practical. Cooling in a vacuum presents a critical challenge, since there is no air to carry heat away from the processors. AI chips will require built-in cooling systems using stored air or liquid, along with massive radiators to dissipate heat toward deep space [1]. Starcloud is developing what it describes as "by far the largest radiators deployed in space" to address this issue [1]. NASA studies indicate radiators can account for more than 40% of total power system mass at high power levels [3].
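The radiator-mass figure makes more sense with the underlying physics in view: in vacuum, a panel can only reject heat by radiation, governed by the Stefan-Boltzmann law P = εσAT⁴. The sketch below sizes an idealized one-sided radiator for an assumed 100 kW heat load; the emissivity, operating temperature, and the neglect of absorbed sunlight are all simplifying assumptions, not Suncatcher specifications:

```python
# Idealized radiator sizing via the Stefan-Boltzmann law: P = eps * sigma * A * T^4.
# Assumptions: one-sided panel, no absorbed sunlight or Earth albedo, steady state.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiator_area(heat_watts, temp_k=300.0, emissivity=0.9):
    """Panel area (m^2) needed to radiate heat_watts at temp_k."""
    flux = emissivity * SIGMA * temp_k**4  # W/m^2 rejected per unit area
    return heat_watts / flux

area = radiator_area(100_000)  # assumed 100 kW of waste heat
print(f"~{area:.0f} m^2 of radiator at 300 K")
```

Because rejected flux scales with T⁴, running the radiator hotter shrinks its area quartically, which is why radiator operating temperature dominates these designs and why the mass fraction climbs so steeply at high power.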
Space radiation poses another threat to electronics. Google has conducted laboratory tests exposing its chips to proton-beam radiation, with results suggesting they can tolerate almost three times the dose they will receive in space [3]. However, maintaining reliable performance for years amid solar storms and temperature extremes represents a far more demanding test than controlled laboratory conditions.
The target orbital shell for Project Suncatcher faces significant space debris risks. Sun-synchronous orbit is the most congested region in low Earth orbit, making satellites there the most likely to collide with other objects [2]. The U.S. Space Force actively tracks over 40,000 objects larger than a softball, but this represents less than 1% of potentially lethal objects in orbit, as most are too small for ground-based radar and optical telescopes to reliably identify [2].
Space debris travels at approximately 17,500 miles per hour in low Earth orbit, meaning a collision with debris the size of a blueberry would strike with the force of a falling anvil [2]. Recent incidents highlight the real danger: in November 2025, three Chinese astronauts aboard the Tiangong space station delayed their return after their capsule was struck by debris [2]. The rapid expansion of commercial constellations like SpaceX's Starlink network, which now numbers more than 7,500 satellites, has exacerbated the crisis [2].
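The blueberry-versus-anvil comparison holds up under basic kinetic-energy arithmetic. The debris mass, anvil mass, and drop height below are assumptions chosen to illustrate the point:

```python
# Kinetic energy of small orbital debris vs. a falling anvil.
# Assumptions: 1 g "blueberry" of debris, 30 kg anvil dropped from 100 m.
def kinetic_energy(mass_kg, speed_m_s):
    return 0.5 * mass_kg * speed_m_s**2

debris_speed = 17_500 * 0.44704      # 17,500 mph in m/s (~7,800 m/s)
debris_ke = kinetic_energy(0.001, debris_speed)

g, drop_height = 9.81, 100.0
anvil_ke = 30.0 * g * drop_height    # potential energy released on impact

print(f"debris: ~{debris_ke/1000:.0f} kJ, anvil: ~{anvil_ke/1000:.0f} kJ")
```

Both work out to roughly 30 kJ, so a gram of debris at orbital velocity really does carry about the energy of an anvil dropped from a 30-story building.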
Experts warn that low Earth orbit could approach Kessler syndrome, in which collisions between objects generate cascading debris that eventually renders certain orbits unusable [2]. Orbital traffic management will become increasingly critical as more companies deploy satellite constellations.

The business case for orbital computing depends heavily on launch costs declining substantially. Google's Project Suncatcher paper suggests launch costs could drop below $200 per kilogram by the mid-2030s, roughly seven to eight times cheaper than current rates, which would make construction costs comparable to those of terrestrial facilities [3]. Pichai expects companies to eventually build "giant gigawatt-scale facilities in space to power the AI boom" [4].
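The launch-cost claim can be inverted to back out today's implied rate. The payload mass below is a hypothetical node mass for illustration only, not a Suncatcher specification:

```python
# Back out implied current launch pricing from the projected $200/kg target.
target_cost_per_kg = 200      # mid-2030s projection cited from the paper
improvement_factor = 7.5      # midpoint of "seven to eight times cheaper"
current_cost_per_kg = target_cost_per_kg * improvement_factor  # ~$1,500/kg

node_mass_kg = 1_000          # hypothetical satellite node mass (assumption)
print(f"today: ${current_cost_per_kg * node_mass_kg:,.0f} per node")
print(f"2030s: ${target_cost_per_kg * node_mass_kg:,.0f} per node")
```

At ~$1,500/kg today, a tonne-class node costs about $1.5 million just to launch; the projection shrinks that to roughly $200,000, which is where the comparison to terrestrial construction starts to work.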
The environmental impact could be significant. Google believes scaling machine-learning compute in space will help meet AI data centers' power demands sustainably by substituting solar energy for thermal power [4]. Space-based systems have a potentially lower environmental footprint than terrestrial equivalents and may be easier to scale [3]. With solar-powered nodes running in orbit, companies might reduce reliance on carbon-heavy terrestrial grids [5].

For users, the shift may be largely invisible at first. Rather than logging into a "space version" of applications, people might simply notice faster loading times and services reaching previously unconnected parts of the world [5]. Rural school systems could access cloud tools, and weather monitoring systems could use real-time orbital AI to predict flash floods and coordinate disaster response [5]. However, questions remain about ownership, access rights, and whether orbital infrastructure becomes another layer of centralized control, with governments watching developments closely [5].
Summarized by Navi