7 Sources
[1]
Meet Project Suncatcher, Google's plan to put AI data centers in space
The tech industry is on a tear, building data centers for AI as quickly as it can buy up the land. The sky-high energy costs and logistical headaches of managing all those data centers have prompted interest in space-based infrastructure. Moguls like Jeff Bezos and Elon Musk have mused about putting GPUs in space, and now Google confirms it's working on its own version of the technology. The company's latest "moonshot" is known as Project Suncatcher, and if all goes as planned, Google hopes it will lead to scalable networks of orbiting TPUs.

The space around Earth has changed a lot in the last few years. A new generation of satellite constellations like Starlink has shown it's feasible to relay Internet communication via orbital systems. Deploying high-performance AI accelerators in space along similar lines would be a boon to the industry's never-ending build-out. Google notes that space may be "the best place to scale AI compute."

Google's vision for scalable orbiting data centers relies on solar-powered satellites with free-space optical links connecting the nodes into a distributed network. Naturally, there are numerous engineering challenges to solve before Project Suncatcher is real. As a reference, Google points to the long road from its first moonshot, self-driving cars, 15 years ago to the Waymo vehicles that are almost fully autonomous today.

Taking AI to space

Some of the benefits are obvious. Google's vision for Suncatcher, as explained in a preprint study (PDF), would place the satellites in a dawn-dusk sun-synchronous low-Earth orbit. That ensures they would get almost constant sunlight exposure (hence the name). The cost of electricity on Earth is a problem for large data centers, and even moving them all to solar power wouldn't get the job done. Google notes solar panels are up to eight times more efficient in orbit than they are on the surface of Earth. Lots of uninterrupted sunlight at higher efficiency means more power for data processing.
A major sticking point is how to keep satellites connected at high speeds as they orbit. On Earth, the nodes in a data center communicate via blazing-fast optical interconnects. Maintaining high-speed communication among the orbiting servers will require wireless links that can operate at tens of terabits per second. Early testing on Earth has demonstrated bidirectional speeds of up to 1.6 Tbps, and Google believes this can be scaled up over time. However, there is the problem of physics. Received power decreases with the square of distance, so Google notes the satellites would have to stay within a kilometer or less of each other. That would require a tighter formation than any currently operational constellation, but it should be workable. Google has developed analytical models suggesting that satellites positioned several hundred meters apart would require only "modest station-keeping maneuvers."

Hardware designed for space is expensive and often less capable than terrestrial systems because it must be hardened against extreme temperatures and radiation. Google's approach with Project Suncatcher is to reuse the components it uses on Earth, which might not be very robust once stuffed into a satellite. However, innovations like the Snapdragon-powered Mars Ingenuity helicopter have shown that off-the-shelf hardware may survive longer in space than we thought. Google says Suncatcher only works if TPUs can run for at least five years, which works out to an expected radiation dose of 750 rad. The company is testing this by blasting its latest v6e Cloud TPU (Trillium) with a 67 MeV proton beam. While the memory was the most vulnerable component, the experiments showed that the TPUs could handle about three times the expected dose (almost 2 krad) before any data corruption was detected.

Google hopes to launch a pair of prototype satellites with TPUs by early 2027. It expects the launch cost of these first AI orbiters to be quite high.
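Two of the figures above lend themselves to a quick back-of-envelope check: the inverse-square falloff that forces sub-kilometer spacing, and the radiation margin implied by the proton-beam tests. The sketch below uses only numbers from the article; the helper function is illustrative and not anything from Google's paper.

```python
# Back-of-envelope checks on two claims in the article: received optical
# power falls with the square of distance, and the Trillium TPUs tolerated
# roughly three times their expected five-year radiation dose.

def relative_received_power(d_m: float, ref_m: float = 100.0) -> float:
    """Free-space received power scales as 1/d^2, relative to a reference
    distance; doubling the distance cuts received power to a quarter."""
    return (ref_m / d_m) ** 2

# Received power at various spacings, relative to a 100 m baseline.
# Pushing from hundreds of meters out to tens of kilometers costs orders
# of magnitude of signal, which is why tight formations matter.
for d in (100.0, 1_000.0, 10_000.0):
    print(f"{d:>8.0f} m -> {relative_received_power(d):.4f}x")

# Radiation margin: memory irregularities appeared near 2 krad, while the
# expected shielded five-year mission dose is 750 rad.
irregularity_dose_rad = 2_000
mission_dose_rad = 750
margin = irregularity_dose_rad / mission_dose_rad
print(f"margin: {margin:.2f}x")  # ~2.67x, the "about three times" figure
```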
However, Google is planning for the mid-2030s when launch costs are projected to drop to as little as $200 per kilogram. At that level, space-based data centers could become as economical as the terrestrial versions. The fact is, terrestrial data centers are dirty, noisy, and ravenous for power and water. This has led many communities to oppose plans to build them near the places where people live and work. Putting them in space could solve everyone's problems (unless you're an astronomer).
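The parity argument can be illustrated with rough arithmetic. In the sketch below, only the $200/kg launch price and the five-year design life come from the article; the satellite mass-per-kilowatt and the electricity rate are invented placeholders meant to show the shape of the comparison, not Google's actual figures.

```python
# Illustrative cost-parity arithmetic: amortized launch cost per kW-year of
# orbital power vs. the annual energy bill for 1 kW running on Earth.
# SAT_MASS_PER_KW and ELECTRICITY_PER_KWH are assumed values for the sketch.

LAUNCH_PRICE_PER_KG = 200.0   # projected mid-2030s price (from the article)
SAT_MASS_PER_KW = 15.0        # kg of satellite per kW delivered (assumption)
LIFETIME_YEARS = 5.0          # design life used in the article's radiation target
ELECTRICITY_PER_KWH = 0.08    # assumed terrestrial industrial rate, $/kWh

# Amortized launch cost per kilowatt-year of orbital capacity.
launch_cost_per_kw_year = LAUNCH_PRICE_PER_KG * SAT_MASS_PER_KW / LIFETIME_YEARS

# Energy bill for one kilowatt running continuously on Earth for a year.
terrestrial_cost_per_kw_year = ELECTRICITY_PER_KWH * 24 * 365

print(f"orbital launch, amortized: ${launch_cost_per_kw_year:,.0f}/kW/year")
print(f"terrestrial energy:        ${terrestrial_cost_per_kw_year:,.0f}/kW/year")
```

With these placeholder inputs the two figures land in the same ballpark, which is the substance of Google's "roughly comparable on a per-kilowatt/year basis" claim; different mass or electricity assumptions shift the balance either way.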
[2]
Google Eyes Space-Based Data Centers With 'Project Suncatcher'
Will the next space race feature tech giants rushing to deploy data centers in the sky? Google today announced an effort to launch satellites equipped with the company's AI chips -- something SpaceX CEO Elon Musk and Amazon founder Jeff Bezos are also pursuing.

Google describes Project Suncatcher as a "research moonshot to one day scale machine learning in space." Like others, Google sees potential in harnessing the Sun to power a new class of data centers that orbit the Earth rather than drawing on energy resources here on the ground.

Google will initially launch two prototype satellites in early 2027. Each one will be outfitted with the company's custom AI chips, called Tensor Processing Units, which are available through Google's ground-based data centers. "Our TPUs are headed to space!" CEO Sundar Pichai tweeted.

Google is enlisting satellite imaging company Planet Labs to help build the Suncatcher hardware. In the meantime, Pichai noted that his company has already simulated how the TPUs might perform in space. "Early research shows our Trillium-generation TPUs (our Tensor processing units, purpose-built for AI) survived without damage when tested in a particle accelerator to simulate low-Earth orbit levels of radiation," he wrote.

The company published more details in a preprint paper and research post, which reveal that Google envisions potentially using "fleets of satellites equipped with solar arrays" to tap the Sun's energy in "near-constant sunlight." To rival the computing capabilities of Earth-based data centers, the satellites could transmit data to each other with the help of space-based lasers, which SpaceX's Starlink already supports. Google will need to overcome some major challenges, though, including "thermal management, high-bandwidth ground communications, and on-orbit system reliability," along with the costs.
The paper largely sidesteps a major concern about orbiting data centers: the vacuum of space complicates cooling because there is no air to carry away heat. Google offers only a brief statement: "Cooling would be achieved through a thermal system of heat pipes and radiators while operating at nominal temperatures." Even so, the company's research paper concludes that "in the long run [space-based data centers] may be the most scalable solution, with the additional benefit of minimizing the impact on terrestrial resources such as land and water."

Google won't be alone in developing the technology. Musk tweeted last week about pursuing the same ambition, saying his company already has a foundation in Starlink, the satellite internet constellation. In addition, a startup called Starcloud successfully launched its first test satellite, outfitted with an Nvidia AI GPU, this past weekend, with the goal of developing its own network of orbiting data centers.

Meanwhile, Google's research paper notes it would need to rely on SpaceX, its future reusable rockets, and lower launch costs to deploy its own orbiting data centers. "Our analysis of historical and projected launch pricing data suggests that with a sustained learning rate, prices may fall to less than $200/kg by the mid-2030s," the research post added. "At that price point, the cost of launching and operating a space-based data center could become roughly comparable to the reported energy costs of an equivalent terrestrial data center on a per-kilowatt/year basis."
[3]
Google's next moonshot is putting TPUs in space with 'Project Suncatcher'
Google is starting a new research moonshot called Project Suncatcher to "one day scale machine learning in space." This would involve Google Tensor Processing Unit (TPU) AI chips being placed onboard an interconnected network of satellites to "harness the full power of the Sun." Specifically, a "solar panel can be up to 8 times more productive than on earth" for near-continuous power using a "dawn-dusk sun-synchronous low earth orbit" that reduces the need for batteries and other power generation.

    In the future, space may be the best place to scale AI compute.

These satellites would connect via free-space optical links, with large-scale ML workloads "distributing tasks across numerous accelerators with high-bandwidth, low-latency connections." To match data centers on Earth, the connection between satellites would have to be tens of terabits per second, and they'd have to fly in "very close formation (kilometers or less)."

    ...with satellites positioned just hundreds of meters apart, we will likely only require modest station-keeping maneuvers to maintain stable constellations within our desired sun-synchronous orbit.

Google has already conducted radiation testing on TPUs (Trillium, v6e), with "promising" results:

    While the High Bandwidth Memory (HBM) subsystems were the most sensitive component, they only began showing irregularities after a cumulative dose of 2 krad(Si) -- nearly three times the expected (shielded) five year mission dose of 750 rad(Si). No hard failures were attributable to TID up to the maximum tested dose of 15 krad(Si) on a single chip, indicating that Trillium TPUs are surprisingly radiation-hard for space applications.

Finally, Google believes that launch costs will "fall to less than $200/kg by the mid-2030s." At that point, the "cost of launching and operating a space-based data center could become roughly comparable to the reported energy costs of an equivalent terrestrial data center on a per-kilowatt/year basis."

    Our initial analysis shows that the core concepts of space-based ML compute are not precluded by fundamental physics or insurmountable economic barriers.

Google still has to work through engineering challenges like thermal management, high-bandwidth ground communications, and on-orbit system reliability. It's partnering with Planet to launch two prototype satellites by early 2027 that test how "models and TPU hardware operate in space and validate the use of optical inter-satellite links for distributed ML tasks." More details are available in "Towards a future space-based, highly scalable AI infrastructure system design."
[4]
Google plans to put datacentres in space to meet demand for AI
US technology company's engineers want to exploit solar power and the falling cost of rocket launches

Google is hatching plans to put artificial intelligence datacentres into space, with its first trial equipment sent into orbit in early 2027. Its scientists and engineers believe tightly packed constellations of about 80 solar-powered satellites could be arranged in orbit about 400 miles above the Earth's surface, equipped with the powerful processors required to meet rising demand for AI.

Prices of space launches are falling so quickly that by the middle of the 2030s the running costs of a space-based datacentre could be comparable to one on Earth, according to Google research released on Tuesday. Using satellites could also minimise the impact on the land and water resources needed to cool existing datacentres. Once in orbit, the datacentres would be powered by solar panels that can be up to eight times more productive than those on Earth.

However, launching a single rocket into space emits hundreds of tonnes of CO2. Objections could be raised by astronomers concerned that rising numbers of satellites in low orbit are "like bugs on a windshield" when they are trying to peer into the universe.

The orbiting datacentres envisaged under Project Suncatcher would beam their results back through optical links, which typically use light or laser beams to transmit information. Major technology companies pursuing rapid advances in AI are projected to spend $3tn (£2.3tn) on earthbound datacentres from India to Texas and from Lincolnshire to Brazil. The spending has fuelled rising concern about the impact on carbon emissions if clean energy is not found to power the sites.

"In the future, space may be the best place to scale AI compute," Google said. "Working backward from there, our new research moonshot, Project Suncatcher, envisions compact constellations of solar-powered satellites, carrying Google TPUs and connected by free-space optical links. This approach would have tremendous potential for scale, and also minimises impact on terrestrial resources."

TPUs are processors optimised for training and the day-to-day use of AI models. Free-space optical links deliver wireless transmission.

Elon Musk, who runs the Starlink satellite internet provider and the SpaceX rocket programme, said last week that his companies would start scaling up to create datacentres in space. Nvidia AI chips will also be launched into space later this month in partnership with the startup Starcloud. "In space, you get almost unlimited, low-cost renewable energy," said Philip Johnston, co-founder of the startup. "The only cost on the environment will be on the launch, then there will be 10 times carbon dioxide savings over the life of the datacentre compared with powering the datacentre terrestrially."

Google is planning to launch two prototype satellites by early 2027 and said its research results were a "first milestone towards a scalable space-based AI". But it sounded a cautionary note: "Significant engineering challenges remain, such as thermal management, high-bandwidth ground communications and on-orbit system reliability."
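The "about 400 miles up" figure can be sanity-checked with Kepler's third law, which gives the orbital period of a circular orbit at that altitude. The gravitational constants below are standard textbook values, not from the article.

```python
import math

# Orbital period of a circular orbit ~400 miles (about 644 km) above Earth,
# via Kepler's third law: T = 2*pi*sqrt(a^3 / mu).

MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000.0       # mean Earth radius, m
altitude = 644_000.0        # ~400 miles, in meters

a = R_EARTH + altitude      # semi-major axis of the circular orbit
period_s = 2 * math.pi * math.sqrt(a**3 / MU_EARTH)
print(f"orbital period: {period_s / 60:.1f} minutes")  # roughly 97-98 minutes
```

A satellite at this altitude circles the Earth about 15 times a day, which is why a dawn-dusk sun-synchronous orbit, riding the day-night terminator, is what keeps the panels in near-constant sunlight rather than the altitude itself.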
[5]
Meet Project Suncatcher, a research moonshot to scale machine learning compute in space.
Artificial intelligence is a foundational technology that could help us tackle humanity's greatest challenges. Now, we're asking where we can go next to unlock its fullest potential.

Today we're announcing Project Suncatcher, our new research moonshot to one day scale machine learning in space. Working backward from this potential future, we're exploring how an interconnected network of solar-powered satellites, equipped with our Tensor Processing Unit (TPU) AI chips, could harness the full power of the Sun. Inspired by other Google moonshots like autonomous vehicles and quantum computing, we've begun the foundational work needed to one day make this future possible.

We're excited that this is a growing area of exploration. Our initial research, shared today in a preprint paper, describes our approach to satellite constellation design, control, and communication, along with our initial learnings from radiation testing Google TPUs. Our next step is a learning mission in partnership with Planet to launch two prototype satellites by early 2027 that will test our hardware in orbit, laying the groundwork for a future era of massively-scaled computation in space.
[6]
Google Is Sending Its TPUs to Space to Build Solar-Powered Data Centres | AIM
Google has announced a new research initiative, Project Suncatcher, which aims to explore the feasibility of scaling artificial intelligence (AI) compute in space using solar-powered satellite constellations equipped with Tensor Processing Units (TPUs).

"Inspired by our history of moonshots, from quantum computing to autonomous driving, Project Suncatcher is exploring how we could one day build scalable ML compute systems in space, harnessing more of the sun's power (which emits more power than 100 trillion times humanity's total electricity production)," said Google CEO Sundar Pichai in a post on X.

The project, led by Travis Beals, senior director of Paradigms of Intelligence at Google, proposes a system where solar-powered satellites in low Earth orbit (LEO) perform machine learning (ML) workloads while communicating through free-space optical links. "Space may be the best place to scale AI compute," Beals said in a statement. "In the right orbit, a solar panel can be up to 8 times more productive than on Earth, and produce power nearly continuously, reducing the need for batteries."

According to a preprint paper released alongside the announcement -- Towards a future space-based, highly scalable AI infrastructure system design -- the initiative focuses on building modular, interconnected satellite networks that can function like data centres in orbit. The proposed system would operate in a dawn-dusk sun-synchronous orbit, ensuring near-constant exposure to sunlight and reducing reliance on heavy batteries.

To achieve data centre-level performance, the satellites would need to support inter-satellite links capable of tens of terabits per second. Google's researchers said they have already achieved 1.6 terabits per second transmission in lab conditions using a single optical transceiver pair. Achieving such bandwidth in orbit requires satellites to fly in close formation, just a few hundred meters apart.
"With satellites positioned this closely, only modest station-keeping manoeuvres would be needed to maintain stability," the paper noted. Radiation tolerance was another major focus. Tests conducted on Google's Trillium v6e Cloud TPU chips under a 67 MeV proton beam showed that the chips could withstand nearly three times the expected five-year mission radiation dose without failure. Historically, launch costs have made large-scale space infrastructure unviable. However, Google's analysis suggests that if launch costs fall below $200 per kilogram -- as projected by the mid-2030s -- operating a space-based AI system could be cost-competitive with terrestrial data centres. The company's next milestone is a partnership with Planet Labs to launch two prototype satellites by early 2027. The mission will test TPU performance in orbit and evaluate optical inter-satellite links for distributed ML tasks. "Our initial analysis shows that the core concepts of space-based ML compute are not precluded by fundamental physics or insurmountable economic barriers," Beals said. "Significant engineering challenges remain, such as thermal management, high-bandwidth ground communications, and on-orbit system reliability."
[7]
Google unveils 2027 plan: Launch solar-powered AI data centres into space - here's how it'll work
Google AI data centres in space: Google is planning to launch AI datacenters into space with solar-powered satellites by early 2027, aiming to address the growing demand for AI computing power. This initiative, Project Suncatcher, could offer cost-effective and environmentally beneficial solutions compared to terrestrial facilities, though it raises concerns about space debris and astronomical interference.

Google is exploring a bold new frontier for artificial intelligence, planning to send AI datacentres into space, with its first trial satellites expected to launch in early 2027. The initiative, dubbed Project Suncatcher, aims to place tightly packed constellations of around 80 solar-powered satellites about 400 miles above Earth, each carrying powerful processors designed to handle the growing demand for AI, as per a report.

According to Google research released on Tuesday, falling launch costs could make space-based datacentres as cost-effective as those on Earth by the mid-2030s. In addition to scaling computing power, moving datacentres into orbit could reduce the strain on land and water resources used to cool terrestrial facilities.

Once in orbit, the satellites would be powered by highly efficient solar panels, up to eight times more productive than panels on Earth, as per The Guardian. Data would be transmitted back to the ground using optical links, which rely on light or laser beams.

However, the plan raises environmental and astronomical concerns. Launching a single rocket emits hundreds of tonnes of CO2, and astronomers have warned that large numbers of satellites in low orbit could interfere with observations, comparing them to "bugs on a windshield," as per The Guardian report.
The initiative comes as major tech companies are projected to spend $3 trillion on Earth-based datacentres worldwide, from India to Texas and Lincolnshire to Brazil, sparking concerns over carbon emissions unless clean energy sources are used, as per the report.

Philip Johnston, co-founder of startup Starcloud, which is partnering with Nvidia to launch AI chips into space later this month, highlighted the potential environmental benefits, saying, "In space, you get almost unlimited, low-cost renewable energy," adding, "The only cost on the environment will be on the launch, then there will be 10 times carbon dioxide savings over the life of the datacentre compared with powering the datacentre terrestrially," as quoted by The Guardian.

Google plans to launch two prototype satellites by early 2027, calling the effort a "first milestone towards a scalable space-based AI," as per the report. However, the tech giant cautioned that "Significant engineering challenges remain, such as thermal management, high-bandwidth ground communications and on-orbit system reliability," as quoted by The Guardian.

What is Google's Project Suncatcher? Google's plan to put AI datacentres into space using solar-powered satellites.

Will space datacentres become cheaper than Earth-based ones? Yes, falling launch costs could make them comparable by the mid-2030s.
Google announces Project Suncatcher, a moonshot initiative to launch solar-powered satellites equipped with TPU AI chips into orbit by early 2027. The project aims to harness space's unlimited solar energy and overcome terrestrial data center limitations.
Google has unveiled Project Suncatcher, an ambitious research moonshot aimed at deploying artificial intelligence data centers in space using networks of solar-powered satellites equipped with the company's Tensor Processing Unit (TPU) chips [1]. The initiative represents Google's latest attempt to address the growing energy demands and logistical challenges of terrestrial AI infrastructure while potentially revolutionizing how we approach large-scale machine learning computation.

The project envisions compact constellations of approximately 80 satellites positioned in dawn-dusk sun-synchronous low-earth orbit, roughly 400 miles above Earth's surface [4]. This orbital configuration ensures near-constant sunlight exposure, hence the "Suncatcher" name, providing a significant advantage over terrestrial solar installations.

The space-based approach offers compelling energy benefits that could transform AI computing economics. Solar panels in orbit demonstrate up to eight times greater efficiency compared to Earth-based installations [3]. This dramatic improvement stems from the absence of atmospheric interference and the availability of uninterrupted sunlight, eliminating the need for extensive battery systems or alternative power generation methods.

Google's analysis suggests that by the mid-2030s, when launch costs are projected to fall to approximately $200 per kilogram, the operational expenses of space-based data centers could become comparable to terrestrial facilities on a per-kilowatt/year basis [2]. This cost parity calculation factors in the substantial energy savings and reduced infrastructure requirements of orbital installations.

The technical architecture of Project Suncatcher relies on sophisticated free-space optical links to maintain high-speed communication between satellites [1]. To match the performance of terrestrial data centers, these wireless connections must operate at tens of terabits per second, with early Earth-based testing achieving bidirectional speeds up to 1.6 Tbps.

The satellites must maintain extremely tight formations, positioned just hundreds of meters apart, requiring only "modest station-keeping maneuvers" according to Google's analytical models [3]. This proximity is essential because received power decreases with the square of distance, necessitating closer spacing than any currently operational satellite constellation.

A critical challenge involves ensuring TPU hardware can survive the harsh space environment for at least five years, equivalent to approximately 750 rad of radiation exposure [1]. Google has conducted extensive radiation testing using 67 MeV proton beams on its latest v6e Cloud TPU (Trillium) chips, with promising results. The testing revealed that while High Bandwidth Memory (HBM) subsystems proved most vulnerable to radiation damage, they only showed irregularities after cumulative doses of 2 krad(Si), nearly three times the expected five-year mission dose [3]. No hard failures occurred up to the maximum tested dose of 15 krad(Si), indicating surprising radiation hardness for space applications.

Google plans to launch two prototype satellites by early 2027 in partnership with satellite imaging company Planet Labs [2]. These initial missions will test TPU hardware performance in orbit and validate optical inter-satellite links for distributed machine learning tasks. The company acknowledges significant engineering challenges remain, including thermal management in the vacuum of space, high-bandwidth ground communications, and ensuring on-orbit system reliability [4]. Google's approach involves reusing terrestrial components rather than expensive space-hardened alternatives, following successful examples like the Snapdragon-powered Mars Ingenuity helicopter.

Project Suncatcher enters a growing field of space-based computing initiatives. SpaceX CEO Elon Musk recently announced similar ambitions, leveraging his company's Starlink satellite infrastructure [2]. Additionally, startup Starcloud successfully launched its first test satellite equipped with Nvidia AI GPUs, demonstrating industry-wide interest in orbital computing solutions.