2 Sources
[1]
AI galaxy hunters are adding to the global GPU crunch | TechCrunch
NASA announced that it will launch the Nancy Grace Roman space telescope into orbit in September 2026, eight months ahead of schedule. The new space telescope is expected to deliver 20,000 terabytes of data to astronomers over the course of its life. That will add to 57 gigabytes of breathtaking imagery downlinked daily from the James Webb Space Telescope, which began its work in 2021, and the start of a survey later this year by the Vera C. Rubin Observatory in the mountains of Chile, which is expected to gather 20 terabytes of data each night. For comparison, the Hubble Space Telescope, once the gold standard, delivers just 1 to 2 gigabytes of sensor readings each day.

It's been a while since all those readings were pored over by hand, but like everyone else with a pile of data, astronomers are now turning to GPUs to solve their problems. Brant Robertson, a UC Santa Cruz astrophysicist, has had a front-row seat to this step change in science while supporting or using data from these missions. Robertson has spent the past 15 years working with Nvidia to apply GPUs to the problems of understanding space, first through advanced simulations testing theories about supernova explosions, and now developing the tools to analyze a torrent of data from the newest observatories. "There's been this evolution [from] looking at a few objects, to doing CPU-based analyses on large scales of the data set, to then doing GPU-accelerated versions of those same analyses," he told TechCrunch.

Robertson and then-graduate student Ryan Hausen developed a deep learning model called Morpheus that can pore over large data sets and identify galaxies. Their early AI analysis of Webb data identified a surprising number of a specific type of disc galaxy and added a new wrinkle to theories about the development of our universe.
Now Morpheus is changing with the times: Robertson is switching its architecture from convolutional neural networks to the transformers behind the rise of large language models. That will let the model analyze several times the area it can currently, speeding up its work. Robertson is also working on generative AI models trained on space telescope data to improve the quality of observations collected by ground telescopes, which are distorted by Earth's atmosphere. Despite advances in rocketry, it's still hard to get an 8-meter mirror into orbit, so using software to improve Rubin's observations is the next best thing.

But he's still feeling the pressure of global demand for GPU access. Robertson has used National Science Foundation funding to build a GPU cluster at UC Santa Cruz, but it is becoming outdated even as more researchers want to apply compute-intensive techniques to their work. The Trump administration proposed cutting the NSF's budget by 50% in its current budget request. "People want to do these AI, ML analyses, and GPUs are really the way to do that," Robertson said. "You have to be entrepreneurial...especially when you're working kind of at the edge of where the technology is. Universities are very risk averse because they just have constrained resources, so you have to go out and show them that, 'look, this is where we're going as a field.'"
[2]
Making Sense of the Early Universe
This Spring Astronomy Day, here's a look at how AI and GPUs are helping astronomers work through unprecedented volumes of cosmic data.

There are more galaxies in the universe than anyone ever expected. Unfortunately, they all showed up at once. When the first images from the James Webb Space Telescope (JWST) began arriving in 2022, Brant Robertson and his colleagues did what astronomers have always done: They stared at the sky and tried to understand what they were seeing. This time, the sky arrived as terabytes. "There were galaxies everywhere," Robertson recalled. "So many, and so far away, that we were genuinely shocked."

Robertson is a professor of astronomy and astrophysics at the University of California, Santa Cruz, where he leads a team studying how the earliest galaxies formed after the Big Bang. It's the kind of work Spring Astronomy Day was made for -- and thanks to the datasets his team releases publicly, anyone marking the occasion this year can explore the early universe in more depth than was possible even a few years ago. Over the past several years, his group has broken the record for the most distant known galaxy more than once, each time pushing observation closer to the universe's first light.

Without computation at this scale, the data would just pile up. Observational limits require calculation. Copernicus used mathematics to resolve observational inconsistencies; Robertson does the same using computational models. JWST is the most powerful observatory ever launched, observing in infrared and capturing light that has traveled for more than 13 billion years. Each deep-field image is crowded with hundreds of thousands of galaxies, some of them 13 billion years old. That abundance is the problem. "These datasets are far too large and complex for humans to analyze by hand," Robertson said. "Even teams of experts would take years to do what now needs to happen in days."
NASA's upcoming Nancy Grace Roman telescope will generate 20,000 terabytes of data over its lifetime, adding to the James Webb Space Telescope's daily 57 gigabytes. Astronomers like UC Santa Cruz's Brant Robertson are turning to AI and GPUs to analyze this flood of cosmic information, but they're now competing for scarce computing resources in an already strained market.

The study of galaxies has entered a new era of data abundance that's creating unexpected pressure on GPU availability. NASA announced the Nancy Grace Roman space telescope will launch in September 2026, eight months ahead of schedule, and is expected to deliver 20,000 terabytes of data over its lifetime [1]. This astronomical influx adds to the James Webb Space Telescope's daily downlink of 57 gigabytes of imagery since 2021, and the Vera C. Rubin Observatory's anticipated 20 terabytes of data each night starting later this year [1]. By comparison, the Hubble Space Telescope delivers just 1 to 2 gigabytes daily, highlighting how dramatically the volume of cosmic data has expanded [1].

Brant Robertson, a UC Santa Cruz astrophysicist, has witnessed this transformation firsthand while working with data from these missions. Over 15 years collaborating with Nvidia, Robertson has applied GPU technology first to advanced simulations testing theories about supernova explosions, and now to developing tools for analyzing the torrent of space telescope data [1]. Robertson and then-graduate student Ryan Hausen developed Morpheus, a deep learning model that can examine large data sets and identify galaxies automatically [1]. Their early AI analysis of Webb data identified a surprising number of a specific type of disc galaxy, adding new insights to theories about the formation of early galaxies and the development of the universe [1].
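To put the quoted data rates on a common scale, here is a bit of illustrative arithmetic using only the figures cited above; the resulting ratio is a back-of-envelope calculation, not a number from either source:

```python
# Data volumes quoted in the article, in the units each mission is quoted in.
JWST_DAILY_GB = 57      # James Webb, downlinked per day
RUBIN_NIGHTLY_TB = 20   # Vera C. Rubin survey, gathered per night
HUBBLE_DAILY_GB = 2     # Hubble, upper end of the quoted 1-2 GB/day

# Convert Rubin's nightly haul to gigabytes (1 TB = 1,000 GB, decimal units)
# and compare it to Hubble's daily output.
rubin_daily_gb = RUBIN_NIGHTLY_TB * 1_000
rubin_vs_hubble = rubin_daily_gb / HUBBLE_DAILY_GB

print(f"Rubin per night: {rubin_daily_gb:,} GB")          # 20,000 GB
print(f"Rubin vs. Hubble: {rubin_vs_hubble:,.0f}x daily") # 10,000x daily
```

At these rates, a single Rubin night equals roughly 27 years of Hubble output at 2 GB/day, which is the scale gap driving the turn to GPU-based analysis.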
Morpheus itself is adapting to modern AI techniques. Robertson is switching its architecture from convolutional neural networks to transformers, the same technology behind large language models [1]. This upgrade will enable the model to analyze several times more area than currently possible, significantly accelerating its work. Robertson is also developing generative AI models trained on space telescope data to enhance observations from ground-based telescopes, which suffer from atmospheric distortion [1]. Even with advances in rocketry, it remains difficult to launch an 8-meter mirror into orbit, so using software to improve Rubin's observations offers a practical alternative.
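Neither source describes Morpheus's internals, but the scaling intuition behind such a switch can be sketched generically: a vision-transformer-style model tokenizes an image into fixed-size patches, so covering a larger field of view means a longer token sequence rather than a redesigned network. A minimal sketch, where the patch size and image sizes are arbitrary illustrations and not Morpheus parameters:

```python
def n_tokens(side_px: int, patch_px: int = 16) -> int:
    """Number of patch tokens a ViT-style model sees for a square image.

    Each non-overlapping patch_px x patch_px tile becomes one token, so the
    sequence length grows with image area while the model weights stay fixed.
    """
    assert side_px % patch_px == 0, "image side must be a multiple of patch size"
    return (side_px // patch_px) ** 2

print(n_tokens(256))   # 256 tokens for a 256x256 cutout
print(n_tokens(1024))  # 4096 tokens: 16x the area, same model
```

A convolutional classifier with a fixed input size would instead need the larger field tiled into many separate forward passes, which is one generic reason a sequence-based architecture can cover more sky per inference.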
The challenge astronomers now face extends beyond software development. Robertson is experiencing the pressure of global demand for GPU access firsthand [1]. While he used National Science Foundation funding to build a GPU cluster at UC Santa Cruz, it's becoming outdated even as more researchers seek to apply compute-intensive techniques to their work. The situation is compounded by the Trump administration's proposal to cut the NSF's budget by 50% in its current budget request [1]. "People want to do these AI, ML analyses, and GPUs are really the way to do that," Robertson said, noting that researchers must be entrepreneurial when working at the edge of technology, especially as universities remain risk-averse due to constrained resources [1].
When the first images from the James Webb Space Telescope began returning in 2022, Robertson and his colleagues encountered an overwhelming sight. "There were galaxies everywhere," Robertson recalled, describing how the abundance of distant galaxies genuinely shocked the team [2]. Robertson leads a team at UC Santa Cruz studying how the earliest galaxies formed after the Big Bang, work that has broken the record for the most distant known galaxy multiple times, each time pushing observation closer to the universe's first light [2]. JWST, the most powerful observatory ever launched, observes in infrared and captures light that has traveled for more than 13 billion years, with each deep-field image crowded with hundreds of thousands of galaxies [2]. "These datasets are far too large and complex for humans to analyze by hand," Robertson explained, noting that even teams of experts would take years to accomplish what now needs to happen in days [2]. Without computation at this scale, the cosmic data would simply pile up, making AI and GPU technology essential for modern astrophysics.