2 Sources
[1]
AI creates the first 100-billion-star Milky Way simulation
Researchers led by Keiya Hirashima at the RIKEN Center for Interdisciplinary Theoretical and Mathematical Sciences (iTHEMS) in Japan, working with partners from The University of Tokyo and the Universitat de Barcelona in Spain, have created the first Milky Way simulation capable of tracking more than 100 billion individual stars across 10,000 years of evolution. The team achieved this milestone by pairing artificial intelligence (AI) with advanced numerical simulation techniques. Their model includes 100 times more stars than the most sophisticated earlier simulations and was generated more than 100 times faster. The work, presented at the international supercomputing conference SC '25, marks a major step forward for astrophysics, high-performance computing, and AI-assisted modeling. The same strategy could also be applied to large-scale Earth system studies, including climate and weather research.

Why Modeling Every Star Is So Difficult

For many years, astrophysicists have aimed to build Milky Way simulations detailed enough to follow each individual star. Such models would allow researchers to compare theories of galactic evolution, structure, and star formation directly to observational data. However, simulating a galaxy accurately requires calculating gravity, fluid behavior, chemical element formation, and supernova activity across enormous ranges of time and space, which makes the task extremely demanding.

Scientists have not previously been able to model a galaxy as large as the Milky Way while maintaining fine detail at the level of single stars. Current cutting-edge simulations can represent systems with the equivalent mass of about one billion suns, far below the more than 100 billion stars that make up the Milky Way. As a result, the smallest "particle" in those models usually represents a group of roughly 100 stars, which averages away the behavior of individual stars and limits the accuracy of small-scale processes.
The challenge is tied to the interval between computational steps: to capture rapid events such as supernova evolution, the simulation must advance in very small time increments. Shrinking the timestep means dramatically greater computational effort. Even with today's best physics-based models, simulating the Milky Way star by star would require about 315 hours for every 1 million years of galactic evolution. At that rate, generating 1 billion years of activity would take over 36 years of real time. Simply adding more supercomputer cores is not a practical solution, as energy use becomes excessive and efficiency drops as more cores are added.

A New Deep Learning Approach

To overcome these barriers, Hirashima and his team designed a method that blends a deep learning surrogate model with standard physical simulations. The surrogate was trained using high-resolution supernova simulations and learned to predict how gas spreads during the 100,000 years following a supernova explosion without requiring additional resources from the main simulation. This AI component allowed the researchers to capture the galaxy's overall behavior while still modeling small-scale events, including the fine details of individual supernovae. The team validated the approach by comparing its results against large-scale runs on RIKEN's Fugaku supercomputer and The University of Tokyo's Miyabi Supercomputer System.

The method offers true individual-star resolution for galaxies with more than 100 billion stars, and it does so with remarkable speed. Simulating 1 million years took just 2.78 hours, meaning that 1 billion years could be completed in approximately 115 days instead of 36 years.

Broader Potential for Climate, Weather, and Ocean Modeling

This hybrid AI approach could reshape many areas of computational science that require linking small-scale physics with large-scale behavior.
Fields such as meteorology, oceanography, and climate modeling face similar challenges and could benefit from tools that accelerate complex, multi-scale simulations. "I believe that integrating AI with high-performance computing marks a fundamental shift in how we tackle multi-scale, multi-physics problems across the computational sciences," says Hirashima. "This achievement also shows that AI-accelerated simulations can move beyond pattern recognition to become a genuine tool for scientific discovery -- helping us trace how the elements that formed life itself emerged within our galaxy."
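The division of labor described above — a conventional solver advancing galaxy-scale dynamics, with a trained surrogate standing in for the many tiny timesteps that supernova remnants would otherwise demand — can be sketched roughly as follows. This is purely illustrative: the `GasPatch`, `physics_step`, and `surrogate_predict` names and the toy decay model are assumptions of this sketch, not anything from the team's actual code.

```python
# Illustrative sketch of a hybrid physics + AI-surrogate timestep loop.
# The real simulation is far more complex; this only shows the idea of
# swapping a trained model in for the smallest, costliest timesteps.

from dataclasses import dataclass


@dataclass
class GasPatch:
    """Local gas state around one supernova (toy stand-in)."""
    density: float
    energy: float


def physics_step(state: dict, dt_myr: float) -> dict:
    """Coarse, galaxy-scale update: gravity + hydrodynamics (stub)."""
    state["time_myr"] += dt_myr
    return state


def surrogate_predict(patch: GasPatch, horizon_yr: float) -> GasPatch:
    """Stand-in for the trained surrogate: jumps a remnant forward
    `horizon_yr` years in one call, instead of resolving it with many
    tiny explicit timesteps (toy energy-decay model here)."""
    fade = 1.0 / (1.0 + horizon_yr / 1e4)
    return GasPatch(density=patch.density, energy=patch.energy * fade)


def run(state: dict, total_myr: float, dt_myr: float = 0.1) -> dict:
    while state["time_myr"] < total_myr:
        state = physics_step(state, dt_myr)
        # Wherever a supernova fires, hand the remnant to the surrogate
        # for its first 100,000 years instead of resolving it explicitly.
        for i, patch in enumerate(state["new_supernovae"]):
            state["new_supernovae"][i] = surrogate_predict(patch, 1e5)
    return state
```

The point of the pattern is that the surrogate call has a fixed, small cost regardless of how violent the local event is, so the global timestep no longer has to shrink to match the fastest physics in the box.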
[2]
Japan researchers simulate Milky Way with 100 billion stars using AI
The model uses an AI surrogate trained on high-resolution supernova data to replace millions of tiny physics steps.

Researchers in Japan have developed the first simulation of the Milky Way galaxy that tracks more than 100 billion individual stars by combining artificial intelligence with supercomputing capabilities. Presented at the SC '25 supercomputing conference in St. Louis, the model simulates 10,000 years of galactic evolution and operates 100 times faster than prior techniques, addressing computational limitations in modeling large-scale cosmic structures.

Prior state-of-the-art simulations managed galaxies with stellar masses equivalent to about one billion suns, only one-hundredth of the Milky Way's actual stellar population. These efforts relied on conventional physics-based methods that demand extensive processing power: such approaches require 315 hours to compute one million years of galactic evolution, so extending this to a billion-year timeframe would necessitate more than 36 years of continuous computation, rendering full-scale Milky Way simulations impractical for most research timelines.

The advancement stems from a deep learning surrogate model trained on high-resolution simulations of supernova events. This artificial intelligence element learns to forecast the expansion of gas over the 100,000 years after a supernova detonation. By doing so, it eliminates the need for numerous small, resource-intensive timesteps in the overall simulation while preserving the precision of physical outcomes.

The research team deployed this system across 7 million CPU cores, utilizing RIKEN's Fugaku supercomputer alongside the University of Tokyo's Miyabi system, to achieve these efficiencies. With this setup, the simulation time dropped to 2.78 hours for each million years of evolution, allowing a projection spanning one billion years to complete in roughly 115 days.
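The run-time figures quoted above are mutually consistent, which is easy to verify with a quick back-of-the-envelope check. All numbers below are taken from the text; nothing is measured here:

```python
# Wall-clock cost of a star-by-star Milky Way run, per the quoted figures.

HOURS_PER_MYR_PHYSICS = 315    # conventional physics-only model
HOURS_PER_MYR_HYBRID = 2.78    # AI-surrogate hybrid
MYR_PER_GYR = 1000             # 1 billion years = 1,000 million years


def wall_clock_hours(hours_per_myr: float, myr: int) -> float:
    """Total wall-clock hours to simulate `myr` million years."""
    return hours_per_myr * myr


physics = wall_clock_hours(HOURS_PER_MYR_PHYSICS, MYR_PER_GYR)
hybrid = wall_clock_hours(HOURS_PER_MYR_HYBRID, MYR_PER_GYR)

print(f"Physics-only: {physics / 24 / 365:.1f} years")  # ~36 years
print(f"Hybrid:       {hybrid / 24:.0f} days")          # ~116 days
print(f"Speedup:      {physics / hybrid:.0f}x")         # ~113x
```

So "over 36 years" versus "roughly 115 days" corresponds to a speedup of a bit more than 100x, matching the headline claim.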
Hirashima stated, "Integrating AI with high-performance computing marks a fundamental shift in how we tackle multi-scale, multi-physics problems across the computational sciences." He highlighted potential uses in climate modeling, weather prediction, and oceanography, where challenges arise from connecting small-scale phenomena to broader system dynamics. The resulting simulation permits scientists to follow the emergence of elements vital for life throughout the galaxy's history. This capability provides insights into the chemical evolution processes that contributed to the formation of planets resembling Earth.
Scientists at RIKEN have developed the first simulation capable of tracking over 100 billion individual stars in the Milky Way using AI-accelerated computing. The breakthrough combines deep learning with supercomputing to achieve results 100 times faster than traditional methods.

Researchers at the RIKEN Center for Interdisciplinary Theoretical and Mathematical Sciences (iTHEMS) in Japan have achieved a groundbreaking milestone in astrophysics by creating the first simulation capable of tracking more than 100 billion individual stars across the Milky Way galaxy. Led by Keiya Hirashima and working with partners from The University of Tokyo and Universitat de Barcelona, the team presented their work at the international supercomputing conference SC '25, demonstrating a model that includes 100 times more stars than previous simulations while operating more than 100 times faster
[1]. The simulation tracks 10,000 years of galactic evolution and represents a major advancement for astrophysics, high-performance computing, and AI-assisted modeling. This breakthrough addresses a long-standing challenge in computational astrophysics where scientists have struggled to model galaxies as large as the Milky Way while maintaining fine detail at the individual star level
[2]. Previous state-of-the-art simulations could only represent systems with stellar masses equivalent to about one billion suns, far below the Milky Way's actual population of more than 100 billion stars. These limitations forced researchers to use computational "particles" that represented groups of roughly 100 stars, averaging away individual stellar behavior and limiting the accuracy of small-scale processes
[1]. The computational challenge stems from the need to capture rapid events such as supernova evolution, which requires extremely small time increments. Traditional physics-based models would require approximately 315 hours to simulate every million years of galactic evolution. At this rate, generating one billion years of galactic activity would take over 36 years of computation in real time, making full-scale Milky Way simulations impractical for most research timelines
[2]. To overcome these barriers, Hirashima's team developed a hybrid approach that combines deep learning surrogate models with standard physical simulations. The AI component was trained using high-resolution supernova simulations and learned to predict how gas spreads during the 100,000 years following a supernova explosion without requiring additional computational resources from the main simulation
[1]. This artificial intelligence element eliminates the need for numerous small, resource-intensive timesteps while preserving the precision of physical outcomes. The research team deployed this system across 7 million CPU cores, utilizing RIKEN's Fugaku supercomputer alongside the University of Tokyo's Miyabi Supercomputer System to achieve these efficiencies
[2]. With this setup, simulation time dropped to just 2.78 hours for each million years of evolution. This means that projections spanning one billion years can now be completed in approximately 115 days instead of the previously required 36 years
[1].
The implications of this breakthrough extend far beyond astrophysics. This hybrid AI approach could reshape many areas of computational science that require linking small-scale physics with large-scale behavior. Fields such as meteorology, oceanography, and climate modeling face similar multi-scale challenges and could benefit significantly from tools that accelerate complex simulations
[1]. The resulting simulation enables scientists to follow the emergence of elements vital for life throughout the galaxy's history, providing new insights into the chemical evolution processes that contributed to the formation of Earth-like planets. As Hirashima noted, this achievement demonstrates that AI-accelerated simulations can move beyond pattern recognition to become genuine tools for scientific discovery, helping researchers trace how the elements that formed life itself emerged within our galaxy
[2].