Researchers at the RIKEN Center for Interdisciplinary Theoretical and Mathematical Sciences (iTHEMS) in Japan have achieved a significant breakthrough in astrophysics by creating a hyper-realistic simulation of the Milky Way. Collaborating with colleagues from the University of Tokyo and the Universitat de Barcelona, the team successfully modeled over 100 billion stars over a timespan of 10,000 years, marking the first time such a comprehensive simulation has been realized.
The new model not only represents an impressive increase in the number of stars simulated—100 times more than previous efforts—but it was also produced at a speed that is 100 times faster. This leap in capability was made possible by utilizing 7 million CPU cores, advanced machine learning algorithms, and sophisticated numerical simulations. The research findings were published in a paper titled “The First Star-by-star N-body/Hydrodynamics Simulation of Our Galaxy Coupling with a Surrogate Model,” featured in the *Proceedings of the International Conference for High Performance Computing, Networking, Storage and Analysis* (SC ’25).
Advancements in Galactic Simulation
Simulating the dynamics of the Milky Way at the level of individual stars allows astronomers to test and refine theories on galactic formation, structure, and evolution. Historically, creating such detailed models has posed significant challenges. Researchers have long grappled with accurately capturing the myriad forces involved, including gravity, fluid dynamics, supernovae, element synthesis, and the influences of supermassive black holes (SMBHs). Until now, the computational power required limited the mass of galaxies that could be modeled effectively to around one billion solar masses, which accounts for less than 1% of the Milky Way’s stellar content.
Using conventional methods, even high-performance supercomputing systems needed approximately 315 hours—more than 13 days—to simulate just 1 million years of galactic evolution; at that rate, simulating 1 billion years would take over 36 years. This time-intensive process restricts researchers to observing only large-scale events, because simply adding more supercomputer cores does not resolve the inherent inefficiencies.
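The wall-clock figures above follow directly from the quoted rate of 315 hours per million simulated years; a quick back-of-the-envelope check (using only numbers from the text):

```python
# Back-of-the-envelope check of the conventional-method wall-clock figures.
HOURS_PER_MYR = 315     # hours of compute per 1 million simulated years (quoted rate)
MYR_PER_GYR = 1_000     # million years per billion years

hours_per_gyr = HOURS_PER_MYR * MYR_PER_GYR
days_per_myr = HOURS_PER_MYR / 24
years_per_gyr = hours_per_gyr / 24 / 365.25

print(f"1 Myr of evolution: {days_per_myr:.1f} days of wall-clock time")
print(f"1 Gyr of evolution: {years_per_gyr:.1f} years of wall-clock time")
```

This reproduces both quoted figures: just over 13 days per million years, and roughly 36 years per billion years.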
To overcome these obstacles, the team, led by RIKEN researcher Keiya Hirashima, integrated a machine learning surrogate model that runs without drawing on the primary simulation's resources. The surrogate was trained on high-resolution simulations of supernovae, enabling it to predict how each explosion affects the surrounding gas and dust over the 100,000 years following the event. By coupling this AI-driven model with conventional physical simulations, the team captured both the global dynamics of a Milky Way-sized galaxy and fine-grained stellar phenomena.
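The surrogate idea can be illustrated with a toy sketch: fit a cheap regression to data from an expensive high-resolution calculation, then query the fit inside the large-scale run instead of resolving the detail directly. Everything here—the features, the functional form, and the synthetic "training data"—is invented for illustration and is not the team's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for training data from high-resolution supernova simulations:
# inputs are local gas properties, output is the feedback energy deposited.
# All names, units, and functional forms below are hypothetical.
n = 500
gas_density = rng.uniform(0.1, 10.0, n)        # cm^-3 (hypothetical feature)
metallicity = rng.uniform(0.001, 0.02, n)      # mass fraction (hypothetical feature)
true_energy = 1e51 * gas_density**-0.2 * (1 + 5 * metallicity)  # fake "ground truth"

# "Train" the surrogate: here just a linear least-squares fit in log space.
X = np.column_stack([np.log(gas_density), metallicity, np.ones(n)])
y = np.log(true_energy)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def surrogate(density, Z):
    """Cheap stand-in for the expensive high-resolution calculation."""
    x = np.array([np.log(density), Z, 1.0])
    return np.exp(x @ coef)

# In the galaxy-scale run, each supernova event would query the surrogate
# instead of resolving the blast wave at high resolution.
print(f"predicted feedback energy: {surrogate(1.0, 0.01):.3e} erg")
```

The design point is the same as in the paper's approach: the expensive physics is computed once, offline, and the learned model answers the same question in microseconds during the main simulation.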
Results and Future Implications
The researchers validated their simulation’s performance through extensive tests conducted on the Fugaku and Miyabi supercomputer systems. Results demonstrated that their new method could simulate the evolution of a galaxy with over 100 billion stars in a fraction of the time previously required, completing 1 million years of evolution in just 2.78 hours. At that rate, 1 billion years of the Milky Way’s estimated 13.61-billion-year history could be modeled in approximately 115 days.
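The new rate can be checked the same way as the old one, and comparing the two per-million-year figures also recovers the roughly hundredfold speedup mentioned earlier:

```python
# Check the quoted benchmark figures for the new method.
HOURS_PER_MYR_NEW = 2.78   # from the Fugaku/Miyabi tests: hours per 1 million simulated years
HOURS_PER_MYR_OLD = 315    # conventional approach, for comparison

days_per_gyr = HOURS_PER_MYR_NEW * 1_000 / 24
speedup = HOURS_PER_MYR_OLD / HOURS_PER_MYR_NEW

print(f"1 Gyr of evolution: {days_per_gyr:.1f} days of wall-clock time")
print(f"speedup over the conventional rate: {speedup:.0f}x")
```

This gives about 115.8 days per billion simulated years and a speedup factor of roughly 113, consistent with the "100 times faster" figure quoted above.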
These advancements offer astronomers an invaluable tool for testing theories around galactic evolution and provide insights into the formation of the Universe. The application of machine learning in such complex simulations signifies a promising frontier not only for astrophysics but also for other fields that require the integration of large and small-scale factors, such as meteorology, ocean dynamics, and climate science.
With the ability to conduct simulations at unprecedented levels of detail and efficiency, the implications of this research extend far beyond the Milky Way, paving the way for a deeper understanding of the cosmos.
