
A New Simulation of the Universe Contains 60 Trillion Particles, the Most Ever

Today, the greatest mysteries facing astronomers and cosmologists are the roles gravitational attraction and cosmic expansion play in the evolution of the Universe. To resolve these mysteries, astronomers and cosmologists are taking a two-pronged approach: directly observing the cosmos to see these forces at work while seeking theoretical explanations for observed behaviors – such as Dark Matter and Dark Energy.

In between these two approaches, scientists model cosmic evolution with computer simulations to see whether observations align with theoretical predictions. The latest of these is AbacusSummit, a simulation suite created by the Flatiron Institute’s Center for Computational Astrophysics (CCA) and the Harvard-Smithsonian Center for Astrophysics (CfA). Capable of processing nearly 60 trillion particles, it is the largest suite of cosmological simulations ever produced.

The creators of AbacusSummit announced the simulation suite in a series of papers published in the Monthly Notices of the Royal Astronomical Society (MNRAS). Made up of more than 160 simulations, it models how particles in a box-shaped environment behave under gravitational attraction. Models of this kind, known as N-body simulations, are essential to modeling how dark matter interacts with baryonic (aka. “visible”) matter.

The simulated distribution of dark matter in galaxies. Credit: Brinckmann et al.

The development of the AbacusSummit simulation suite was led by Lehman Garrison (a CCA research fellow) along with Nina Maksimova and Daniel Eisenstein, a graduate student and a professor of astronomy at the CfA, respectively. The simulations were run on the Summit supercomputer at the Oak Ridge Leadership Computing Facility (OLCF) in Tennessee, which is overseen by the U.S. Department of Energy (DoE).

N-body calculations, which compute the gravitational interactions of planets and other objects, are among the greatest challenges facing astrophysicists today. Part of what makes them so daunting is that each object interacts with every other object, regardless of how far apart they are – double the number of objects under study, and the number of pairwise interactions roughly quadruples.
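To make that scaling concrete, here is a minimal Python sketch (not the Abacus code itself) of direct-summation gravity, in which every particle feels every other particle. The gravitational constant, softening length, and units are illustrative assumptions.

import numpy as np

G = 1.0  # gravitational constant in simulation units (assumed for illustration)

def accelerations(pos, mass, softening=1e-3):
    """Direct-summation gravity: every particle attracts every other one."""
    n = len(pos)
    acc = np.zeros_like(pos)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            r = pos[j] - pos[i]                  # separation vector
            d2 = r @ r + softening**2            # softened squared distance
            acc[i] += G * mass[j] * r / d2**1.5  # Newtonian attraction
    return acc  # N particles means N*(N-1) interactions per evaluation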

To date, there is no general solution for N-body problems involving three or more massive bodies, and the calculations available are only approximations. For example, the mathematics for calculating the interactions of three bodies, such as a binary star system and a planet (the famous “Three-Body Problem”), has never been solved in general form. A common approach with cosmological simulations is to stop the clock, calculate the total force acting on each object, move time ahead slightly, and repeat.
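In code, that “stop the clock, step forward, repeat” loop is a fixed-step integrator. The sketch below uses the kick-drift-kick “leapfrog” scheme common in N-body work; the step size, step count, and toy initial conditions are illustrative assumptions, and accel_fn can be the direct-summation function sketched earlier.

import numpy as np

def leapfrog(pos, vel, mass, accel_fn, dt=1e-3, n_steps=100):
    """Advance an N-body system with fixed time steps (kick-drift-kick)."""
    acc = accel_fn(pos, mass)
    for _ in range(n_steps):
        vel += 0.5 * dt * acc      # half "kick": update velocities from forces
        pos += dt * vel            # "drift": move the particles
        acc = accel_fn(pos, mass)  # stop the clock, recompute all forces
        vel += 0.5 * dt * acc      # second half "kick"
    return pos, vel

# Toy usage: a small random cluster, with `accelerations` from the earlier sketch.
rng = np.random.default_rng(0)
pos = rng.standard_normal((64, 3))
vel = np.zeros((64, 3))
mass = np.full(64, 1.0 / 64)
pos, vel = leapfrog(pos, vel, mass, accelerations)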

For their research (which was led by Maksimova), the team designed their codebase (called Abacus) to take advantage of Summit’s parallel processing power – whereby multiple calculations can run simultaneously. They also relied on machine learning algorithms and a new numerical method, which allowed them to update 70 million particles per node per second at early times and 45 million particles per node per second at late times.

A snapshot of one of the AbacusSummit simulations, shown at various zoom scales: 10 billion light-years across, 1.2 billion light-years across, and 100 million light-years across. Credit: The AbacusSummit Team; layout by Lucy Reading-Ikkanda/Simons Foundation

As Garrison explained in a recent CCA press release:

“This suite is so big that it probably has more particles than all the other N-body simulations that have ever been run combined – though that’s a hard statement to be certain of. The galaxy surveys are delivering tremendously detailed maps of the Universe, and we need similarly ambitious simulations that cover a wide range of possible universes that we might live in.

“AbacusSummit is the first suite of such simulations that has the breadth and fidelity to compare to these amazing observations… Our vision was to create this code to deliver the simulations that are needed for this particular new brand of galaxy survey. We wrote the code to do the simulations much faster and much more accurately than ever before.”

In addition to the usual challenges, running full N-body simulations requires carefully designed algorithms because of the enormous amount of memory involved. Abacus could not simply give each supercomputer node its own copy of the simulation to work on; instead, it divides each simulation into a uniform grid and makes approximate calculations for distant particles, which play a smaller role than nearby ones.

Abacus then groups the nearby particles into cells so that the computer can work on each cell independently, combining the results for each cell with the approximation for distant particles. The research team found that this approach (uniform divisions) makes better use of parallel processing and allows a large amount of the distant-particle approximation to be computed before the simulation even starts.
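A heavily simplified sketch of that near/far split follows. The real Abacus far field uses high-order multipole expansions rather than the crude one-point-mass-per-cell approximation below, and the box size, grid size, and units are arbitrary assumptions (positions are taken to lie in [0, box)).

import numpy as np
from collections import defaultdict

G = 1.0       # simulation units (assumed)
SOFT2 = 1e-6  # squared softening length, avoids singularities at zero distance

def point_acc(target, source, m):
    """Acceleration at `target` due to a point mass `m` at `source`."""
    r = source - target
    return G * m * r / (r @ r + SOFT2) ** 1.5

def grid_forces(pos, mass, box=1.0, ncell=4):
    """Exact forces from nearby cells; one point mass per distant cell."""
    idx = np.floor(pos / box * ncell).astype(int)  # cell index per particle
    members = defaultdict(list)
    for i, key in enumerate(map(tuple, idx)):
        members[key].append(i)
    # Far field: summarize each occupied cell by total mass and center of mass.
    com = {k: np.average(pos[m], axis=0, weights=mass[m]) for k, m in members.items()}
    mtot = {k: mass[m].sum() for k, m in members.items()}

    acc = np.zeros_like(pos)
    for k, cell in members.items():
        # The cell itself plus its 26 neighbors get the exact pairwise sum.
        near = {(k[0] + dx, k[1] + dy, k[2] + dz)
                for dx in (-1, 0, 1) for dy in (-1, 0, 1) for dz in (-1, 0, 1)}
        for i in cell:
            a = np.zeros(3)
            for nk in near:
                for j in members.get(nk, []):
                    if j != i:
                        a += point_acc(pos[i], pos[j], mass[j])
            # Every other occupied cell is approximated by a single point mass.
            for fk in members:
                if fk not in near:
                    a += point_acc(pos[i], com[fk], mtot[fk])
            acc[i] = a
    return acc

Because each cell’s near-field work depends only on its immediate neighborhood, the cells can be handed to different processors independently, which is the property Abacus exploits at vastly larger scale.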

Abacus’ parallel computer processing, visualized. Credit: Lucy Reading-Ikkanda/Simons Foundation

This is a significant improvement over other N-body codebases, which divide simulations irregularly based on the distribution of particles. Thanks to its design, Abacus can update 70 million particles per node per second (where each particle represents a clump of Dark Matter with three billion solar masses). It can also analyze the simulation as it runs, searching for patches of Dark Matter that indicate the presence of bright star-forming galaxies.
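As a toy illustration of that kind of on-the-fly analysis (Abacus’s actual group finding is far more sophisticated), one can deposit particles onto a density grid and flag cells well above the mean density as candidate sites for the halos that host bright galaxies. The grid size and overdensity threshold below are arbitrary assumptions.

import numpy as np

def overdense_cells(pos, box=1.0, ncell=64, threshold=20.0):
    """Flag grid cells whose particle count exceeds `threshold` times the mean."""
    idx = np.floor(pos / box * ncell).astype(int) % ncell  # nearest-grid-point deposit
    counts = np.zeros((ncell, ncell, ncell))
    np.add.at(counts, (idx[:, 0], idx[:, 1], idx[:, 2]), 1)
    mean = len(pos) / ncell**3                             # mean particles per cell
    return np.argwhere(counts > threshold * mean)          # candidate halo locations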

These and other cosmological objects will be the subject of future surveys that will map the cosmos in unprecedented detail, including the Dark Energy Spectroscopic Instrument (DESI), the Nancy Grace Roman Space Telescope (RST), and the ESA’s Euclid spacecraft. One of the goals of these big-budget missions is to refine estimates of the cosmological and astrophysical parameters that determine how the Universe behaves and how it looks.

This, in turn, will allow for more detailed simulations that employ updated values for various parameters, such as Dark Energy. Eisenstein, a co-author on the papers, is also a member of the DESI collaboration, and he and others like him are looking forward to what Abacus can do for these cosmological surveys in the coming years.

“Cosmology is leaping forward because of the multidisciplinary fusion of spectacular observations and state-of-the-art computing,” he said. “The coming decade promises to be a marvelous age in our study of the historical sweep of the universe.”

Further Reading: Simons Foundation, MNRAS
