Scientific Computing World

US researchers link supercomputers to power cosmic simulation


Researchers at the US Department of Energy’s (DOE) Argonne National Laboratory have increased their capacity for cosmological simulation by opening up a link to another research centre - the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign (UI).

The new approach links supercomputers at the Argonne Leadership Computing Facility (ALCF) and the NCSA to enable computationally demanding and highly intricate simulations of how the cosmos evolved after the Big Bang.

‘You need high-performance supercomputers that are capable of not only capturing the dynamics of trillions of different particles but also doing exhaustive analysis on the simulated data,’ said Argonne cosmologist Katrin Heitmann. ‘Sometimes, it’s advantageous to run the simulation and do the analysis on different machines.’

This link enabled scientists to transfer massive amounts of data and to run two different types of demanding computations in a coordinated fashion – referred to technically as a workflow.

Argonne transferred data produced as part of the simulation directly to the Blue Waters system for analysis. This is no small feat, as the researchers aimed to establish sustained bandwidth that would allow them to transfer up to one petabyte per day.
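To put that target in perspective, moving a full petabyte in a single day requires a sustained rate of roughly 11.6 GB/s, or about 93 Gbit/s. A quick back-of-the-envelope calculation:

```python
# Sustained bandwidth needed to move one petabyte in one day.
PETABYTE = 10**15              # bytes (decimal petabyte)
SECONDS_PER_DAY = 24 * 60 * 60

bytes_per_second = PETABYTE / SECONDS_PER_DAY
print(f"{bytes_per_second / 1e9:.1f} GB/s")        # ~11.6 GB/s
print(f"{bytes_per_second * 8 / 1e9:.1f} Gbit/s")  # ~92.6 Gbit/s
```

For comparison, that is roughly the full capacity of a 100-gigabit network link, held continuously for 24 hours.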

While similar simulations have been conducted previously, what separates this work from earlier studies is the scale of the computation, the associated data generation and transfer, and the size and complexity of the final analysis. Researchers also tapped the unique capabilities of each supercomputer: they performed cosmological simulations on the ALCF’s Mira supercomputer and then sent huge quantities of data to UI’s Blue Waters, which is better suited to the required data analysis tasks because of its processing power and memory balance.

Typically, cosmological simulations can only output a fraction of the data generated in such experiments because of data storage limitations. With this new approach, however, Argonne sent every data frame to NCSA as soon as it was generated, allowing Heitmann and her team to reduce the storage demands on the ALCF file system.
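The article does not describe the transfer machinery itself, but the underlying pattern, shipping each frame off-site as soon as it is written rather than accumulating frames on the local file system, can be sketched as a simple producer/consumer pipeline. All names here are hypothetical illustrations, not the actual Argonne/NCSA tooling:

```python
import queue
import threading

def run_pipeline(num_frames, transfer_frame):
    """Hypothetical sketch: overlap simulation with transfer so only a
    small buffer of frames ever resides on the local file system."""
    frames = queue.Queue(maxsize=4)   # bounded buffer caps local storage

    def simulate():
        for i in range(num_frames):
            frames.put(f"frame-{i}")  # stand-in for writing a data frame
        frames.put(None)              # sentinel: simulation finished

    def transfer():
        while True:
            frame = frames.get()
            if frame is None:
                break
            transfer_frame(frame)     # ship the frame off-site immediately

    producer = threading.Thread(target=simulate)
    consumer = threading.Thread(target=transfer)
    producer.start(); consumer.start()
    producer.join(); consumer.join()

sent = []
run_pipeline(5, sent.append)
print(sent)  # frames arrive in order: frame-0 ... frame-4
```

The bounded queue is the key design point: because the producer blocks once the buffer is full, the simulation can never outrun the transfer by more than a fixed number of frames, which is what keeps local storage demands small.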

The full article relating to this research project – written by Jared Sagoff and Austin Keating – can be found on the Argonne National Laboratory website.