NEWS

US researchers link supercomputers to power cosmic simulation

Researchers at the US Department of Energy’s (DOE) Argonne National Laboratory have increased their capacity for cosmological simulation by opening up a link to another research centre: the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign (UI).

The new approach links supercomputers at the Argonne Leadership Computing Facility (ALCF) and at NCSA to enable computationally demanding and highly intricate simulations of how the cosmos evolved after the Big Bang.

‘You need high-performance supercomputers that are capable of not only capturing the dynamics of trillions of different particles but also doing exhaustive analysis on the simulated data,’ said Argonne cosmologist Katrin Heitmann. ‘Sometimes, it’s advantageous to run the simulation and do the analysis on different machines.’

This link enabled scientists to transfer massive amounts of data and to run two different types of demanding computations in a coordinated fashion – referred to technically as a workflow.

Argonne transferred data produced as part of the simulation directly to the Blue Waters system for analysis. This was no small feat, as the researchers aimed to establish sustained bandwidth that would allow them to transfer up to one petabyte per day.
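To put that target in context, one petabyte per day works out to a sustained rate of roughly 11.6 GB/s, or around 93 Gbit/s. The short Python sketch below is purely illustrative arithmetic (taking 1 PB as 10^15 bytes), not part of the project’s tooling:

    # Rough arithmetic for a target of one petabyte per day
    # (taking 1 PB = 10**15 bytes), to show the sustained rate implied.
    PETABYTE = 10**15                 # bytes
    SECONDS_PER_DAY = 24 * 60 * 60

    rate = PETABYTE / SECONDS_PER_DAY          # bytes per second
    print(f"{rate / 1e9:.1f} GB/s")            # ~11.6 GB/s
    print(f"{rate * 8 / 1e9:.1f} Gbit/s")      # ~92.6 Gbit/s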

While similar simulations have been conducted before, what separates this work from previous studies is the scale of the computation, the associated data generation and transfer, and the size and complexity of the final analysis. The researchers also tapped the unique capabilities of each supercomputer: they performed cosmological simulations on the ALCF’s Mira supercomputer and then sent huge quantities of data to UI’s Blue Waters, which is better suited to the required data analysis because of its balance of processing power and memory.

Typically, cosmological simulations can only output a fraction of the data they generate because of storage limitations. With this new approach, however, Argonne sent every data frame to NCSA as soon as it was generated, allowing Heitmann and her team to reduce the storage demands on the ALCF file system.
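The sketch below illustrates the general idea of such a streaming hand-off. It is an assumption-laden illustration rather than the team’s actual pipeline: the directory names and host are hypothetical, and a plain scp call stands in for whatever bulk data-movement service was used in practice.

    # Minimal sketch: watch for newly written simulation frames, ship each
    # one to the remote analysis site as soon as it appears, then delete
    # the local copy to limit storage use on the local file system.
    import subprocess
    import time
    from pathlib import Path

    OUTPUT_DIR = Path("/scratch/sim/frames")            # hypothetical local output
    REMOTE = "analysis-site:/scratch/incoming/frames"   # hypothetical destination

    def ship_frames(poll_seconds: float = 30.0) -> None:
        while True:
            for frame in sorted(OUTPUT_DIR.glob("frame_*.bin")):
                # Copy the frame off the local file system; a production
                # workflow would use a dedicated data-transfer service
                # rather than plain scp.
                subprocess.run(["scp", str(frame), REMOTE], check=True)
                frame.unlink()   # free local storage once the copy succeeds
            time.sleep(poll_seconds)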

The full article relating to this research project – written by Jared Sagoff and Austin Keating – can be found on the Argonne National Laboratory website. 
