Argonne models evolution of universe

Modelling the evolution of the universe is no mean feat, not only because of the complex mathematics involved, but also because of the sheer amount of data that such a model generates.

Dealing with data, however, is the speciality of a group of scientists at the US Department of Energy's (DOE) Argonne National Laboratory. To share and analyse the mountains of data from today's scientific challenges more easily, they are developing software that enables researchers to interact with their results in real time from across the country.

The rise of supercomputers has given a powerful tool to scientists, who use these machines to model complex questions. As the compute power of such resources increases, so does the complexity of the questions that can be asked - and the size of the data sets that are generated.

'Finding the resources and software capable of rendering volumes of data at such large scales can be a challenge,' said Mark Hereld, visualisation and analysis lead for Argonne's Leadership Computing Facility (ALCF). Fortunately, the ALCF is home to Eureka, one of the world's largest graphics supercomputers, which features 200 high-end GPUs. Eureka runs software such as vl3, a volume rendering toolkit developed at Argonne and the University of Chicago that leverages graphics hardware to visualise such data sets in real time.
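
To make the idea concrete, the heart of GPU volume rendering is compositing colour and opacity along rays cast through the data. The short sketch below illustrates that ray-marching step on the CPU with NumPy; it is only an illustration of the principle, not vl3's actual code or API, and the transfer function, array sizes and names are assumptions for the example.

    import numpy as np

    def transfer_function(scalar):
        # Map a normalised scalar to a greyscale colour and a small opacity.
        # A toolkit such as vl3 would let the user edit this mapping interactively.
        return scalar, 0.05 * scalar

    def volume_render(volume):
        # Front-to-back compositing along the z axis of a 3D scalar field.
        # GPU renderers evaluate the same maths for every ray in parallel.
        vmin, vmax = volume.min(), volume.max()
        vol = (volume - vmin) / (vmax - vmin + 1e-12)
        image = np.zeros(vol.shape[:2])
        transmittance = np.ones(vol.shape[:2])
        for z in range(vol.shape[2]):            # march one slice at a time
            colour, alpha = transfer_function(vol[:, :, z])
            image += transmittance * alpha * colour
            transmittance *= (1.0 - alpha)
        return image

    # Tiny synthetic stand-in for a density field; production runs use far larger grids.
    demo = np.random.default_rng(0).random((128, 128, 128))
    print(volume_render(demo).shape)             # (128, 128)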

Networking advances make it feasible to move large amounts of data from the location where it was computed to specialised visualisation resources where it can be rendered into images. However, the scientists who need to analyse this data often live and work far from both supercomputing and rendering clusters. It is vital that the renderings be brought to the scientist.

To see the subtle details in the data and make full use of the visualisations, high-quality images are also required. New vl3 enhancements allow researchers to stream high-resolution images created on graphics clusters to a remote cluster driving a high-resolution tiled display.
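
Conceptually, the streaming step amounts to cutting each rendered frame into tiles and pushing each tile to the display node that drives that portion of the wall. The sketch below shows that idea with plain TCP sockets; the host names, port, grid layout and framing scheme are illustrative assumptions and do not describe vl3's actual wire protocol.

    import socket
    import struct
    import numpy as np

    # Hypothetical 2x2 tiled display, one receiving host per tile.
    TILE_HOSTS = [("display-node-0", 9000), ("display-node-1", 9000),
                  ("display-node-2", 9000), ("display-node-3", 9000)]
    GRID = (2, 2)                                # tiles across, tiles down

    def split_into_tiles(frame, grid):
        # Cut an (H, W, 3) frame into equally sized tiles, row by row.
        h, w = frame.shape[:2]
        th, tw = h // grid[1], w // grid[0]
        return [frame[r * th:(r + 1) * th, c * tw:(c + 1) * tw]
                for r in range(grid[1]) for c in range(grid[0])]

    def send_tile(host, port, tile):
        # Prefix the raw pixels with height, width and byte count, then send.
        payload = tile.astype(np.uint8).tobytes()
        header = struct.pack("!III", tile.shape[0], tile.shape[1], len(payload))
        with socket.create_connection((host, port)) as sock:
            sock.sendall(header + payload)

    def stream_frame(frame):
        # Push every tile of one rendered frame to its display node.
        for (host, port), tile in zip(TILE_HOSTS, split_into_tiles(frame, GRID)):
            send_tile(host, port, tile)

A production system would likely keep connections open, compress the tiles and synchronise the display nodes so the whole wall swaps buffers as a single image, but the division of one frame across many nodes is the essential step.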

The next step will be to add controls already present in the local version of the software to the wide-area version, giving the scientist even more power to investigate his or her data.

The simulation was done as part of a 2009 TeraGrid resource allocation entitled 'Projects in Astrophysical and Cosmological Structure Formation', designed to simulate the cosmic structures of the early universe by calculating the gravitational clumping of intergalactic gas and dark matter. The model used a computational grid of 4000³ cells, contained 64 billion dark matter particles, and took over four million CPU hours to complete.
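
A back-of-the-envelope calculation suggests why output on this scale calls for dedicated rendering resources: a 4000³ grid holds 64 billion cells, so even a single scalar field stored in single precision occupies roughly a quarter of a terabyte. The figures below follow from the grid size quoted above; the four bytes per cell is our assumption.

    cells = 4000 ** 3                 # 64,000,000,000 cells in the grid
    bytes_per_cell = 4                # assume one single-precision value per cell
    size_tb = cells * bytes_per_cell / 1e12
    print(f"{cells:,} cells, about {size_tb:.2f} TB per scalar field")
    # 64,000,000,000 cells, about 0.26 TB per scalar field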
