Simulations improve characterisation of dark matter

Ohio State University (OSU) researchers are using powerful supercomputers to investigate one of the key observational probes of dark energy, the mysterious energy form that is causing the expansion of the universe to accelerate over time. The OSU project, led by Chris Orban, a graduate research fellow in physics at Ohio State’s Center for Cosmology and Astro-Particle Physics, uses simulations run on Ohio Supercomputer Center (OSC) systems to simplify and better characterise a subtle dark matter clustering feature. The new model gives cosmologists a more accurate understanding of certain aspects of large-scale structure, such as the effect of the expansion of the universe on the growth of density fluctuations.

‘Knowing how the dark matter “reacts” to the expansion of the universe is crucial for learning the most about dark energy and dark matter from large astronomical surveys like the Sloan Digital Sky Survey, of which OSU is a collaborating member,’ said Orban. ‘In particular, there is a subtle clustering feature seen in this data set called Baryon Acoustic Oscillations (BAO), which turns out to be very useful for constraining cosmological parameters like the equation of state of dark energy.’

The oscillations come from fluctuations in the distribution of hot plasma in the early universe; researchers can identify this feature by measuring the cosmic microwave background. ‘The BAO signature gets imprinted on the dark matter very early on, but the feature changes over cosmic time, potentially biasing its use as a cosmological tool,’ Orban explained. ‘It’s a complicated non-linear problem, and physicists are very fond of simplifying complicated problems to gain a more in-depth understanding. This is exactly what we did for the first time, in our paper, using N-body simulations.’
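That simplification, as the paper’s title later indicates, amounts to replacing the full initial power spectrum with a scale-free power law carrying a single BAO-like bump: because a pure power law has no preferred scale, any change in the bump can be tracked cleanly as the simulation evolves. The Python sketch below is purely illustrative; the slope, bump location, width and amplitude are assumed values, not the parameters used in the study.

    import numpy as np
    import matplotlib.pyplot as plt

    # Toy version of a "bumps and wiggles" initial spectrum: a scale-free
    # power law multiplied by a Gaussian bump standing in for the BAO
    # feature. All numerical values below are illustrative assumptions.
    k = np.logspace(-2, 1, 500)   # wavenumber in h/Mpc
    n = -1.0                      # power-law spectral index (assumed)
    k_bao = 0.07                  # bump location, roughly the BAO scale (assumed)
    sigma = 0.015                 # bump width (assumed)
    amp = 0.1                     # bump amplitude (assumed)

    P_powerlaw = k**n
    bump = 1.0 + amp * np.exp(-(k - k_bao) ** 2 / (2.0 * sigma**2))
    P_initial = P_powerlaw * bump

    plt.loglog(k, P_powerlaw, "--", label="pure power law (scale-free)")
    plt.loglog(k, P_initial, label="power law + BAO-like bump")
    plt.xlabel("k [h/Mpc]")
    plt.ylabel("P(k) (arbitrary units)")
    plt.legend()
    plt.show()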

Since early 2009, Orban and his academic advisor, David Weinberg, a professor of astronomy at OSU and the project scientist for the Sloan Survey, have used nearly 200,000 processor-hours of computational time on OSC’s flagship Glenn Cluster and eight terabytes of storage space on its Mass Storage Environment. The Glenn Cluster offers researchers more than 9,600 Opteron cores, 24 terabytes of memory and a peak computational capability of 75 teraflops, or 75 trillion calculations per second.

For software, the researchers employed the Gadget-2 N-body code to calculate the trajectories of more than a hundred million particles, setting the initial conditions with the 2LPT (second-order Lagrangian perturbation theory) code developed by their collaborators at New York University. Orban and Weinberg authored a paper on this research, ‘Self-similar Bumps and Wiggles: Isolating the Evolution of the BAO Peak with Power-law Initial Conditions,’ which is due for publication in the journal Physical Review D.
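Gadget-2 itself is a sophisticated, massively parallel TreePM code, but the core idea of an N-body calculation, advancing particles under their mutual gravity, can be sketched in a few lines. The toy example below uses direct summation and a kick-drift-kick leapfrog step; the particle count, softening length and time step are arbitrary illustrative choices, orders of magnitude short of a production run.

    import numpy as np

    G = 1.0      # gravitational constant in code units (assumed)
    EPS = 0.05   # force softening to avoid singular close encounters (assumed)

    def accelerations(pos, mass):
        """Pairwise softened gravitational accelerations, O(N^2) direct sum."""
        diff = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]   # r_j - r_i
        dist2 = (diff ** 2).sum(axis=-1) + EPS ** 2
        inv_r3 = dist2 ** -1.5
        np.fill_diagonal(inv_r3, 0.0)                          # no self-force
        weighted = diff * inv_r3[..., np.newaxis] * mass[np.newaxis, :, np.newaxis]
        return G * weighted.sum(axis=1)

    def leapfrog(pos, vel, mass, dt, n_steps):
        """Kick-drift-kick leapfrog time stepping."""
        acc = accelerations(pos, mass)
        for _ in range(n_steps):
            vel += 0.5 * dt * acc          # half kick
            pos += dt * vel                # drift
            acc = accelerations(pos, mass)
            vel += 0.5 * dt * acc          # half kick
        return pos, vel

    rng = np.random.default_rng(42)
    N = 256   # tiny compared with the ~1e8 particles in the actual study
    pos = rng.uniform(-1.0, 1.0, (N, 3))
    vel = np.zeros((N, 3))
    mass = np.full(N, 1.0 / N)
    pos, vel = leapfrog(pos, vel, mass, dt=0.01, n_steps=100)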
