Future challenges within HPC

'How do we manage the rising tide of scientific data?' This question, posed by Professor Dr Achim Bachem, chairman of the Jülich Research Center, is critical to science and engineering, as the volume of data being generated is growing far faster than computing capability.

Speaking at the opening of the Binnig and Rohrer Nanotechnology Center in Zürich, Bachem said that storage and retrieval of data represent the biggest challenges in computing today. He believes that in moving towards exascale – a milestone he says we will indeed reach within the next 10 years – the main issue will not be the technology, but rather the question of what we will do with the exabytes of data generated as a result.

The Square Kilometre Array (SKA) radio telescope, for example, will push the limits of the observable universe and generate one Pbyte of data every 20 seconds – compare that to the Large Hadron Collider's 12 Pbytes per year and the extent of the problem becomes evident. As Bachem put it: 'Unless we focus on data management, we will only create one graveyard after another for scientific data.'

This problem, and how it is solved, will determine the advances that scientists will be able to make with next-generation supercomputers. But that is not the only change. Speaking to Scientific Computing World, Alessandro Curioni, manager of computational science at IBM Research - Zurich, commented on the prevalence of high-performance simulation within research projects: 'We don't have any project, even at the lower end of the scale, in which simulation doesn't make a difference. Simulation is the third pillar of the scientific enterprise, together with theory and experimentation, and the important thing is that these three aspects are all on the same level. That is true today, but it was not true even 15 years ago.' He went on to state, however, that innovation does not come from simulation alone.

Bachem described simulation as the 'next-generation microscope': theory predicts phenomena, experimentation verifies whether those phenomena really exist, and now simulation allows us to get to the details of how and why. In early May 2011, after 50 years of experimentation, Professor Dr Ulf-G. Meissner of the Helmholtz-Institut für Strahlen- und Kernphysik of Bonn University was able to show that the Hoyle state can be computed. The Hoyle state is an excited form of carbon-12 with precisely the properties necessary to allow just the right amount of carbon to be created inside a star. Understanding what happens inside a star during fusion, i.e. precisely how carbon is generated from helium atoms, tells researchers why there is such an abundance of carbon on Earth. Simulation has become the new instrument for discovery.

But beyond simulation is the drive towards collaboration – another key theme of the discussions. During his talk, Bachem presented the idea of the community-orientated simulation laboratory, where a core group from the domain discipline – biologists, physicists, engineers, etc. – work together with mathematicians on whatever problem they are investigating. One point that was emphasised is that those who focus on the theory have to learn about the architecture of the computer. At present, physicists build the models, but someone else then programs the algorithm to run on the supercomputer. The resulting mathematical model will try to fit the original, but this brings its own difficulties. The solution is a truly interdisciplinary approach that embraces simulation science.
