
Scientific supercomputing in Germany

Germany has increased its presence on the global supercomputing stage, unveiling one of the world’s largest high-performance computing sites in the last few months and dedicating much of the country’s HPC power to scientific research.

Germany currently has the third-largest country share (6.2 per cent) of the world’s Top500 supercomputing sites, behind only the US with a 56.6 per cent share and the UK with 9.6 per cent. This share has risen steadily over the past couple of years, and, according to the researchers who work there, Germany’s HPC centres also maintain a ‘unique’ and supportive environment for the scientists using their resources.

Many of the German HPC centres are used for scientific research. A supercomputer known as Jugene, an IBM Blue Gene/P system housed at Forschungszentrum Jülich (FZJ) in the German state of Nordrhein-Westfalen, is ranked as the second most powerful supercomputer in the world, with a Linpack performance of 167.3 TFlop/s, according to the latest Top500 list, published in November 2007.

The FZJ is also home to another top-ranked German supercomputing system, JUBL (Jülicher Blue Gene/L), which is ranked 28th in the Top500 list and offers a peak performance of 45.8 TFlop/s, equivalent to 45.8 trillion floating-point operations per second.

There are many other German sites housing HPC power for scientific research. The Leibniz Computing Centre (Leibniz-Rechenzentrum, LRZ) provides services to the scientific and academic communities in Munich, including a technical and scientific high-performance computing centre serving all German universities.

The HLRB-II system, an SGI Altix 4700 ranked 15th in the world, began operating in Q3 2006 in the new LRZ building in Garching, replacing the former national supercomputer, a Hitachi SR8000-F1.

The LRZ offers HPC at several different levels. Besides the HLRB-II, with a peak performance of more than 62 TFlop/s, it operates two further supercomputers (a 128-CPU SGI Altix 3700 Bx2 and a 256-CPU SGI Altix 4700), as well as an Itanium2-based Linux cluster that can be accessed by researchers from all Bavarian universities and an Intel IA32-based Linux cluster for researchers from the Munich universities.

And, perhaps aptly named, the Genius supercomputer at the Rechenzentrum Garching (RZG) of the Max Planck Society and the IPP belongs to the IBM BlueGene family and is ranked 40th in the current global Top500 list. The RZG began life as the computing centre of the Max Planck Institute for Plasma Physics (IPP), which was founded in 1960. From 1980 RZG acted as a common computing centre for the IPP and other Max Planck Institutes on the Garching campus and in nearby Munich.

Since 1992 RZG has operated as a common computing centre of the IPP and the Max Planck Society, offering services to Max Planck Institutes all over Germany. HPC power and support are provided to Max Planck Institutes with high-end computing needs in areas such as fusion research, materials science, theoretical chemistry, polymer research, astrophysics, earth sciences and other fields.

Much of Germany’s HPC power is dedicated to scientific research.

Science simulations

Over at FZJ, the home of the second most powerful supercomputing site in the world, the BlueGene machines are being put to good use by in-house and external scientists. Dr Robert Jones, a scientist working at the Jülich centre, is using the resources of the HPC facility to investigate two phenomena. The first strand is materials science, Jones’ primary research interest, in which simulations are carried out on a wide range of materials; Jones concentrates mainly on alloys of germanium (Ge), antimony (Sb) and tellurium (Te).

These alloys are phase-change materials, which are already widely used for optical recording (including in Blu-ray discs) and computer memory, with DVD-RAM already a commercially successful application of such an alloy.

The Ge-Sb-Te alloys undergo a phase change from an amorphous state (in which there is no long-range order in the positions of the atoms) to an ordered crystalline state. Phase-change materials switch between the amorphous and ordered states when an electric current or laser pulse is applied, and the state can be read by monitoring the optical or electrical properties of the alloy. Yet although phase-change materials are widely used in industry, only sketchy information is available about the structures of the phases involved[1].

Jones has used the IBM BlueGene/L and BlueGene/P computers at Jülich to perform extensive simulations of the amorphous and ordered states of Ge-Sb-Te alloys, some of which are already used in DVD-RAM. The simulations use density functional calculations, a quantum-mechanical method used in physics and chemistry to investigate the electronic structure of many-body systems, in particular atoms, molecules and the condensed phases. But this is not an easy thing to do, as Jones says: ‘Density functional calculations can be messy and time consuming and can only be done on supercomputers.’
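As textbook background (not specific to the codes used at Jülich), the Kohn-Sham formulation at the heart of most density functional calculations replaces the interacting-electron problem with a set of single-particle equations that must be solved self-consistently:

$$\left[-\frac{\hbar^2}{2m}\nabla^2 + v_{\mathrm{ext}}(\mathbf{r}) + v_{\mathrm{H}}[n](\mathbf{r}) + v_{\mathrm{xc}}[n](\mathbf{r})\right]\varphi_i(\mathbf{r}) = \varepsilon_i\,\varphi_i(\mathbf{r}), \qquad n(\mathbf{r}) = \sum_i^{\mathrm{occ}} |\varphi_i(\mathbf{r})|^2 .$$

The electron density n determines the effective potential, and the orbitals obtained from that potential must in turn reproduce n; iterating this loop to self-consistency at every step of a molecular-dynamics run is what consumes the supercomputer time.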

The density functional calculations do not use adjustable parameters, but their demands on computational resources have restricted previous simulations of Ge-Sb-Te systems to relatively small unit cells (of the order of 100 to 200 atoms), and the time scales used are often much shorter than those relevant experimentally[2,3]. Jones is currently working towards using Jülich’s HPC power to simulate 1,000 atoms over a 1 ns timeframe, which would be a first in this branch of materials science. He adds: ‘We are simulating these alloys to work out why they work, not just that they work.’
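To see why this target is ambitious, note that conventional density functional codes scale roughly with the cube of the system size. The sketch below is a back-of-the-envelope estimate under that assumption; the baseline run is hypothetical, not a measured Jülich figure.

```python
# Back-of-the-envelope cost of scaling up a density functional
# molecular-dynamics run. Assumes the common O(N^3) scaling of
# conventional DFT codes; the baseline figures are illustrative only.

def relative_cost(n_atoms, sim_time_ps, base_atoms=200, base_time_ps=100):
    """Cost relative to a baseline run: cubic in atom count, linear in
    simulated time."""
    return (n_atoms / base_atoms) ** 3 * (sim_time_ps / base_time_ps)

# A hypothetical 200-atom, 100 ps baseline versus the 1,000-atom, 1 ns target:
print(relative_cost(1000, 1000))  # -> 1250.0, i.e. roughly 1,250 times the work
```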

The second area that Jones is investigating is the simulation of biological molecules, with his research centring on the bacterium Staphylococcus aureus and, in particular, on processes near the outer layer of its cells. Classical force fields are used to simulate how the bacterium reacts with ATP (adenosine-5’-triphosphate), the main source of energy in mammals.
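For readers unfamiliar with the term, a classical force field approximates interatomic forces with simple analytic terms rather than solving for the electrons. The sketch below evaluates one of the simplest such terms, a Lennard-Jones pair potential; real biomolecular force fields add bond, angle, torsion and electrostatic terms, and the parameters here are placeholders, not values from Jones’ ATP simulations.

```python
# One ingredient of a classical force field: the Lennard-Jones pair
# potential. epsilon and sigma are placeholder values chosen for
# illustration, not parameters from any published ATP simulation.

def lennard_jones(r, epsilon=0.25, sigma=0.34):
    """Pair energy (kJ/mol) for two atoms separated by r (nm)."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 ** 2 - sr6)

print(lennard_jones(0.38))  # energy of one pair near the potential minimum
```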

Jones has been at Jülich for 35 years, having completed a PhD at Cambridge and research posts in various other locations, including the US. Although he intended to return to the US, Jones has found the German supercomputing scene a good place to conduct his research, as he says: ‘I stayed here because the research facilities, the research which is taking place and the support I have received from my boss and other staff here has been quite unique.’

It’s not just the research and staff that make Jülich an exceptional place for scientists to work either, as Jones adds: ‘We have access to a machine that makes calculations possible that most other people find impossible to do. I knew that my competitors could not keep up, and that’s a good feeling.’

Jones is one of around 4,400 staff employed at the Jülich centre, with the centre’s resources split between the internal scientists employed there (of which there are around 1,500, including PhD students and fellows) and external users at other European research centres and universities. Jones says: ‘The Jülich centre is a national facility with around 35 per cent of the computational power being used by on-site scientists.’

‘We have less time than users coming from outside of the centre, but there are also fewer of us. However, it can be an ongoing fight to make sure we are not overlooked and get enough time on the machines – but I think you’ll find that situation in other HPC centres,’ Jones adds.

With the JUBL supercomputer in the background, Chairman of the Board of Directors of Research Centre Jülich, Achim Bachem (left), and CEO of IBM Germany, Martin Jetter (right), sign the contract for the installation of the new 220-teraflop supercomputer in Jülich.

Dr Erik Koch is also using the facilities at Jülich to try to understand the properties of a different set of materials from Jones’ Ge-Sb-Te alloys. These include high-temperature superconductors, magnetoresistive materials and ‘metal-less metals’, organic crystals that can conduct electricity.

These materials have to be simulated in a different way from the alloys, as density functional theory breaks down owing to the complexity of their structures. As Koch explains: ‘The physics of these systems is such that we cannot use mean-field approaches, but have to confront the full many-body problem.’

But the many-body problem, the exact and therefore far more complex way to study the effects of interactions between particles, brings with it a new set of challenges. For example, to describe the many-body state of a single atom of iron, a whopping 10⁷⁸ numbers would need to be stored on the system simulating that atom. Koch says: ‘To realise how hopeless such a brute-force approach is, you have to realise that this number corresponds to the number of atoms in tens and tens of galaxies.’
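Koch’s figure is easy to reproduce: an iron atom has 26 electrons, so its many-body wavefunction depends on 3 × 26 = 78 coordinates, and even a very coarse grid of ten points per coordinate, an assumption made purely for illustration, already demands 10⁷⁸ stored values:

```python
# Why the brute-force approach is hopeless: the many-body wavefunction
# of an iron atom (26 electrons) depends on 3 * 26 = 78 coordinates.
# Even a coarse grid of 10 points per coordinate (an illustrative
# assumption) requires an astronomical number of values.

n_electrons = 26
points_per_coordinate = 10
values = points_per_coordinate ** (3 * n_electrons)
print(f"{values:.1e}")  # -> 1.0e+78 numbers to store
```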

Koch adds: ‘Although, in principle, we know what to do, in practice it is impossible to deal with such numbers so we need to simplify the problem and use a model description that is simple enough to handle it but can also describe the material accurately, and this is where HPC comes in.’

The Blue Gene machines at Jülich have provided a big boost to Koch’s research, as the increased power means more accurate simulations can be achieved. He says: ‘The physics behind these materials is so difficult to simulate that we have only just reached the threshold of achieving realistic simulations.’

And Koch is finally able to get quantitative results using the supercomputing machines, as he explains: ‘The previous supercomputers were simply not able to deliver the computing power for such calculations. Only massively parallel machines like Blue Gene allow us not just to make a rough model of what is going on qualitatively, but to finally get quantitative results for these materials.’

Koch is hoping that increasingly complex structures could be simulated in the future, for example by moving from modelling bulk materials to simulating structured devices: ‘Similar to the breakthroughs within simulations using density functional theory 30 years ago, we hope to be able to scale up our simulations to be ever more realistic, and maybe describe the more complex interfaces and not just homogeneous materials.’

And Koch is also optimistic that Jülich will be able to scale up its HPC power to cope with simulating such interfaces: ‘At the moment, they are just about impossible to do but they should become feasible in the near future.’

The Jülich centre is part of the wider Helmholtz Association, which has 15 research centres and an annual budget of around €2.35bn, making it one of the largest scientific organisations in Germany. There is also a major HPC presence within the Helmholtz Association, to aid its scientific research into topics from astrophysics to cell research to particle physics.

DESY (Deutsches Elektronen-Synchrotron), another research centre of the Helmholtz Association, provides access to special-purpose computers optimised for applications in elementary particle physics. These computers have been developed by an Italian-German, and later an Italian-German-French, collaboration, and the resources on them are made available via the John von Neumann Institute for Computing.

The research performed on the parallel computers at DESY aims to understand the properties and interactions of quarks and gluons, the building blocks of particles such as the proton and the neutron. Large-scale simulations investigate a discretised formulation of the relevant theory, quantum chromodynamics (QCD); this version of the theory is called lattice QCD.
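As textbook background (the specific actions used at DESY may differ in detail), lattice QCD replaces continuous space-time with a four-dimensional grid of spacing $a$, representing the gluon field by SU(3) matrices $U_\mu(x)$ on the links between grid points. The simplest (Wilson) form of the discretised gauge action sums over the elementary squares, or plaquettes, of the lattice:

$$S_G = \beta \sum_{x}\sum_{\mu<\nu}\left(1 - \tfrac{1}{3}\,\mathrm{Re}\,\mathrm{Tr}\left[U_\mu(x)\,U_\nu(x+\hat\mu)\,U_\mu^\dagger(x+\hat\nu)\,U_\nu^\dagger(x)\right]\right),$$

which recovers the continuum QCD gauge action as the lattice spacing $a \to 0$.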

Dr Dirk Pleiter is investigating QCD using the facilities at DESY, and the HPC power provided by the centre is crucial to his research, as he explains: ‘There is increasing evidence that quantum chromodynamics is the correct theory for strong interactions, one of the known fundamental forces in nature. Many interesting results can so far only be obtained from first principles by means of numerical simulations. For our applications the availability of strong scaling computer architectures, i.e. machines on which performance does not deteriorate when increasing the number of processors, is a key issue.’

He adds: ‘Progress in simulations of QCD is to a large extent limited by the availability of capability computers. Fortunately, our applications are well suited for parallelisation on a large number of processors. Access to machines where applications can be executed on thousands of nodes is only available within HPC centres.’
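Pleiter’s point about strong scaling can be made concrete with Amdahl’s law, which caps the speed-up available from extra processors whenever part of the work cannot be parallelised. A minimal sketch, with hypothetical serial fractions:

```python
# Amdahl's law: the speed-up on p processors when a fraction s of the
# work is inherently serial. Illustrates why strong scaling on thousands
# of nodes demands a tiny serial fraction; the numbers are hypothetical.

def amdahl_speedup(p, serial_fraction):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / p)

for s in (0.01, 0.001):
    print(f"serial fraction {s}: speedup {amdahl_speedup(4096, s):.0f}x on 4,096 processors")
# -> serial fraction 0.01: speedup 98x on 4,096 processors
# -> serial fraction 0.001: speedup 804x on 4,096 processors
```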

And lattice QCD also relies heavily on supercomputing, as Pleiter says: ‘Research on lattice QCD is only possible because of computers in the multi-teraflops range being available.’

The DESY centre is an exceptional place within Germany, according to Pleiter, who adds: ‘This centre is special in the sense that it provides access to special purpose computers and that it is involved in the development of such computers.’

Simulations in Jülich: a laser beam hits gold foil and generates fast particles. This is basic research for compact particle emitters in technology and medicine.

Future collaboration

Using HPC centres is clearly a must for the modern-day scientist, with most unable to complete their research without them, as Jones says: ‘I’ve been doing calculations on solids, liquids and molecules for most of my career and as soon as supercomputers became available, I was there. So, how has high performance computing helped my scientific research? It has made it possible.’

And HPC simulations are comparable to other massive science experiments in their size, their cost and the fact that they can only be used to answer certain scientific questions, as Jones adds: ‘I see a facility like Jülich as comparable to a nuclear reactor or accelerator producing particles or x-rays. Reactors are expensive to build and maintain and are used by a relatively small number of people, but the neutrons they produce are needed to answer certain scientific problems. Likewise, an HPC centre is needed for other problems.’

But as the size, and therefore the cost, of building and maintaining such HPC sites for scientific research snowballs, the only way for German supercomputing is up, according to Dr Pleiter, who says: ‘Scientists in Germany have a lot of opportunities to carry out their research. In February the Jülich Supercomputing Centre unveiled Jugene which, at the time of inauguration, was the fastest civil supercomputer in the world. Furthermore, promising projects for building new generations of special-purpose computers are under way, which will result in additional fast computing resources.’

This air of optimism is echoed by the scientists at Jülich too, who also feel European collaboration will be needed to build and manage increasingly large HPC services, as Dr Jones adds: ‘The next generation of supercomputers will be very expensive and will therefore require cooperation between European countries to find the funding to make them possible. I expect Jülich will keep its place near the front of the queue within German, European and global supercomputing.’



1. R.O. Jones and J. Akola, ‘Nanoscale Phase Transitions in Phase Change Materials’, IFF Scientific Report 2007, pp. 158-159 (2008).

2. J. Akola and R.O. Jones, ‘Structural Patterns in Ge/Sb/Te Phase-change Materials’, in NIC Symposium 2008, Proceedings, G. Münster, D. Wolf and M. Kremer (eds), John von Neumann Institute for Computing, Jülich, NIC Series Volume 39, ISBN 978-3-9810842-5-1, pp. 179-186 (2008).

3. J. Akola and R.O. Jones, ‘Structural phase transitions on the nanoscale’, Phys. Rev. B 76, 235201 (2007).
