October/November 2008

FEATURE

The systems biology dilemma

The biochemical networks that mediate life are driven by the intricate interplay of biological entities such as genes, proteins, and metabolites. A comprehensive understanding of any given biological condition cannot be achieved by studying these entities separately. Systems biology is the study of genes, proteins, and other biological components as a single system, encompassing both their individual functions and their interactions. It will transform our understanding of the underlying mechanisms of human diseases.

FEATURE

Leading northern lights

An air of collaboration surrounds the scientific supercomputing projects going on in the north of Europe, with countries keen to take part in the ever-lengthening list of pan-European projects. And Denmark, Finland, Norway and Sweden have even set up their own group project, known as the Nordic Data Grid Facility, to help coordinate their grid computing resources.

FEATURE

When models outgrow hardware, turn to HPC

As engineers discover the power of computer simulation, they want to work on ever-larger models, some of which now have hundreds of millions of degrees of freedom (DOFs). This desire goes hand-in-hand with the evolution of HPC systems, whether multicore processors, multiprocessor servers or clusters. In the past we could count on higher CPU clock rates to speed up analyses, but power consumption and heat became a problem. The solution was to put multiple cores on one piece of silicon and scale back clock speeds.

FEATURE

A clean bill of health

Protecting the environment is big business. With climate change high on the political agenda of governments around the world, legislation controlling how we treat the natural world is constantly being tightened. The Kyoto Protocol, for instance, sets binding targets for reducing national greenhouse gas emissions, forcing governments committed to the agreement to evaluate methods for curbing their emission levels.

FEATURE

Down-to-earth modelling

The earth, with its mountains, large bodies of water, volcanoes and other natural formations, is a laboratory where all physical effects are coupled. Modelling is essential here because the timeframes involved can extend to months, years or even decades, which also makes models difficult to verify. However, that’s not stopping geologists from developing and improving models of the processes that shape our constantly changing earth.

FEATURE

Solving the mysteries of the universe

Albert Einstein was not exaggerating when he said that his gravitational field equations presented ‘very serious’ mathematical difficulties. In fact, it took nearly 90 years from the publication of his paper on general relativity to the first successful solution of the two-body problem: the merger of two black holes. The problem occupied some of the finest minds in mathematics, as well as some of the biggest iron in computing. Even then, solving these equations proved a long, hard road, and not one for the faint-hearted.

FEATURE

Gemma Church finds out how astronomers are using simulations to investigate the extremities of our universe

FEATURE

Turning data into scientific insight is not a straightforward matter, writes Sophia Ktori

FEATURE

The Leibniz Supercomputing Centre (LRZ) is driving the development of new energy-efficient practices for HPC, as Robert Roe discovers

FEATURE

William Payne investigates the growing trend of using modular HPC, built on industry standard hardware and software, to support users across a range of both existing and emerging application areas

FEATURE

Robert Roe looks at developments in crash testing simulation – including larger, more intricate simulations, the use of optimisation software, and the development of new methodologies through collaboration between ISVs, commercial companies, and research organisations