I am Dean of Informatics at the University of Westminster and I am head of the HPC research team (which is managed by the university’s Centre for Parallel Computing).
The biochemical networks that mediate life are driven by the intricate interplay of biological entities such as genes, proteins, and metabolites. A comprehensive understanding of any given biological condition cannot be achieved by studying these entities separately. Systems biology is the study of genes, proteins, and other biological components as a single system, encompassing both their individual functions and their interactions. It promises to transform our understanding of the underlying mechanisms of human diseases.
An air of collaboration surrounds the scientific supercomputing projects going on in the north of Europe, with countries keen to take part in the ever-lengthening list of pan-European projects. Denmark, Finland, Norway and Sweden have even set up their own group project, known as the Nordic Data Grid Facility, to help coordinate their grid computing resources.
As engineers discover the power of computer simulation, they want to work on ever-larger models, some today having hundreds of millions of degrees of freedom (DOFs). This desire goes hand-in-hand with the evolution of HPC systems, whether multicore processors, multiprocessor servers or clusters. In the past we could count on higher CPU clock rates to speed up analyses, but power consumption and heat became a problem. The solution was to put multiple cores on one piece of silicon and step back clock speeds.
Ecology has come a long way from its first emergence as a distinct field of study in the 1970s. Like me, over the same period, it has moved from cuddly hippiedom to a strongly data-centred, systems view of the world.
Protecting the environment is big business. With climate change high on the political agenda of governments around the world, legislation controlling how we treat this precious resource is constantly being tightened. The Kyoto Protocol, for instance, sets binding targets for reducing a country’s greenhouse gas emissions, forcing governments committed to the agreement to evaluate methods for curbing their emission levels.
The earth, with its mountains, large bodies of water, volcanoes and other natural formations, is a laboratory where all physical effects are coupled. Modelling is essential here because of the timeframes involved, which can extend to months, years or even decades; that same fact also makes models difficult to verify. Yet this is not stopping geologists from developing and improving models that predict every aspect of our constantly changing earth.
Albert Einstein was not exaggerating when he said that his gravitational field equations presented 'very serious' mathematical difficulties. In fact, it took nearly 90 years from the publication of his paper on general relativity to the first successful solution of the two-body problem: the merger of two black holes. The problem occupied some of the finest minds in mathematics, as well as some of the biggest iron in computing. Even with such resources, solving these equations proved a long, hard road, and not one for the faint-hearted.
Gemma Church finds out how astronomers are using simulations to investigate the extremities of our universe
Turning data into scientific insight is not a straightforward matter, writes Sophia Ktori
The Leibniz Supercomputing Centre (LRZ) is driving the development of new energy-efficient practices for HPC, as Robert Roe discovers
William Payne investigates the growing trend of using modular HPC, built on industry standard hardware and software, to support users across a range of both existing and emerging application areas
Robert Roe looks at developments in crash testing simulation – including larger, more intricate simulations, the use of optimisation software, and the development of new methodologies through collaboration between ISVs, commercial companies, and research organisations