
Supercomputing tech accelerates science

Supercomputing is speeding up scientific research, with biologists developing a range of applications to strengthen their simulations, according to researchers at this year's ISC conference.

The ISC conference gave scientists the chance to talk about their experiences using and developing HPC, and the benefits the technology has brought to their work. Biological research was a hot topic among the speakers at the scientific sessions, with the discipline placing increasing demands on the supercomputing industry. Josep Gelpi from the National Institute of Bioinformatics said: 'The amount of data in biology is growing faster than most supercomputers, so we need more power.'

Gelpi is using a BioMoby-based workflow for gene detection using sequence homology techniques. BioMoby is a web-based service and has proved a popular choice for Gelpi, who said: 'Web services provide a central repository, have well-known input/output formats, are well controlled and there is no need for user interfaces. This has been a natural process instead of an imposed service.'

A group from Bristol University has used accelerators to increase its HPC power and aid its research into drug design. The team is simulating a drug therapy for emphysema, focusing on a range of peptide-based elastase inhibitors. It wants to work out the reaction between these candidates and their target enzyme, determining the exact orientation and interaction energy that will produce the most effective interaction to fight the disease.

The team is using its in-house system called BUDE (Bristol University Docking Engine), which simulates atom-to-atom interactions using experimental data to parametrise the simulations. The system is supported by accelerators, which process the atom-to-atom interactions to calculate the energy of each process.
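To give a flavour of the kind of calculation involved, the sketch below sums a generic pairwise non-bonded energy (Lennard-Jones plus Coulomb terms) over every ligand-protein atom pair. BUDE's actual energy function and parameters are not described here, so the model, function names and all numerical values are purely illustrative.

    # Illustrative sketch only: a generic Lennard-Jones plus Coulomb pairwise
    # model standing in for the atom-to-atom energy calculations described
    # above. All parameter values are hypothetical.
    import numpy as np

    def pairwise_interaction_energy(ligand_xyz, protein_xyz,
                                    ligand_q, protein_q,
                                    epsilon=0.2, sigma=3.5, coulomb_k=332.06):
        """Sum a simple non-bonded energy over every ligand/protein atom pair."""
        # Distance matrix between all ligand and protein atoms (angstroms).
        diff = ligand_xyz[:, None, :] - protein_xyz[None, :, :]
        r = np.sqrt((diff ** 2).sum(axis=-1))

        # Lennard-Jones term for van der Waals attraction/repulsion.
        lj = 4.0 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)

        # Coulomb term for the electrostatic contribution.
        coulomb = coulomb_k * (ligand_q[:, None] * protein_q[None, :]) / r

        return float((lj + coulomb).sum())

    # Tiny example: three ligand atoms placed near four protein atoms.
    rng = np.random.default_rng(0)
    ligand = rng.uniform(0.0, 5.0, size=(3, 3))
    protein = rng.uniform(5.0, 10.0, size=(4, 3))
    energy = pairwise_interaction_energy(
        ligand, protein,
        ligand_q=np.array([0.1, -0.2, 0.1]),
        protein_q=np.array([0.3, -0.1, -0.1, 0.05]))
    print(f"total interaction energy: {energy:.2f} (illustrative only)")

Because every atom pair can be evaluated independently, this type of calculation parallelises naturally onto accelerator hardware, which is what makes docking codes such good candidates for it.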

The accelerators have sped up the simulations, as Dr Richard Sessions of Bristol University explained: 'We have seen a 21-fold speed-up using the accelerators when simulating 12 processes.' BUDE has completed a virtual synthesis of 40 peptides, all mixed together in one virtual pot, to test for inhibitory action, and the team is now working out which of the 40 drug candidates is the most promising.

And scientists are using HPC in increasingly unusual ways, according to S. Turek from the Institut für Angewandte Mathematik at TU Dortmund, who spoke about some unconventional HPC for finite element simulations. Turek said: 'It is getting harder and harder to be successful within HPC as applications no longer run faster automatically on new hardware.'

Turek added: 'HPC has to consider recent and future hardware trends, particularly for heterogeneous multicore architectures and massively parallel systems.'

Pawan Balaji, from the Argonne National Laboratory, spoke about his experiences using ParaMEDIC, a general software-based framework for large-scale distributed computing, which is currently running on nine supercomputers around the world. The data generated at these sites, which includes masses of genomic data, is then moved to a data storage unit in Tokyo.

The ParaMEDIC project came about because of the need to keep up with the huge amount of data generated by gene identification. Balaji said: 'The genome database size doubles every 12 months, but the compute power of a single supercomputer only doubles every 18 to 24 months, and as a consequence of this we developed a distributed supercomputer resource that takes the compute resources from multiple centres, puts that data into storage and then lets it be used later on.'
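A rough back-of-the-envelope calculation shows why that gap matters. Taking the doubling rates from Balaji's figures at face value, and assuming simple exponential growth purely for illustration, the data outpaces a single machine's compute by a widening factor every year:

    # Illustrative only: compare a database doubling every 12 months with a
    # single supercomputer whose capability doubles every 24 months (the slow
    # end of Balaji's 18-24 month range).
    def growth(years, doubling_time_months):
        return 2 ** (12.0 * years / doubling_time_months)

    for years in (1, 3, 5):
        data = growth(years, 12)       # database size, relative to today
        compute = growth(years, 24)    # single-machine compute, relative to today
        print(f"after {years} year(s): data x{data:.1f}, "
              f"compute x{compute:.1f}, shortfall x{data / compute:.1f}")

After five years the data has grown roughly 32-fold while a single machine has grown less than six-fold, which is the kind of shortfall that pooling compute from multiple centres is intended to cover.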

Biology was not the only scientific discipline to air its HPC experiences, as Dr Carsten Aulbert from the Max-Planck-Institut für Gravitationsphysik spoke about the Gigabit-based cluster used within his research to try to detect gravitational waves. These waves were originally predicted by Einstein as ripples in space-time and are very difficult to detect, making the data analysis that searches for signatures of these elusive waves even more difficult.

The team has to wade through hundreds of terabytes of data and wanted to build a relatively simple cluster which maximised the available CPU performance for its codes. Aulbert said: 'We wanted to avoid using complex systems, which are more likely to fail and come with higher costs, both to run and repair. So Gigabit is not a dead technology if we want to bring down costs or do not want to use Infiniband technology.'
