Germany has increased its presence on the global supercomputing stage, unveiling one of the world’s largest high-performance computing sites in recent months and dedicating much of the country’s HPC power to scientific research.
In the process known as rapid prototyping, also known as additive fabrication or rapid manufacturing, users start with a 3D image on a screen and ‘print’ the object from materials such as plastic or metal. One obvious application is to make a prototype of a new part before going into production; but imagine also being able to make replacement parts, using old CAD data, for a pump that went out of production 50 years ago.
The title, as I’m sure you have already recognised, is from a well-known strand in schooldays humour. This particular example came from a medical student, and I’ll keep the punch line until later.
From the submicroscopic level to international social policy, medicine has become a domain dominated by scientific computing. Electronic data handling is centrally responsible for increases in reach, effectiveness and efficiency, though also for dramatic growth in the cost of delivering health programmes.
At one time there were only two types of scientist: experimentalists and theoreticians. Some argue that the use of computational models has led to the emergence of a third class, somewhere between the two. Peter Coveney was an early pioneer in using computational methods to connect the behaviour of matter at a small scale to the observed macroscopic properties of fluids. This set him on the road to simulations so large in scale that they can describe the behaviour of an entire system.
There have been massive changes in our market since that first issue. Over the following pages, we asked some of the leading names in scientific computing, in some cases the originators of software packages that have become essential to researchers around the world, exactly what those changes have been and what they have meant for the industry. We also asked them to reflect on the challenges faced back then compared with those we face now.
Felix Grant, SCW contributor
‘Medical researchers generally believe that for their studies to be credible they need a primary group of nearly 4,000 patients, and for validation they must replicate the work with roughly 20,000 patients. Where else but with a biobank can they find this number of samples?’ This summary of the need for biobanks comes from Professor Joyce Carlson, laboratory manager for clinical chemistry and pharmacology at University Hospital in Lund, and a member of the planning committee for the Swedish LifeGene national biobanking project.
In March 2008, after the exposure of an animal cruelty scandal at a California-based abattoir, the US Department of Agriculture (USDA) recalled 143 million pounds of beef from the market. This was the largest beef recall in US history and, although it drew hardly any media attention outside that country, it echoed food safety crises of earlier decades, including, in the UK, the BSE crisis of the 1990s.
Gemma Church finds out how astronomers are using simulations to investigate the extremities of our universe
Turning data into scientific insight is not a straightforward matter, writes Sophia Ktori
The Leibniz Supercomputing Centre (LRZ) is driving the development of new energy-efficient practices for HPC, as Robert Roe discovers
William Payne investigates the growing trend of using modular HPC, built on industry standard hardware and software, to support users across a range of both existing and emerging application areas
Robert Roe looks at developments in crash testing simulation – including larger, more intricate simulations, the use of optimisation software, and the development of new methodologies through collaboration between ISVs, commercial companies, and research organisations