In the current global economic environment, where margins are scrutinised for every possible efficiency and cost saving, companies everywhere face pressure to reduce both their capital and operating costs. Companies that rely on laboratory data are looking for ways to make the most of the mountain of data they generate, and to turn it into useful, revenue-generating information.
In the early days of aerospace design, most CFD analysis was done with in-house codes, but the situation is changing. For example, Doug Ball, chief engineer of aero characteristics and flight performance at Boeing, estimates that these days about 20 per cent of his company's CFD work uses proprietary codes, 30 per cent NASA-originated codes and 50 per cent commercial software.
A universal tool
Hospital dramas are a staple of popular television, but although images of high-tech medicine form a backdrop to the human drama of staff and patients, no programme has yet featured a surgeon calling out: 'scalpel . . . swab . . . supercomputer'. Yet HPC is already playing a life-enhancing role for patients undergoing kidney dialysis, and promises to hasten the post-genomic era of individualised medicine.
Desktop HPC, personal supercomputers, personal workstations, whatever you call them, there’s no standard definition of what they are. But most suppliers generally agree on what they aren’t: standard office PCs. Simply having lots of cores just isn’t enough of a boost for the applications where desktop HPC is generally applied: typically in engineering design, analysis and visualisation.
Ever since the mid-1800s, when the first North American tycoons struck oil, petroleum has been a vital commodity in modern society. With concerns over dwindling supplies, coupled with an energy-hungry global population, demand for the fuel remains high.
'Wide is the ocean, sweet gravity...' While that refrain from Cerys Matthews' song 'Ocean' is intended as poetic metaphor, it is also apt from a scientific computing point of view.
Gemma Church finds out how astronomers are using simulations to investigate the extremities of our universe
Turning data into scientific insight is not a straightforward matter, writes Sophia Ktori
The Leibniz Supercomputing Centre (LRZ) is driving the development of new energy-efficient practices for HPC, as Robert Roe discovers
William Payne investigates the growing trend of using modular HPC, built on industry standard hardware and software, to support users across a range of both existing and emerging application areas
Robert Roe looks at developments in crash testing simulation – including larger, more intricate simulations, the use of optimisation software, and the development of new methodologies through collaboration between ISVs, commercial companies, and research organisations