June/July 2010

FEATURE

A different approach to data

In technical computing, two converging trends are driving an increase in data intensity: firstly, the sheer volume of data being collected through the mass deployment of sensor networks and analytical equipment; and secondly, the growing size and complexity of simulations across a range of scientific disciplines.

FEATURE

Turbulent times

When you study aerodynamics, or indeed any fluid flow, in most cases you’re actually studying turbulence. Laminar (‘smooth’) flow is not terribly interesting in the real world, whereas turbulent flow occurs nearly everywhere in nature: in rivers, in the oceans, around our cars and airplanes, even in stars and galaxies.

FEATURE

The next generation

The University of Southampton has an excellent pedigree in HPC and has been operating supercomputers for the past three decades. Its latest machine, Iridis 3, the third installation of the Iridis cluster, was officially launched in February 2010 and provides a substantial improvement in performance, giving researchers 20 times more computational power to play with than its predecessor.

FEATURE

Looking for the silver lining

Cloud computing means many different things to many different people, with so-called ‘cloud’ offerings ranging from simple online storage services to a handful of providers of supercomputing-as-a-service, with many variations in between. While the space is poorly defined, cloud-based services are rapidly becoming an established part of the HPC landscape for a broad spectrum of users, from science and engineering to big business.

FEATURE

Super seismic

Drilling for oil is a tough business, and it’s only going to get tougher as global reserves begin to dwindle. Only a fraction of a per cent of the planet’s crust contains oil and, for any given reserve, only 10 to 60 per cent will be recoverable. It’s important for the corporate giants producing oil to open new wells to replace those that run dry, and so promising areas of land and sea are constantly being probed to locate new supplies.

FEATURE

Informatics is just the tonic

The pharmaceutical industry has always been a competitive and fast-changing field. Drug discovery, traditionally based on small-molecule chemistry (a classic drug such as aspirin is composed of 21 atoms), now has to deal with the much larger proteins produced by biotechnology. Monoclonal antibodies, which fall under the classification of biologics, can be in the region of 20,000 atoms, roughly three orders of magnitude larger. Laboratory practices are changing too, with the move from paper to electronic records.

FEATURE

Only connect

Once upon a time I spent several weeks, completely alone (long story; don’t ask), on top of a mesa with a precarious microecology, surrounded by desert. My only company was the local wildlife: mostly lizards, small rodents, and an unlikely colony of feral cats at the top of the food chain.

Feature

Gemma Church finds out how astronomers are using simulations to investigate the extremities of our universe

Feature

Turning data into scientific insight is not a straightforward matter, writes Sophia Ktori

Feature

The Leibniz Supercomputing Centre (LRZ) is driving the development of new energy-efficient practices for HPC, as Robert Roe discovers

Feature

William Payne investigates the growing trend of using modular HPC, built on industry-standard hardware and software, to support users across a range of both existing and emerging application areas

Feature

Robert Roe looks at developments in crash testing simulation – including larger, more intricate simulations, the use of optimisation software, and the development of new methodologies through collaboration between ISVs, commercial companies, and research organisations