Running a railway – lessons for supercomputing?

By the end of March, a thousand-strong ‘orange army’ – so called because of the distinctive colour of their weather-proof safety clothing – should have re-opened the train line that hugs the Devon coast at Dawlish, portions of which were washed into the sea by the storms that hit the UK in February. They have been working night and day to repair one of the most beautiful stretches of railway line in the world, designed by Isambard Kingdom Brunel and opened in 1846. But the motivation is not just aesthetic: this mainline track connects London and the far south-west of Britain and remains vital to the economy of the nation.

One of the most important uses of supercomputers is in modelling climate change and weather forecasting. The programme of ISC’14 includes presentations from Oliver Fuhrer, the HPC lead for Swiss weather modelling, on ‘Predicting Extreme Weather’ and from George Mozdzynski at the European Centre for Medium-Range Weather Forecasts on the challenges of getting its weather forecast model to exascale.

But that twisted and bent railway line has a metaphorical significance for high-performance computing. More than 160 years after it opened, it still performs a vital economic function. The current time-horizon of HPC is not much more than five to ten years, with exascale as the goal. Can we ensure that supercomputing embeds itself so deeply into the productive economy as to be indispensable a century from now? And will it be regarded as a thing of beauty, rather than just a technical artefact? Is exascale the only destination?

Modern science would be impossible without powerful supercomputers, both for modelling and simulation and for processing the data generated in huge quantities by the international collaborations that carry out projects in particle physics and astronomy.

Applications of HPC to physics and astronomy are strikingly absent from the ISC’14 programme published so far. Instead, there is a great emphasis on biology and medicine.

Exascale machines will be expensive to build and to run – the energy constraint is well known, and there are ingenious efforts to overcome this obstacle. Alex Ramirez of the Barcelona Supercomputing Centre will be considering imaginative hardware options to circumvent the problem, while Robert Wisniewski from Intel will be looking at ‘Advancing HPC software from today through exascale and beyond’. To national laboratories and Government-funded researchers, this expense will be a constraint, but ultimately, if the research needs the compute power, then money will be found for the electricity bill.

However, that railway line was built not by Government but by a commercial company seeking to profit from offering a service to users of its technology. And the existence of that technology encouraged new industries to spring up, and allowed existing ones to expand.

Will exascale machines, so expensive to build and to run, have the same appeal to a wide range of industries in the 21st century? Or will it be only the big multinational corporations, which can afford huge capital outlays, that use this technology? At first sight, indications that this might be the case are visible in the ISC’14 programme: Tate Cantrell, CTO of Verne Global, and Susanne Obermeier, Global Data Centre Manager at BMW, will explain why BMW moved its HPC applications to a data centre in Iceland. There is also a session on the ‘Real-life value of HPC’, but it too concentrates on larger-scale applications, not on the widespread diffusion of HPC out into the wider community.

However, the two-day track on ‘Industrial innovation through HPC’ takes up the challenge of setting out how computer simulations and digital modelling using high-end computing and storage can boost industrial companies’ productivity and competitiveness in the global market. It is explicitly intended to help engineers, manufacturers, and designers understand which tools and methods would help them solve their problems using HPC.

The organisers of this session acknowledge that, in the past, the design of HPC clusters was driven by considerations of the technology itself: CPU, interconnect, and network. Nowadays, for clusters to be useful to a wider range of users, it is necessary to understand the applications that will run on a cluster just as much as its ‘infrastructure’ technology.

The train-spotter analogy applies here too: after all, railway lines carry commuter and freight traffic, not just Inter-City Express trains. Speed was not all-important, even in the application of 19th-century technology.

But of beauty, there is as yet no mention in the programme of ISC’14. The European Laboratory for Particle Physics, Cern, near Geneva, has for some years now operated a cultural policy for engaging with the arts. The laboratory believes that ‘particle physics and the arts are inextricably linked: both are ways to explore our existence – what it is to be human and our place in the universe. The two fields are natural creative partners for innovation in the 21st century.’

Perhaps supercomputer centres should also open their doors to an ‘artist in residence’, whose work might grace future meetings of the ISC?



ISC’14: 22-26 June 2014, Leipzig, Germany


www.isc-events.com/isc14
