Why supercomputing the stars needn’t cost the Earth

Astrophysicists at the University of Exeter are using an SGI Altix ICE 8200 supercomputer to simulate massive celestial processes - but at a tiny cost.

The system is being used to simulate the formation of bodies such as planets, stars and galaxies, with its 1280 cores able to handle the extra physics needed to accurately model such huge events – which take place over equally huge timescales.

Matthew Bate, professor of theoretical astrophysics at the University of Exeter, said: ‘There’s a lot of physics to put into one computer and what makes it even worse are the scales we have to deal with.’

Bate added: ‘We are talking about processes that take millions of years, could cover areas which are light years across and contain matter which is as heavy as hundreds, maybe thousands, of our Suns.’

The astrophysics group’s simulations had previously been run on the UK Astrophysical Fluids Facility (UKAFF), a shared resource that was no longer powerful enough for such large calculations, so the Exeter team decided to bring things in-house. Bate added: ‘We needed this one system to put in the extra physics and go up to larger scales, and the UKAFF could not support this anymore.’

The department’s supercomputing system has been built up over the past year. In June 2007 the university received a prototype 128-core SGI Altix ICE 8200 computer cluster, and was running benchmarks within 48 hours of the kit arriving and producing new science within a week.

In mid-September 2007 the prototype system was replaced by a 128-core production model: one IRU of 16 nodes, each with dual 2.33GHz Intel Clovertown quad-core processors and 8GB of RAM (128GB in total), plus 11TB of usable disk space.

And on 21 December 2007 the department got an early Christmas present, as the system was expanded with a further 1152 cores: nine more IRUs holding 144 nodes, each with 16GB of RAM and dual 2.83GHz Intel Harpertown quad-core processors.

The final system is one of the top five most powerful UK university supercomputers – with 1280 cores, 2.5TB of memory, a 14.2Tflop/s peak and disk storage of 30TB.
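As a rough sanity check (not part of the original article), the quoted 14.2Tflop/s peak is consistent with the core counts and clock speeds above, assuming the four double-precision floating-point operations per core per clock cycle that these Intel quad-core chips can issue:

    # Back-of-the-envelope check of the 14.2Tflop/s peak figure.
    # The 4 flops per core per cycle is an assumption about these Intel
    # quad-core processors, not a figure quoted in the article.
    FLOPS_PER_CYCLE = 4

    clovertown = 128 * 2.33e9 * FLOPS_PER_CYCLE    # 16 nodes x 8 cores at 2.33GHz
    harpertown = 1152 * 2.83e9 * FLOPS_PER_CYCLE   # 144 nodes x 8 cores at 2.83GHz

    peak_tflops = (clovertown + harpertown) / 1e12
    print(f"Theoretical peak: {peak_tflops:.1f} Tflop/s")   # prints roughly 14.2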

Two of the main cost-saving – and eco-friendly – aspects of the system are its intelligent cooling system and its diskless nodes, with the university estimating that it costs around £60,000 per year in electricity to run.

Speaking at an SGI roundtable event in London last week, John Masters, EMEA marketing director at SGI, said: ‘Green solutions and speed of implementation are two of the key things customers are now looking for.’

The water-cooling system uses free-cooling chillers which, when the external temperature is low enough, simply circulate the water rather than actively chilling it, cutting power consumption. The system also uses diskless nodes, which improve reliability and further reduce power costs.
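As an illustration only (the article does not describe the actual control logic, and the changeover temperature below is hypothetical), the free-cooling decision amounts to something like this:

    def cooling_mode(external_temp_c: float, free_cooling_limit_c: float = 12.0) -> str:
        """Illustrative free-cooling decision; the 12C limit is hypothetical.

        When the outside air is cold enough the chillers only circulate the
        water and let the environment carry the heat away; otherwise their
        compressors must run to chill the water actively, drawing more power.
        """
        if external_temp_c <= free_cooling_limit_c:
            return "circulate only (free cooling, low power)"
        return "actively chill (compressors on, higher power)"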

Professor Bate added: ‘Running costs needed to be as low as possible as we still do not know where the money will come from [to keep the supercomputer up and running and] it needed to be used effectively and efficiently in the eyes of the funding bodies.’

And there are other ways the supercomputer could be funded, such as a batch queuing system with larger jobs getting priority over smaller ones. ‘The idea is that you should want to use the whole machine or a large chunk of it,’ Bate added.
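As a sketch of that policy only (hypothetical code – the article does not say which scheduler or weighting the department uses), a queue that favours larger jobs could simply rank submissions by the fraction of the machine they request:

    from dataclasses import dataclass

    @dataclass
    class Job:
        name: str
        cores_requested: int

    def priority(job: Job, total_cores: int = 1280) -> float:
        """Priority grows with the share of the 1280-core machine a job asks for."""
        return job.cores_requested / total_cores

    # Hypothetical queue: the full-machine run goes to the front.
    queue = [Job("small-test", 64), Job("half-machine", 640), Job("full-machine-run", 1280)]
    for job in sorted(queue, key=priority, reverse=True):
        print(f"{job.name}: {job.cores_requested} cores requested")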

Queuing systems are not the only way to fund the supercomputer, as Bate added: ‘In the long term it may be subdivided so, if someone can bring in 30 to 40 per cent of the running costs, they would get 30 to 40 per cent of the machine.’

The astrophysics team is using the supercomputing system to simulate a host of celestial processes – including how stars, star clusters and planets form – and to model the transfer of radiation between bodies, helping to interpret observations from platforms such as the Hubble and Spitzer Space Telescopes and the Very Large Telescope, a ground-based observatory in Chile.

Images from simulations run by the UKAFF and the University of Exeter. On the left is a cloud and star cluster at the end of a simulation covering 266,000 years; some stars and brown dwarfs have been ejected to large distances from the regions of dense gas in which star formation occurs. In the right-hand image, stars and brown dwarfs fall together into a cluster; the objects range in mass from nearly the mass of the Sun down to as little as six times the mass of Jupiter. A star with an edge-on disc is ejected, centre left.

The team has to use simplified astrophysics codes to get around the large length and time scales of its simulations, but these simplifications can introduce inaccuracies into the results because certain effects, such as magnetic fields and radiation transport between bodies, are left out.

Over the last few years the Exeter astrophysics group has been developing techniques to add magnetic fields and radiation transport to its code, and it hopes to simulate celestial clouds containing hundreds or thousands of stars.

The Exeter astrophysics team is now adding the extra physics to its code and hoping to pip its competitors to the post, with astrophysics departments around the world all competing to be first to answer some of the biggest questions in physics.

Bate said: ‘We are all working on the same problems but it’s a matter of who can solve them the quickest [and] our new supercomputer should give us a bit more of an edge.’

The Altix system is expected to begin producing results in mid-2008, with the team still scaling up its code to run a single job across the whole system. The Altix ICE will also be used for condensed matter research in the physics department, and by the university’s Mathematics Research Institute.

And the future of HPC at Exeter won’t just be confined to the maths and physics labs, according to Bate, who hopes that having a powerful supercomputer will boost HPC research in biology, geography and engineering, and increase the astrophysics department’s collaboration with the UK Met Office.
