
Contrary bets on a computational future

Who wants to bet on the pace of progress in high-performance computing? Two of the leading experts on the subject have placed friendly but contrary bets over when the technology will reach exascale, the next big step, with computational speeds more than a hundred times that of the K computer from Japan, which has just been named the fastest in the world.

At the International Supercomputing Conference (ISC) in Hamburg on 20 June, Horst Simon and Thomas Lippert doubled the stakes of a bet they had first made last year. Simon, from the Lawrence Berkeley National Laboratory in the USA, had staked $1,000 that we would not see an exascale machine by 2019, the widely expected time for the first such computer to be operational. At the ISC, Lippert, from the Juelich Supercomputer Centre in Germany, was so confident that the technology would be available on that timescale that he offered to double the stakes. If he is wrong, he now stands to lose €2,000.

Dr Simon reviewed US efforts to develop new supercomputing technologies. Inevitably, the military was both interested in and had funding for development, with programmes run by the Department of Defense and by DARPA. The US National Science Foundation was also active in pushing the boundaries of high-performance computing. But despite the multiple independent programmes mounted by different agencies of the US Government, Dr Simon drew a pessimistic conclusion. 'It's going to be very difficult to get to exaflop,' he warned. He felt that it was difficult to coordinate the different US Government projects and that budgetary constraints would hamper US efforts. Although the Obama Administration earmarked US $126M earlier this year for advanced computing, Dr Simon remarked that there was not enough budgetary room to develop the technologies required to reach exascale by 2019. The ethos seemed to be that 'if you want to do something new, then you have to kill something old,' he said. 'Everyone is talking about exascale but nobody is doing anything about it.'

The US was nevertheless pushing ahead with petaflop initiatives and expected to have at least three machines working at speeds higher than 10 petaflops next year. The first to come online would be an IBM machine called Blue Waters, developed with DARPA funding. This would be followed by the Titan machine at Oak Ridge, a Cray computer with GPU acceleration. A Sequoia computer, expected to achieve 20 petaflops, was also due to be operational at Lawrence Livermore by 2012.

Professor Lippert, from Juelich, was more sanguine as he discussed the European Union's calls for proposals to move the EU towards exascale. 'Exascale has full community support,' he said. 'It offers opportunities for science and technology. But co-design is critical.' Three advanced computing projects had just been funded, he said: Mont Blanc, Cresta, and Deep. Combined with the European Exascale Software Initiative (EESI) and the technology infrastructure already being built by the PRACE project, he felt confident that a platform was being put in place that could lead to an exascale machine by 2019.