How do we make exascale happen?
Prompted by the International Supercomputing Conference (ISC’14) held in Leipzig at the end of June, Tom Wilkie takes a look at how Governments drive the development of exascale – without explicitly admitting that they are subsidising industry.
Everyone in the high-performance computing industry agrees that the next goal is to reach exascale – not so much because it will be nice to have a machine capable of a million, million, million floating point operations per second (although it will!), but because the technologies that have to be developed to make exascale happen will make petaflop computing (a thousand million, million flops) cheap and widely available. And that will be a shot in the arm for industrial development and economic growth.
As Bill Harrod, director of Advanced Scientific Computing Research at the US Department of Energy (DoE) and thus the man primarily responsible for driving forward the US exascale programme, told the International Supercomputing Conference (ISC’14) in Leipzig at the end of June: ‘The people who have to design new products, or map hurricanes and tsunamis, know that high-performance computing delivers a tremendous benefit.’ Or in the words of Intel’s marketing slogan at last year’s conference (ISC’13): ‘To compete, you must compute’.
France has just drawn up its own national plan for high-performance computing, a major goal of which is to ensure that digital simulation is available to small and medium-sized companies, and not just to larger ones, as reported here.
But just how do we make exascale happen? Not so much the technical specifications – the technological solutions that will actually make the systems work – but rather how do we create the economic conditions that will persuade commercial companies that there is a profit to be made for their shareholders in developing these technologies and incorporating them in their products? At ISC’14, some of the answers began to emerge and the extent to which Governments are prepared to act as midwife to exascale might come as a surprise to some observers.
After all, the West lives in a post-Reaganite/post-Thatcherite economy where Government and its taxes are – allegedly – a burden on the backs of enterprising companies, so that the most economically efficient role of Government is to retreat from any intervention in the operation of market forces, to lower taxes on business, and to give entrepreneurs free rein. It is this logic that last year led the US Congress to ‘sequester’ the US Government Budget, cutting public spending by a total of approximately $1.1 trillion over the period from 2013 to 2021.
In public, the US still subscribes to Ronald Reagan’s sardonic view: ‘The most terrifying words in the English language are: “I'm from the government and I'm here to help”.’ But regardless of this public rhetoric, the United States Government is subsidising the development of next-generation supercomputers – as it has done consistently over the past 50 years, since modern supercomputers first emerged in a recognisable form (although the term ‘supercomputer’ itself is of later coinage). And in the run-up to exascale, other Governments are following suit.
There are three ways in which Governments are forcing the pace of technological development. One is by international research cooperation – usually on projects that do not have an immediate commercial product as their end-goal. A second is by funding commercial companies to conduct technological research – and thus subsidising, at taxpayers’ expense, the creation or strengthening of technical expertise within commercial companies. The third is subsidy by the back door, through military and civil procurement contracts.
Aside from the Chinese, the Europeans are perhaps the most explicit in using Government funds to direct commercial activity (dirigisme is, after all, a French word that the English language has found convenient to borrow). The European Union’s Horizon 2020 research and development programme specifically envisages ‘public-private partnerships’ in setting up centres of excellence in high-performance computing, under the aegis of the industry-dominated European Technology Platform for HPC. The efforts to help Europe achieve leadership in high-performance computing as a result of tripartite cooperation between industry, academia, and the European Commission were discussed at PRACEdays14, in Barcelona at the end of May, and reported here.
In an interview with Scientific Computing World during ISC’14, Pete Beckman, Director of the Exascale Technology and Computing Institute at the Argonne National Laboratory, provided a US perspective on the European programme: ‘Horizon 2020 is very exciting and I’d love to collaborate, but we have to define where we can collaborate and where we should compete.’ Care had to be taken, he said, because the central focus of the European programme was so very much on industry building.
History showed that the French Government, in particular, was very directly supportive of its industry, whereas in the USA the model has been for the Department of Energy to fund basic research – the science. However, US Government policy certainly did not exclude placing contracts directly with commercial companies. ‘If we have to pay the vendors to get the research done, then we will pay them,’ he said.
A by-product of getting the research done is, of course, that the vendors strengthen their in-house technical expertise – a similar outcome to the one that the Europeans hope to attain explicitly through the creation of their centres of excellence, albeit achieved by a more indirect route.