
Why governments need HPC and HPC needs governments


HPC is important for economic development, according to several recent reports, and, Robert Roe reports, announcements about next-generation US HPC systems could be made at SC14

The US needs to invest in high-performance computing (HPC) so that its industry can stay competitive in global marketplaces, according to a report from the market consultancy Intersect360 Research, entitled ‘The Exascale Effect: the Benefits of Supercomputing Investment for U.S. Industry.’

The work was carried out on behalf of the US Council on Competitiveness, a non-partisan, non-governmental organisation, composed of corporate CEOs, university presidents, labour leaders and national laboratory directors.

Its aim is to ‘set an action agenda to drive US competitiveness, while generating innovative public policy solutions for a more prosperous America’.

The new analysis comes just two months after a Task Force on High Performance Computing reported to the US Department of Energy that data-centric computing, in conjunction with exascale, will be one of the most important requirements of HPC within the next 10 years.

Intersect’s report states: ‘High performance computing is inextricably linked to innovation, fuelling breakthroughs in science, engineering, and business. HPC is viewed as a cost-effective tool for speeding up the R&D process, and two-thirds of all US-based companies that use HPC say that “increasing performance of computational models is a matter of competitive survival.”’

The report identifies HPC as a key tool for the US to stay competitive in many industries, not just on an academic and government research level.

It is based on research conducted by Intersect360, consisting of 14 in-depth interviews with representatives of industrial HPC organisations and 101 responses to a comprehensive online survey of US-based companies that use HPC. Among the key findings was the statement that: ‘US industry representatives are confident that their organisations could consume up to 1,000-fold increases in computing capability and capacity in a relatively short amount of time.’

Increasing the computational power of today’s petaflop systems by several orders of magnitude may be the most direct route to added value for industry, but several technology roadblocks prevent current systems from simply being scaled up in the way HPC users work today.

Software scalability, or the ability to parallelise code across potentially tens of thousands of cores, is ‘the most significant limiting factor in achieving the next 10x improvements in performance, and it remains one of the most significant factors in reaching 1,000x,’ says the report.

Achieving this is a complex task, which can take several months on the petaflop systems of today. Earlier this year the simulation code ‘Alya’, developed at the Barcelona Supercomputing Center (BSC), was reported to have been scaled to more than 100,000 cores. The porting and optimisation of Alya took several months, but the researchers were able to achieve ‘more than 85 per cent parallel efficiency’ scaling the code to run on Blue Waters, a system based on the Cray XE6.
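A figure such as ‘more than 85 per cent parallel efficiency’ is typically derived by comparing the measured speedup against the ideal speedup for the added cores. A minimal sketch of that calculation, using made-up timings chosen only to illustrate the arithmetic (not measurements from the Alya runs):

```python
def parallel_efficiency(t_base, n_base, t_scaled, n_scaled):
    # Strong-scaling efficiency: measured speedup divided by ideal speedup.
    measured_speedup = t_base / t_scaled
    ideal_speedup = n_scaled / n_base
    return measured_speedup / ideal_speedup

# Hypothetical example: a run taking 100 s on 10,000 cores
# finishes in 11.7 s on 100,000 cores.
eff = parallel_efficiency(100.0, 10_000, 11.7, 100_000)
print(f"parallel efficiency: {eff:.1%}")  # about 85 per cent
```

An efficiency near 1.0 means the code is converting extra cores into proportionally shorter runtimes; values well below that indicate the scaling roadblocks the report describes.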

Whether it is for increasing the complexity of models for simulation-based industries or increasing the throughput of data, for the life sciences or a finance-centred business, software must be scaled effectively to make use of the cores as efficiently as possible.

But significant investment must be found if these technology barriers are to be overcome. Traditionally the largest investment in HPC has come from government research grants, for example from the US national labs. The report goes on to recommend that the ties between industry and the US government be strengthened to expedite the transition to exascale computing.

The report states that the in-depth qualitative answers indicated: ‘There is more work that can and should be done in order to provide more direct benefit to industry from government investment in scalability and expertise.’

Furthermore, 56 per cent of respondents to the Intersect360 survey agreed that the work done by national government research organisations can ‘act as a major driver for advancing HPC technology, leading to products and software that we will use in the future’.

The report adds to the growing chorus of voices promoting the benefits of HPC, and eventually exascale computing, as necessary to many different industries, and calling for government investment to facilitate the process. Announcements of the next-generation CORAL systems are expected later this year, possibly at the SC14 supercomputing conference in New Orleans in November.