FEATURE
HPC connects in Denver

Robert Roe reports back from SC17 conference on high performance computing

In November the supercomputing industry arrived in Denver, Colorado, for SC17, the largest US conference on high performance computing. The theme of this year's conference was 'HPC connects', and it was clear from the thousands of participants from more than 70 countries across the globe.

The theme also rings true of the challenges that HPC must solve over the coming years. From the keynote detailing plans for the Square Kilometre Array to exascale systems, the challenges facing scientists and researchers have grown to the point that many of them cannot be solved by a single researcher or even one organisation.

Driving forward scientific research and discovery increasingly requires collaboration between researchers, in just the same way that creating exascale supercomputers requires hardware and software providers to work closely together.

Before the keynote address, SC17 general chair Bernd Mohr took to the stage to highlight the importance of HPC in furthering scientific research.

‘High performance computing is at the forefront of a new type of gold rush, a rush to discovery using an ever growing flood of information and data,’ said Mohr.

‘Computing is now more essential to science and discovery than ever before. We are the modern pioneers pushing the boundary of science for the betterment of society,’ he added.

This year the conference saw more than 2,800 participants from 71 countries including 122 exhibitors. ‘153 student volunteers congregate from every continent around the globe except from Antarctica – and trust me we are working on that,’ commented Mohr.

For many years SC has encouraged student participation: each year the conference invites student volunteers, who get access to the conference and exhibition, and the event also hosts the Student Cluster Competition (SCC). The SCC partners HPC vendors with student teams of budding computer scientists.

While computer science curricula are now more likely to include HPC education, the original intent behind the SCC was to give students early exposure to HPC and to motivate undergraduate curriculum development. The competition also exposes teams to an HPC work environment, requiring teamwork and a broad understanding of systems, software and applications. At SC17 the teams demonstrated a wide variety of cluster configurations, from single-node GPU clusters to an eight-node system from the Illinois Institute of Technology and the University of Texas at Austin.

All participants opted for GPUs this year, with each team using either V100 or P100 GPUs from Nvidia. This year also saw the use of IBM Power8 processors, and the team from Illinois chose AMD CPUs for its system. Teams can use any configuration of hardware but are capped at 3,000 watts for the total system.

This year's winner was Nanyang Technological University, which achieved a Linpack score of 51.8 teraflops. The students opted for a two-node system with a total of 88 cores, combined with eight V100 GPUs and an EDR interconnect.
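As a rough illustration of what that Linpack score means, the sketch below compares it against a theoretical peak for the GPUs alone. The figures used for per-GPU peak performance are assumptions, not from the article (roughly 7 double-precision teraflops for a V100), and the CPU contribution is ignored, so this is a back-of-envelope estimate only.

```python
# Back-of-envelope estimate of Linpack efficiency for the winning system.
# Assumptions (hypothetical, not stated in the article):
#   - ~7.0 TFLOPS FP64 peak per V100 GPU
#   - CPU contribution to peak performance ignored
NUM_GPUS = 8
V100_FP64_TFLOPS = 7.0          # assumed per-GPU double-precision peak
linpack_tflops = 51.8           # reported Linpack (Rmax) score

peak_tflops = NUM_GPUS * V100_FP64_TFLOPS   # approximate Rpeak
efficiency = linpack_tflops / peak_tflops

print(f"Approximate theoretical peak: {peak_tflops:.1f} TFLOPS")
print(f"Approximate Linpack efficiency: {efficiency:.1%}")
```

Under these assumptions the achieved score works out to a little over 90 per cent of the GPUs' theoretical peak, which is why GPU-heavy configurations under a fixed 3,000-watt power cap have dominated recent competitions.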

The SCC is a breeding ground for the next generation of HPC experts, who will help to shape the future of HPC systems. True to the event's theme, HPC today is a truly global venture, requiring collaboration between researchers, scientists and students from across the globe.

‘This international shift is reflective of science and research. We are no longer lone scientists from our home organisations. Our community now runs experiments with massive global collaborations,’ said Mohr. ‘This paradigm requires us to bring together geographically distributed computer systems, scientific instruments and the brilliant minds of our community in an unprecedented way.’ 
