
SC14 Test of Time award goes to Sandia researchers

With SC14 fast approaching, the awards committee of the Supercomputing Conference has announced the recipients of the second annual ‘Test of Time’ award, which recognises a technical paper that has had a ‘wide and lasting impact’ in supercomputing.

This year the award has been won by Sandia National Laboratories researchers Bruce Hendrickson and Rob Leland.

Hendrickson and Leland’s 1995 paper, ‘A Multilevel Algorithm for Partitioning Graphs,’ has been cited approximately 1,000 times, according to Google Scholar. The paper was credited with laying the groundwork for efficient graph partitioning — a method of dividing a large, complex problem among the processors of a parallel computer by creating successively smaller graphs that preserve the essential information of the original problem.

‘For your research community to say you’ve done something of lasting value is deeply meaningful to any scientist,’ said Hendrickson, now senior manager for extreme scale computing at Sandia.

Leland, director of scientific computing at Sandia, said he shared Hendrickson’s sentiment. ‘When you publish something, you don’t really know where it will go. It’s a nice thing that SC is doing, looking back over history to see how things actually worked out.’

The paper is considered one of the most influential contributions to high-performance computing. Its basic concept remains in wide use for solving problems involving large data sets that cannot be described with equations, such as keeping track of the country’s medical records or shipping containers voyaging around the world.

The core idea involved building successively smaller graphs by approximation, performing the required task on the small version, and then unfolding the result to its original size.

‘When you look at a graph, you can weight its vertices [representing objects, not angles] by their significance and their edges by how closely they are connected,’ said Leland. ‘You shrink the heavily weighted edges together, combine the vertices and achieve an accurate but simpler version of the initial, more complex problem. You just keep doing that, shrinking the problem but preserving the information, until the problem reaches a workable size.’
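The mechanics Leland describes can be sketched in a few lines of code. The snippet below is a minimal illustration of a single coarsening pass: it contracts a matching of the most heavily weighted edges, summing vertex and edge weights so that the smaller graph preserves the structure of the original. The graph representation and the function name `coarsen` are assumptions made for this example; the paper’s actual algorithm and the Chaco code differ in detail.

```python
def coarsen(edges, vweight):
    """One coarsening pass via heavy-edge matching (illustrative sketch).

    edges:   {(u, v): weight} with u < v
    vweight: {vertex: weight}
    Returns (coarse_edges, coarse_vweight, rep), where rep maps each
    fine vertex to its representative in the coarse graph.
    """
    rep = {v: v for v in vweight}
    matched = set()
    # Contract the heaviest edges first, matching each vertex at most once.
    for (u, v), w in sorted(edges.items(), key=lambda e: -e[1]):
        if u not in matched and v not in matched:
            matched.update((u, v))
            rep[v] = u  # merge v into u
    # Combine the vertex weights of each merged pair.
    coarse_vweight = {}
    for v, w in vweight.items():
        r = rep[v]
        coarse_vweight[r] = coarse_vweight.get(r, 0) + w
    # Re-route edges to representatives, accumulating parallel edges.
    coarse_edges = {}
    for (u, v), w in edges.items():
        ru, rv = rep[u], rep[v]
        if ru == rv:
            continue  # edge collapsed inside a merged pair
        key = (min(ru, rv), max(ru, rv))
        coarse_edges[key] = coarse_edges.get(key, 0) + w
    return coarse_edges, coarse_vweight, rep

# Example: a four-vertex path whose heavy middle edge is contracted first.
edges = {(0, 1): 1, (1, 2): 5, (2, 3): 1}
vweight = {0: 1, 1: 1, 2: 1, 3: 1}
print(coarsen(edges, vweight))
# -> ({(0, 1): 1, (1, 3): 1}, {0: 1, 1: 2, 3: 1}, {0: 0, 1: 1, 2: 1, 3: 3})
```

Repeating such a pass yields the hierarchy of shrinking graphs the researchers describe; a partition computed on the smallest graph is then projected back up through the `rep` mappings, with local refinement applied at each level in the paper’s full method.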

The researchers wrote in the paper’s abstract in 1995: ‘Experiments indicate that, relative to other advanced methods, the multilevel algorithm produces high-quality partitions at low cost.’

That basic approach became the dominant paradigm for a range of graph problems when it was implemented in an open-source Sandia code called Chaco, which magnified the impact of the algorithm.

‘In my view,’ he said, ‘this paper was an important milepost in the still-growing relationship between computational science — historically driven by engineers and physicists — and theoretical computer science, in which algorithmic ideas are essential to make effective use of parallel computers. As we look forward, even more sophisticated graph algorithms will be needed to mediate between the architectural details of these increasingly complex machines and the engineers and scientists who want to run simulations on them.’

Leland and Hendrickson have been invited to speak at SC14 in New Orleans in November.
