Supercomputer creates 3-D Arctic maps

The ArcticDEM project has released newly completed 3-D topographical models of Alaska – as part of a project to create the first high-resolution, high-quality and openly distributed set of digital elevation models (DEM) of all arctic landmasses above 60 degrees north.

The DEM maps were created in part using the Blue Waters supercomputer, housed at the University of Illinois’ National Center for Supercomputing Applications (NCSA).

‘We at Blue Waters are excited to expedite this analysis by providing Blue Waters’ unique mix of a balanced, capable system and flexible support services to the PGC team. It is a great example of what Blue Waters does to make grand challenge science possible on the order of a year rather than a decade,’ said Bill Kramer, Blue Waters director and principal investigator at NCSA.

The ArcticDEM project is a White House initiative to inform better decision-making in the Arctic. The project is being carried out by the National Geospatial-Intelligence Agency (NGA), the National Science Foundation (NSF), and the University of Minnesota’s Polar Geospatial Center (PGC), who are collaborating to use high-resolution satellite imagery from DigitalGlobe.

The data is then processed using the NSF-supported Blue Waters supercomputer and the Extreme Science and Engineering Discovery Environment (XSEDE), with the aim of producing publicly available DEMs of the entire Arctic by the end of 2016.

Paul Morin, head of the University of Minnesota’s PGC, and his ArcticDEM project collaborators from Ohio State and Cornell universities, are creating high-resolution maps that will give scientists access to much more accurate data. Current elevation models for the Arctic have a resolution of one kilometre – Morin’s models offer a resolution of five metres or less, and are much more accurate at gauging height.

The ArcticDEM project feeds the data into Blue Waters to create the digital elevation models, using software tools developed at Ohio State University to automate the process. The software takes parts of two images, predicts where each pixel in one image lies in the other, manipulates the images to minimise potential errors, and then creates a new image, measuring height for each two-by-two-metre cell – 20 trillion cells across the Arctic.
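The article does not detail the Ohio State software itself, but the pixel-matching step it describes is essentially stereo correspondence: find the horizontal shift (disparity) at which a patch from one satellite image best matches the other, then convert that parallax into a height. A minimal toy sketch of that idea, using normalized cross-correlation and an entirely hypothetical viewing geometry (the real pipeline is far more sophisticated), might look like this:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equal-size patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def match_disparity(left, right, row, col, win=3, max_disp=8):
    """Exhaustively search for the horizontal shift of the patch
    around (row, col) in `left` that best matches `right`."""
    patch = left[row - win:row + win + 1, col - win:col + win + 1]
    best_d, best_score = 0, -1.0
    for d in range(max_disp + 1):
        c = col - d
        if c - win < 0:
            break
        cand = right[row - win:row + win + 1, c - win:c + win + 1]
        score = ncc(patch, cand)
        if score > best_score:
            best_d, best_score = d, score
    return best_d

def disparity_to_height(disparity, baseline, altitude, gsd):
    """Toy parallax-to-height conversion under an idealised geometry:
    height ~ disparity * ground-sample-distance * altitude / baseline.
    All parameters here are illustrative, not DigitalGlobe values."""
    return disparity * gsd * altitude / baseline

# Synthetic demo: the "right" image is the "left" image shifted 3 px.
rng = np.random.default_rng(0)
left = rng.random((32, 32))
right = np.roll(left, -3, axis=1)
d = match_disparity(left, right, row=16, col=16)
h = disparity_to_height(d, baseline=200_000, altitude=500_000, gsd=0.5)
```

Running this kind of search over every cell is what makes the problem a supercomputing one: at two-metre posting, the Arctic works out to the roughly 20 trillion cells the article cites.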

‘We can’t live without Blue Waters now,’ said Morin. ‘The supercomputer itself, the tools the Blue Waters team at NCSA developed, the techniques they’ve come up with in using this hardware. Blue Waters is changing the way digital terrain is made and that is changing how science is done in the Arctic.’

The production of these Arctic DEMs could transform the Arctic research community, as they provide time-stamped observations of ice extent and ice surface height that can be examined within the context of changing environmental factors. This enables researchers to study the evolution of surface water flows on glaciers down to the level of individual lakes and streams. The resulting research will affect not only scientists’ ability to track ice loss, but also the study of ecological conditions in Arctic ecosystems, including wildlife management and sustainability.
