FEATURE

Life, the universe and computing

Robert Roe reports from the SC17 conference keynote, detailing progress on the Square Kilometre Array project

The keynote at this year’s SC17 conference, the largest supercomputing conference in the USA, held in Denver, Colorado, covered the progress made by the Square Kilometre Array (SKA) project. The presentation highlighted the need to leverage HPC to solve the grand scientific challenges facing humanity.

On Tuesday 14 November, two researchers from the Square Kilometre Array project used the conference’s keynote address to describe the SKA’s international partnership, which will map and study the entire sky in greater detail than ever before.

Underpinning the world’s largest and most powerful array of radio telescopes is a computing system that can process and manage the huge amounts of data generated – approximately 10 petabytes per day.

Philip Diamond, director general of the SKA, and Rosie Bolton, SKA regional centre project scientist and project scientist for the international engineering consortium designing the project’s high-performance computing systems, took to the stage to highlight the SKA’s huge computation and data-processing requirements.

‘SKA is an exascale science project that will push forward the boundaries of scientific endeavour and engineering capability for decades to come,’ stated Bolton.

The presentation opened with a short video describing some of the ambitious scientific questions that could be answered with data gathered by the SKA. What are gravitational waves? How does magnetism work throughout the universe? How are planets formed? What is dark matter? What is our history?

While the SKA cannot solve all of these challenges alone, the data it provides will enable researchers to study events such as the formation of planets and galaxies far into the history of the universe.

‘We are building a time machine. We are going to look at what our surroundings were like almost at their inception,’ said Bolton.

History of the SKA

The first concepts for the SKA began in September 1993 when the International Union of Radio Science (URSI) established the Large Telescope Working Group to begin a worldwide effort to develop the scientific goals and technical specifications for a large radio observatory.

The working group provided a forum for discussing the technical research required and for mobilising a broad scientific community to cooperate in achieving this common goal.

The two keynote presenters explained that the initial concept for the Square Kilometre Array was for a so-called hydrogen array – a telescope sensitive enough to detect signals from the dark ages of the universe, 13 billion years ago.

In 1997, eight institutions from six countries (Australia, Canada, China, India, the Netherlands and the USA) signed an agreement to cooperate in a technology study programme leading to a future very large radio telescope. What followed was a long period of planning and of developing agreements between the contributing countries and organisations, leading to the creation of the SKA Organisation, a not-for-profit company established in December 2011 to formalise relationships between the international partners and lead the project.

‘Twenty-four years after the initial concept, the SKA is an international project funded by 10 countries, bringing together more than 100 engineers and scientists from 270 institutions across 20 countries,’ added Bolton.

A square kilometre of collecting area

The SKA project is extremely ambitious in scope as its aim is to create a radio telescope array with a collecting area of one square kilometre. This will make the SKA the largest radio telescope array ever constructed, by some margin.

To achieve this, the SKA will use thousands of dishes (high frequency) and many more low-frequency and mid-frequency aperture array telescopes.

Rather than being clustered only in the central core regions, the telescopes will be arranged in multiple spiral-arm configurations, with dishes extending vast distances from the central cores.

In this array, the physical distance between the telescopes is calculated precisely using the time difference between the arrival of radio signals at each receiver. Computers can then calculate how to combine these signals to synthesise the equivalent of a single dish as wide as the separation between the two telescopes.

Using interferometry techniques, researchers can emulate a telescope with a size equal to the maximum separation between the telescopes in the array, or a smaller instrument based on a section of the full array.

Ultimately this means that, rather than building one gigantic dish, the project can surpass the capabilities of a single huge dish through the flexibility that this interferometric configuration brings. The system can act either as one gigantic telescope, or as a combination of multiple smaller telescopes.
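The gain from long baselines can be sketched with the standard interferometry rule of thumb that angular resolution scales as wavelength divided by baseline. The figures below (a 21 cm hydrogen-line wavelength, a 150 km baseline, a 100 m single dish) are illustrative assumptions for this sketch, not SKA specifications:

```python
import math

def angular_resolution_arcsec(wavelength_m: float, baseline_m: float) -> float:
    """Approximate angular resolution of an interferometer or dish:
    theta ~ wavelength / baseline (radians), converted to arcseconds."""
    theta_rad = wavelength_m / baseline_m
    return math.degrees(theta_rad) * 3600

# Illustrative values only: the 21 cm neutral-hydrogen wavelength observed
# on a 150 km baseline, versus a hypothetical single 100 m dish.
long_baseline = angular_resolution_arcsec(0.21, 150_000)  # sub-arcsecond
single_dish = angular_resolution_arcsec(0.21, 100)        # several arcminutes

print(f"150 km baseline: {long_baseline:.2f} arcsec")
print(f"100 m dish:      {single_dish:.0f} arcsec")
```

The same relation explains why a sub-array (a section of the full array, with shorter maximum baselines) behaves like a smaller telescope.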

Combining all of these signals is an enormous task, and this is where HPC hardware is needed to process and analyse the huge amounts of data the SKA will generate once it becomes fully operational.

Bolton explained: ‘A typical SKA map is going to contain hundreds of thousands of radio sources. Our iterative calibration and imaging process will use dataflow programming with 400 million points on the graph.’

This huge amount of data requires large-scale computing infrastructure. ‘In total, the processing power we need in the SKA science and data processors is about 250 petaflops peak,’ said Bolton. The systems will also include around 80 petabytes of storage and require 0.5 to 1 terabyte per second of sustained write to storage, with sustained read rates approximately 10 times higher.
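To put that write rate in context, simple arithmetic (using only the figures quoted above) shows how much data a sustained 0.5 TB/s stream amounts to over a day – several times the roughly 10 petabytes per day arriving from the telescopes, which is consistent with the iterative calibration and imaging pipeline reading and writing intermediate products repeatedly:

```python
# Convert the quoted sustained write rate into a daily volume.
seconds_per_day = 86_400
write_tb_per_s = 0.5  # lower end of the quoted 0.5-1 TB/s range

tb_per_day = write_tb_per_s * seconds_per_day
pb_per_day = tb_per_day / 1000  # 1 PB = 1000 TB

print(f"{pb_per_day:.1f} PB written per day at {write_tb_per_s} TB/s sustained")
```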

‘The incoming data sets are about 10 petabytes and our output 3-D images are 50,000 pixels on each axis. That is 1,000 desktop hard drives, one petabyte, per 3-D image. This is real-time, responsive high-performance data analytics and, to make it harder, we need to operate day-in, day-out,’ concluded Bolton.
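The one-petabyte figure can be checked with back-of-the-envelope arithmetic. The eight bytes per voxel used here is an assumption made for illustration (a double-precision value), not a stated SKA number:

```python
# Sanity-check the quoted size of a 3-D image cube with
# 50,000 pixels on each axis.
pixels_per_axis = 50_000
bytes_per_voxel = 8  # assumed: one double-precision value per voxel

total_bytes = pixels_per_axis ** 3 * bytes_per_voxel
petabytes = total_bytes / 1e15

print(f"{petabytes:.2f} PB per image cube")  # 1.00 PB, matching the quote
```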

Once the project is fully operational, the combined power of the array underpinning the SKA will be able ‘to detect TV signals, if they exist, from the nearest tens, maybe 100, stars and will be able to detect airport radars across the entire galaxy,’ Bolton said in reply to a question at the end of the keynote presentation.

Data for new scientific breakthroughs

The SKA is a global science project that could help scientists and researchers answer key questions relating to the nature and history of the universe, the formation of galaxies and planets, and even explain the basis for magnetism, dark matter and dark energy. The SKA will also enable astronomers to produce 3-D maps of the universe on an unprecedented scale.

‘Magnetic fields play an important role throughout the universe, on scales as small as centimetres and as large as a billion light years. With the SKA we hope to address the challenge of how and when magnetic fields arose and grew to their current strength, and we will produce the first three-dimensional magnetic map of the universe,’ commented Bolton.

‘Dark matter and dark energy are ongoing huge mysteries and we plan to play a role in tackling them by studying galaxy evolution. Even in its deployment phase, the SKA will be able to map 10 million galaxies spanning eight billion years of evolution. Once the SKA is fully deployed we will conduct the biggest galaxy census ever contemplated, in 3-D, encompassing up to a billion individual galaxies and covering 12.5 billion years of cosmic history,’ Bolton added.

From this data, astronomers will be able to make the most precise determination yet of the properties of the dark energy driving the expansion of the universe. Without a project on the scale of the SKA this research would not be possible, as it allows astronomers to map galaxies much further away than previously possible.
