HPC chips work behind the scenes
Originally, high-performance computers were housed in large, imposing supercomputing centres, such as the magnificent Torre Girona, a former chapel that houses the Barcelona Supercomputing Centre’s MareNostrum supercomputer.
More recently, however, it has become clear that, for some applications at least, supercomputers can be installed on site at individual institutions, and some companies are even producing multi-core systems that give scientists access to high processing power from their desktops.
Now, it seems that a new trend is emerging. BAE Systems has decided to integrate high-performance computing chips into its remote sensing satellites; and Diamond Light Source has installed a new high-performance computing facility to automate data processing from its synchrotron.
In the same way that conventional processors can be found in everyday appliances with little human interaction, HPC chips are now working behind the scenes to process data before it even reaches the eyes of the scientists conducting the research.
The chips used by BAE Systems employ multiple processing cores, working in parallel, to process the large volumes of data collected by the satellite at high speed.
The individual cores actually run at lower clock speeds than conventional chips, but together they deliver more processing power. This increases the energy efficiency of the chips and makes them easier to cool – two requirements that are essential for space applications. The chips were licensed from ClearSpeed Technology.
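A back-of-the-envelope sketch can show why this trade-off works: dynamic power in a CMOS core grows roughly with the cube of clock frequency (since supply voltage must rise with frequency), while throughput grows only linearly with frequency and with core count. The figures below are purely illustrative, not ClearSpeed's actual specifications:

```python
# Illustrative model (not ClearSpeed's real numbers): per-core dynamic power
# scales roughly as f^3 (P ∝ f·V², with voltage V rising with frequency f),
# while aggregate throughput scales as cores × f.

def throughput(cores: int, freq_ghz: float) -> float:
    """Aggregate throughput in arbitrary units (∝ cores × clock)."""
    return cores * freq_ghz

def power(cores: int, freq_ghz: float) -> float:
    """Total dynamic power in arbitrary units (per-core power ∝ f³)."""
    return cores * freq_ghz ** 3

# One conventional core at 3 GHz vs twelve slow cores at 0.25 GHz each:
fast = (1, 3.0)
slow = (12, 0.25)

assert throughput(*fast) == throughput(*slow)  # same aggregate throughput
print(power(*fast))   # 27.0 arbitrary power units
print(power(*slow))   # 0.1875 – far less power for the same work
```

Under this simple model, the twelve slow cores do the same work for less than one per cent of the power, which is why multi-core designs are attractive wherever power and cooling are constrained.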
'In the past 10 years power consumption doubled [in conventional chips], and they are very power hungry,' Tom Beese, CEO of ClearSpeed Technology, told scientific-computing.com. 'BAE needed very high powers for high volumes of data, with accurate processing and a very high power efficiency.'
Rather than supplying finished processors to BAE, ClearSpeed has licensed the designs, so that BAE can make the adjustments necessary for the space environment, which include large swings in temperature and intense radiation.
Beese says this is the first time ClearSpeed's processors have been embedded in an environment outside the normal computing space, which could signal that HPC processors will find new applications on board other systems operating in rugged conditions.
'The tech for HPC is reaching a broadening market,' he says. 'It could be embedded in specialist systems in medical imaging or sensing, where it isn't trying to be a computer.'
In a similar manner, Diamond Light Source has recently installed a high-performance computing cluster to automate a lot of the data processing from its synchrotron – a type of particle accelerator that produces very bright x-rays, used to study molecules in structural biology.
Visiting scientists use the site to produce diffraction patterns that provide information about the molecular structure of proteins. However, transforming this data into a 3D representation of the molecules requires a lot of processing. In addition, Diamond Light Source is constantly upgrading its synchrotron to produce brighter and brighter beams, which produce even more data that needs to be analysed.
In the past, this processing had to be arranged by the researchers themselves, often once they had left the facility. Now that Diamond Light Source has the HPC cluster, much of the processing can be done as the data is collected. The facility is currently working on software to automate this procedure, which it hopes to have in place by the end of the year.
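The idea behind such on-the-fly processing can be sketched briefly: each data frame is handed to a worker as soon as it arrives, so reduced results are ready while the experiment is still running. The frame format and the reduce step below are hypothetical placeholders, not Diamond Light Source's actual pipeline; a real facility would farm frames out to cluster nodes rather than local threads:

```python
# Minimal sketch of processing data as it is collected. reduce_frame and the
# simulated frame stream are placeholders for illustration only.
from concurrent.futures import ThreadPoolExecutor

def reduce_frame(frame: list) -> float:
    """Placeholder reduction step, e.g. integrating a diffraction frame."""
    return sum(frame) / len(frame)

def process_stream(frames, workers: int = 4):
    """Dispatch each incoming frame to the worker pool, yielding results in order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # pool.map consumes the stream lazily, overlapping processing with collection
        yield from pool.map(reduce_frame, frames)

# Simulated stream of three 4-pixel frames:
incoming = ([float(i + j) for j in range(4)] for i in range(3))
print(list(process_stream(incoming)))  # prints [1.5, 2.5, 3.5]
```

Because results stream back in order as frames arrive, a researcher could inspect early output and adjust the experiment mid-run, which is exactly the benefit Ashton describes below.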
'The computing's there, we just need time to integrate the software,' Alun Ashton, a data acquisition scientist at Diamond Light Source, told scientific-computing.com. 'The processing can be done a lot quicker, while the user's doing the experiment, so they can change the experiment and do something different if they need.'