Society spoilt for supercomputing choice, Ovum says

Wolfram Research’s recent move into the cloud computing field and Microsoft’s Windows HPC Server 2008 release are indicators that it’s not just the scientific community that has ample opportunity to use HPC resources, according to David Mitchell, SVP of IT research at analyst house Ovum.

Using Nimbus Services and R Systems, Wolfram is creating a service that allows users of the Mathematica product line to tap into additional high-performance computing (HPC) services in the cloud. The models that Mathematica users create are typically very computationally intensive, taking hours or even days to execute, making them ideal consumers of cloud-based HPC services – provided they are constructed to take advantage of highly parallel and distributed computing facilities.

The introduction of cloud services by Wolfram is another demonstration that HPC has come of age, according to Mitchell, who adds: ‘This follows the wider push that Microsoft made into the HPC arena with the release of Microsoft Windows HPC Server 2008 in September 2008. Both provide an infrastructure to support HPC computing for package applications – Mathematica in the first case and applications like Excel and Sharepoint in the latter – plus more general-purpose applications that can be developed with Visual Studio.’

HPC was once available only to the scientific community and the ‘quants’ in the financial services industry. It required specific custom development to take advantage of multiple processors and multiple computers, utilising the Message Passing Interface (MPI) communications protocol as a software enabler. Facilities cost millions of dollars and were time consuming to create, many being funded by government research grants. Nowadays the supercomputer market is much more approachable, with the entry-level Cray CX-1, running Windows HPC Server 2008, having a retail price of $25,000, according to Ovum.
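For illustration only (this sketch is not from the article), the kind of data-parallel decomposition that MPI-style development enables can be approximated with Python’s standard multiprocessing module: a model run is split into independent chunks, each chunk executes on its own processor, and the partial results are combined at the end. The function names here are hypothetical stand-ins for a real workload.

```python
from multiprocessing import Pool

def simulate_chunk(seed):
    # Stand-in for one slice of a computationally intensive model run;
    # each chunk is independent, so chunks can run on separate processors
    total = 0
    for i in range(1000):
        total += (seed * 31 + i) % 97
    return total

def run_parallel(n_workers=4):
    # Distribute the chunks across worker processes, then combine
    # the partial results -- the same scatter/gather pattern that
    # MPI codes implement across multiple machines
    with Pool(n_workers) as pool:
        partials = pool.map(simulate_chunk, range(n_workers))
    return sum(partials)

if __name__ == "__main__":
    print(run_parallel())
```

Real HPC codes use MPI rather than a single-machine process pool, but the pattern is the same: the work must be structured into independent pieces before extra processors (or cloud HPC capacity) can help.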

Mitchell added: ‘HPC is also being driven by the increased volume of data becoming available and the desire from business to be able to generate insight from that data. Data is increasingly becoming available from core transactional systems, as well as the huge volumes of sensor-based computing such as satellite imagery, GPS data or RFID. Huge volumes of computing resources and sophisticated analysis tools are needed to try to extract business insight from these data sources.’

