PRESS RELEASE

Cray framework for Hadoop

Cray has announced a new framework designed for ‘big data’ that lets Cray customers deploy and run Apache Hadoop easily on their Cray XC30 supercomputers.

The Cray Framework for Hadoop package includes documented best practices and performance enhancements designed to optimise Hadoop for the Cray XC30 line of supercomputers. It is aimed at giving users the utility of the Java-based MapReduce Hadoop programming model on the Cray XC30 system, complementing the HPC-optimised languages and tools of the Cray Programming Environment.

The initial release of the Cray Framework for Hadoop and an optimised Cray Performance Pack for Hadoop will be available as free downloads, and include validated and documented Apache Hadoop configurations. The performance pack includes a Lustre-Aware Shuffle to optimise Hadoop performance on the Cray XC30 supercomputer.
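Cray has not published the internals of its Lustre-Aware Shuffle, but the general idea of running MapReduce jobs against a shared Lustre file system rather than HDFS can be illustrated with standard Hadoop APIs. The minimal word-count sketch below assumes a POSIX Lustre mount at the hypothetical path /lus/scratch and simply points the job's input, output, and default file system at file:// URIs; it is a generic illustration of this approach, not Cray's implementation or configuration.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Word-count job that reads and writes a POSIX-mounted Lustre file system
// (via file:// URIs) instead of HDFS. The /lus/scratch mount point is a
// placeholder, not a documented Cray path.
public class LustreWordCount {

  public static class TokenMapper
      extends Mapper<LongWritable, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer tokens = new StringTokenizer(value.toString());
      while (tokens.hasMoreTokens()) {
        word.set(tokens.nextToken());
        context.write(word, ONE);
      }
    }
  }

  public static class SumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable v : values) {
        sum += v.get();
      }
      context.write(key, new IntWritable(sum));
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // Use the local (POSIX) file system as the default instead of HDFS;
    // on a Cray XC30 this would be the shared Lustre mount.
    conf.set("fs.defaultFS", "file:///");

    Job job = Job.getInstance(conf, "lustre-wordcount");
    job.setJarByClass(LustreWordCount.class);
    job.setMapperClass(TokenMapper.class);
    job.setCombinerClass(SumReducer.class);
    job.setReducerClass(SumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);

    // Input and output live directly on the shared Lustre file system.
    FileInputFormat.addInputPath(job, new Path("file:///lus/scratch/wordcount/input"));
    FileOutputFormat.setOutputPath(job, new Path("file:///lus/scratch/wordcount/output"));

    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}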

Further enhancements to the performance pack, which will include a native Lustre file system library and a plug-in to further accelerate Hadoop performance using the Aries system interconnect, will be available in the first half of 2014.
