Cray optimistic though profits dip

Cray was bullish in its presentation of financial results for the second quarter, following a series of recent contracts awarded to the company.

Revenue for the quarter was $85.1 million, up $0.6 million from $84.5 million in the prior-year period. However, revenue did not translate into profits: the company reported a net loss of $6.7 million, compared with a net loss of $0.2 million for the same period in 2013.

The company pointed out, however, that the nature of its business – large contracts that can span multiple years – can skew quarterly results, and it expects much stronger performance in the fourth quarter.

Peter Ungaro, president and CEO of Cray, said: ‘Over the last few months, we’ve been on an incredible run of customer wins in both the US and around the world. These awards, some of them multi-year in nature, reaffirm our belief that we’re in a great position to continue to grow, not only in 2014 but also over the coming years.’

Two such contracts were awarded to Cray in July. The first was a $174 million deal to provide the US National Nuclear Security Administration (NNSA) with a Cray XC supercomputer and a Cray Sonexion storage system – one of the biggest contracts ever awarded to the company. The second was a contract to provide the Tata Institute of Fundamental Research (TIFR) in Mumbai with a Cray XC30 supercomputer – the first Cray system to be deployed in India.

The system, named ‘Trinity’ by the NNSA, is a joint effort between the New Mexico Alliance for Computing at Extreme Scale (ACES) at the Los Alamos National Laboratory and Sandia National Laboratories, as part of the NNSA Advanced Simulation and Computing Program (ASC). The new Cray system will be used to ensure the safety, security and effectiveness of the United States’ nuclear stockpile – using advanced simulation to model the way in which the stockpile is ageing.

The Cray XC30 system at TIFR will be used by a nationwide consortium of scientists called the Indian Lattice Gauge Theory Initiative (ILGTI). The group researches the properties of a phase of matter called the quark-gluon plasma, which existed when the universe was approximately a microsecond old. ILGTI also carries out research on exotic and heavy-flavour hadrons, which will be produced in hadron collider experiments. The Cray XC30 will be the first supercomputer located in ILGTI’s new facility in Hyderabad.

Andrew Wyatt, Cray vice president for APMEA, said: ‘The researchers and scientists at TIFR are running highly complex Lattice QCD workloads, and we are honoured that India’s first Cray XC30 supercomputer will power the Institute’s important and challenging research. TIFR’s work with theoretical physics and quantum chromodynamics is an ideal fit for the Cray XC30 system, which is designed to execute highly advanced numerical computations with superior scalability, performance and reliability.’

In June the company was awarded a $54 million contract to provide the Korea Meteorological Administration (KMA) with two next-generation Cray XC supercomputers and a Cray Sonexion storage system. In this case the Cray hardware will be used to provide more accurate weather forecasts through increased model resolution, new forecasting models, increased ensemble sizes, and the implementation of advanced data assimilation.

Kyung Heoun Lee, director of the National Center for Meteorological Supercomputing at KMA, said: ‘KMA is responsible for providing services for the protection of life and property in the form of weather and climate information. We live in a weather-sensitive environment, and people and businesses increasingly rely on us for accurate environmental forecasting. Our new Cray supercomputers will be a valuable resource for us to meet our strategic, operational and research objectives.’

Another announcement came in May, when the Center for Computational Sciences (CCS) at the University of Tsukuba in Japan picked Cray to provide a new supercomputer for the centre. With the new Cray CS300 system, named ‘COMA (PACS-IX)’ – which stands for Cluster Of Many-core Architecture processors – joining the previously announced Highly Accelerated Parallel Advanced system for Computational Science (HA-PACS), the university now has two petascale Cray cluster supercomputers.

‘At this moment, COMA is the largest cluster system in Japan to employ Intel Xeon Phi coprocessors,’ said Professor Taisuke Boku, Chair of the Administrative Committee for Computer Systems at the Center for Computational Sciences at the University of Tsukuba. ‘We are focused on accelerated computing technologies for scientific computing. It is quite interesting to now have the ability to research and compare the performance characteristics of two different types of accelerators -- the GPUs on our HA-PACS system and the Intel Xeon Phi coprocessors on our COMA system.’

The NNSA system is not expected to be completed until 2015/16, and the KMA system is similarly expected to be finalised and deployed in 2015. With these long-term contracts in place, Cray is understandably optimistic about its growth in the coming years. Ungaro said: ‘We believe we’re in a great position to capitalise on this exciting market evolution and to continue to build on our market leadership.’

