NEWS

NOAA turns to Dell for supercomputer upgrade

Dell will be the vendor behind the latest supercomputer upgrade at the US National Oceanic and Atmospheric Administration (NOAA), part of the US Department of Commerce. The upgrade will deliver additional computational power when two Dell systems are added alongside existing IBM and Cray systems at data centres in Reston, Virginia, and Orlando, Florida, later this month.

Once the upgrade is finalised, NOAA’s combined weather and climate supercomputing system will be among the 30 fastest in the world, able to process 8 quadrillion calculations per second.

‘NOAA’s supercomputers play a vital role in monitoring numerous weather events from blizzards to hurricanes,’ said Secretary of Commerce Wilbur Ross. ‘These latest updates will further enhance NOAA’s abilities to predict and warn American communities of destructive weather.’

This upgrade completes phase three of a multiyear effort to build more powerful supercomputers that make complex calculations faster to improve weather, water and climate forecast models. It adds 2.8 petaflops of speed at both data centres combined, increasing NOAA’s total operational computing speed to 8.4 petaflops — or 4.2 petaflops per site.
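As a quick sanity check of the figures quoted above, the arithmetic works out as follows (a minimal sketch; the variable names are illustrative, not NOAA's):

```python
# Sanity check of the petaflop figures quoted in the article.
added = 2.8           # petaflops added across both data centres combined
total = 8.4           # NOAA's total operational computing speed after the upgrade
per_site = total / 2  # the Reston, VA and Orlando, FL sites split the total evenly

print(f"Speed before upgrade: {total - added:.1f} petaflops")
print(f"Speed per site after upgrade: {per_site:.1f} petaflops")
# One petaflop is one quadrillion floating-point operations per second,
# so 8.4 petaflops matches the 'quadrillions of calculations' framing above.
```

This confirms the system ran at 5.6 petaflops before the upgrade, and that 4.2 petaflops per site is consistent with the 8.4-petaflop total.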

Dell is the latest addition to NOAA's weather and climate operational supercomputing system. The new Dell machine hums alongside NOAA's IBM and Cray computers at a data centre in Orlando, Florida. Combined, the three systems in Florida and Virginia give NOAA 8.4 petaflops of total processing speed and pave the way for improved weather models and forecasts. (NOAA)

The upgrade also adds 60 per cent more storage capacity, allowing NOAA to collect and process more weather, water and climate observations for use in its forecast models than ever before.

‘NOAA’s supercomputers ingest and analyse billions of data points taken from satellites, weather balloons, airplanes, buoys and ground observing stations around the world each day,’ said retired Navy Rear Admiral Dr Timothy Gallaudet, acting NOAA administrator.

‘Having more computing speed and capacity positions us to collect and process even more data from our newest satellites — GOES-East, NOAA-20 and GOES-S — to meet the growing information and decision-support needs of our emergency management partners, the weather industry and the public,’ added Gallaudet.

With this upgrade, US weather supercomputing paves the way for NOAA’s National Weather Service to implement the next-generation Global Forecast System, known as the ‘American Model’, next year.

The new GFS will receive significant upgrades in 2019, including increased resolution that will allow NOAA to run the model at 9 kilometres and 128 vertical levels out to 16 days, compared with the current run at 13 kilometres and 64 levels out to 10 days. The revamped GFS will run in research mode on the new supercomputers during this year’s hurricane season.

‘As we look toward launching the next generation GFS in 2019, we’re taking a “community modelling approach” and working with the best and brightest model developers in this country and abroad to ensure the new U.S. model is the most accurate and reliable in the world,’ said National Weather Service Director Dr Louis Uccellini. 
