Let's talk about the weather

To talk about the weather is as English as afternoon tea and scones. The phrase ‘what a wet winter we’re having!’ is an instant ice-breaker in any social gathering in the UK. But will HPC change British behaviour? While HPC may not be a topic of discussion at bus stops, it has a vital role to play in predicting the weather. Using HPC, meteorologists can model climates (the underlying conditions) and predict weather (the atmospheric conditions at a given time) with a high degree of accuracy. Deprived of uncertainty in their weather, will the British stop talking to each other completely?

Extreme weather is more than just a talking point. It can cause accidents, injury and death. No one can prevent extreme weather, but people can prepare for it – and improved prediction is critical to improving preparation. The demand for better weather prediction is driving large investments in meteorology, and the HPC infrastructures that the weather and climate modellers require. Srini Chari, an IT analyst with New York-based Cabot Partners, has analysed the market for accurate weather prediction on behalf of IBM. According to Chari, weather variations in the US alone have been estimated to have annual monetary impacts as high as a trillion dollars. Official estimates for the costs of Hurricane Katrina, for example, have run as high as $200bn. Further to this, he says, weather affects the financial markets; the prices of energy futures in particular fluctuate with the weather forecasts.

The sophistication of the science of meteorology, and therefore of the simulations and models underlying it, can be seen in the accuracy of the forecasts: ‘The accuracy of absolute weather forecasts, out to a few days into the future, has increased over the last two or three decades from about 50-60 per cent accuracy – flipping a coin essentially – to today, where a two-day forecast is correct 95 per cent of the time,’ says David Blaskovich, weather segment leader at IBM Deep Computing, based in Pacific Grove, California. ‘This is a remarkable achievement when you think about it, especially considering how chaotic the atmosphere is,’ he says.

Weather is often used as the definitive example of a chaotic system. This chaos does impose limits on forecasting: ‘In terms of absolute forecast in weather, there’s a theoretical maximum range of about 14 days into the future before chaos takes over,’ explains Blaskovich. ‘There’s so much chaos in the atmosphere, and we’re simulating the behaviour of a few molecules.’

How has computing power increased the accuracy of weather forecasting? Shai Fultheim is founder and president of ScaleMP, a Cupertino, California-based shared-memory systems specialist with customers in the meteorological sector. He explains that weather forecasters tend to use one of two standard modelling applications to simulate the atmosphere: WRF (the Weather Research and Forecasting model, sometimes pronounced ‘warf’) and MM5 (short for Fifth-Generation NCAR/Penn State Mesoscale Model). Meteorologists have increased the accuracy of forecasts produced by these models by broadening the range of parameters included in the simulations.

One such parameter is that of the oceans. IBM’s Blaskovich says: ‘The real breakthrough came by making predictions of the oceans, which are much less chaotic than the atmosphere. Oceans have a huge influence in driving atmospheric behaviour.’ Predictions based on ocean simulations began to be made in the 1990s, and Blaskovich says that since then the techniques have been developed to be more detailed, with simulations now achieving high accuracy. When it comes to weather forecasting, an ocean simulation is run out days, weeks and months – and, in some cases, years – into the future. The results of this ocean simulation form the boundary conditions for weather predictions; an atmosphere is built on top of the predicted ocean states, and these states drive the atmospheric simulations for a given time in the future.
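
This boundary-condition hand-off can be pictured with a minimal sketch; the model functions below are hypothetical stand-ins, not the API of any operational forecasting system. The slower, less chaotic ocean is predicted first, and each atmospheric step is then driven by the prescribed ocean state.

```python
# Minimal sketch of one-way ocean-to-atmosphere coupling: predicted ocean states
# (e.g. sea-surface temperatures) become the lower boundary condition for each
# step of an atmospheric forecast. The functions are illustrative stand-ins.

from datetime import datetime, timedelta

def run_ocean_model(start, days):
    """Pretend ocean model: returns a predicted sea-surface-temperature
    field (here just a date-keyed dict) for each forecast day."""
    return {start + timedelta(days=d): f"SST field for day {d}" for d in range(days)}

def step_atmosphere(state, sst_field, hours):
    """Pretend atmospheric model step, driven by the prescribed SST field."""
    return f"{state} -> advanced {hours}h over [{sst_field}]"

start = datetime(2009, 11, 1)
sst_forecast = run_ocean_model(start, days=14)   # the less chaotic system is predicted first
atmosphere = "initial analysis"

for day, sst in sorted(sst_forecast.items()):
    # The ocean prediction supplies the boundary conditions;
    # the atmosphere is integrated on top of it.
    atmosphere = step_atmosphere(atmosphere, sst, hours=24)

print(atmosphere)
```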

The understanding of the interaction between ocean and atmospheric systems has resulted in what Blaskovich describes as ‘a unique and accurate way to predict seasonal changes in the weather’. He adds that ocean predictions form a core element of longer-term climate predictions, out to 50 or 100 years, but the accuracy of these predictions will not be known for some time to come.

Nor’easters

Oceans are the drivers of unusual meteorological phenomena, but weather can also be the force behind unusual ocean behaviour. Large storms are a common occurrence on the east coast of the United States, driven by cold air from the Arctic meeting warm air from the Gulf of Mexico. The storms are typically centred just off the coast, and their cyclonic nature means that the wind on land is felt to come from the north-east – hence their common name, ‘Nor’easters’. During a storm, a combination of high winds and low air pressure sometimes raises the sea level by several metres, flooding coastal areas and bursting river banks inland. Statistically, these so-called storm surges are the main cause of damage and death during large storms and hurricanes, and they are particularly destructive when they coincide with a high tide. Researchers at the Renaissance Computing Institute (RENCI) of the University of North Carolina are using the new Intel Xeon 5500-based cluster at the institute’s Chapel Hill facility to model storm surges in coastal areas vulnerable to hurricanes, such as the Atlantic coast of the centre’s home state of North Carolina.

The RENCI team are using a coastal circulation and storm surge model called ADCIRC to build new floodplain maps. These maps will be used by the US Federal Emergency Management Agency (FEMA) to plan their emergency response, and also by insurance companies to assess flood risks. Brian Blanton, senior research scientist for oceanography at RENCI, believes that better modelling of these coastal phenomena will save lives.

The ADCIRC model allowed the group to accurately map the North Carolina coastline, without over-resolving parts of the system that were not relevant. Nonetheless, the datasets remain large. ‘To understand future storm surges, you have to run a lot of simulations on a very high-resolution grid,’ explains Blanton. ‘That is the resource problem that we’ve run into over and over again with very high-resolution grids, with not only the North Carolina project that we’re doing now at RENCI, but also previous projects based on the New Orleans region, where there are thousands of miles of levees that have to be represented on a numerical grid with enough accuracy to be confident that the solutions are reasonable and accurate.’

Currently, the storm surge models run at a minimum length scale of 25 to 50m. Blanton states that, although the model could handle smaller length scales, potentially down to as low as five metres, this level of resolution would not be useful due to how long the simulations would take to run. Brian Etherton, senior research scientist for atmospheric science at RENCI, elaborates: ‘I think the biggest problem that we face is the resolution of the grid... meaning the input data, and how detailed it is. For example, a new grid was produced for North Carolina over the past 18 months at double the previous resolution. The new resolution tops out at 256 cores on the new cluster. Even though we have 1,024 cores available, on any given single run we cannot make adequate use of those cores – simply because the resolution of the grid does not contain enough data to keep the system fully occupied and to maximise throughput. The production of that grid is a very human-labour-intensive process – it’s an 18- to 24-month process just to double its resolution.’

Etherton contrasts simulations like this, based on a complex real-world-accurate grid, with simulations such as those used in high-energy physics: ‘We can’t just keep scaling out to the Nth level. Our work is about running ensembles – multiple runs, each with slightly different input data – to get many slightly different results. We then compare the statistical differences between those results,’ he says. This ensemble approach allows the team to optimise the use of their cluster: ‘Although we have 1,024 cores, we can only adequately use about 256 cores per run right now. But we’re able to do four runs at once, so we can get more data, and then look statistically at how accurate that data is.’
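
A minimal sketch of that ensemble workflow might look like the following; run_forecast is a hypothetical stand-in for a single 256-core model run, and the perturbations and statistics are deliberately toy-sized.

```python
# Toy sketch of ensemble forecasting as described above: several runs with
# slightly perturbed input data, followed by simple statistics over the results.
# run_forecast() stands in for a real model execution on a slice of the cluster.

import random
import statistics

def run_forecast(initial_temperature):
    """Pretend forecast: maps a perturbed initial condition to a predicted
    quantity (e.g. rainfall in mm). Purely illustrative."""
    return 10.0 + 0.8 * (initial_temperature - 15.0) + random.gauss(0, 0.5)

analysis_temperature = 15.2                      # best-guess initial condition
members = 4                                      # four concurrent runs on the cluster

# Each ensemble member gets slightly different input data.
results = [run_forecast(analysis_temperature + random.gauss(0, 0.3))
           for _ in range(members)]

print(f"ensemble mean   : {statistics.mean(results):.2f} mm")
print(f"ensemble spread : {statistics.stdev(results):.2f} mm")
```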

Alongside, and related to, the storm surge work, the group simulates local weather to a resolution of 4km. The team say that this resolution is already approximately eight times finer than that at which government meteorologists simulate the weather, and do not envisage reducing it further in the near future. Additionally, they estimate that redoubling the resolution of the grid on which the simulations are based would take three to four years. The researchers are quick to point out, however, that they would always have use for increased compute resources.

Both Etherton from RENCI and Blaskovich from IBM agree that the future of weather prediction relies upon incorporating a greater number of small systems into the models, which becomes possible as computer speed increases. Blaskovich believes that, as well as oceans and atmosphere, land processes, the addition or removal of vegetation, the hydrological cycle, reactive gases in the atmosphere and many more effects can be introduced as each is discovered and validated on an individual basis. Blaskovich cites clouds in particular as having an important effect on the atmosphere, and one which he says is poorly modelled by current mainstream techniques. The tops of clouds reflect sunlight, increasing the Earth’s albedo and therefore having a slight cooling effect. On the other hand, IR radiation from the lower atmosphere is absorbed by water vapour in clouds, causing a slight warming effect. Blaskovich believes that ‘breakthroughs’, in terms of understanding these contrasting effects, are imminent. ‘This requires very high-resolution, very detailed simulations,’ he says, adding that it is the new generation of more powerful computers that will be an enabler of this sort of understanding.
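
As a rough illustration of those two competing effects – using widely quoted global-mean figures rather than numbers from the article – the net radiative effect of clouds can be written as the sum of a shortwave cooling term and a longwave warming term:

```python
# Rough, widely quoted global-mean figures (illustrative only): clouds reflect
# sunlight (shortwave cooling) and trap outgoing infrared (longwave warming);
# the net cloud radiative effect is simply their sum.
sw_cooling = -47.0   # W/m^2, shortwave (albedo) effect of clouds
lw_warming = +26.0   # W/m^2, longwave (greenhouse) effect of clouds
print(f"net cloud radiative effect ~ {sw_cooling + lw_warming:.0f} W/m^2")  # ~ -21 W/m^2
```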

Blaskovich explains that, because researchers are limited in how much computer power they have with which to run atmosphere-earth simulations, some processes simply cannot be included. Where this is the case, they are estimated, or parameterised. More capability allows the researchers to simulate the actual behaviour in the atmosphere directly.
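
A classic example of such a parameterisation is the bulk formula for surface heat exchange, in which unresolved turbulent mixing is collapsed into a single transfer coefficient. The sketch below uses indicative values, not settings from any operational model.

```python
# Illustration of a parameterisation: sub-grid turbulent heat exchange at the
# surface, represented by a bulk aerodynamic formula instead of being resolved.
# Coefficient and inputs are indicative values only.

RHO_AIR = 1.2        # air density, kg m^-3
CP_AIR = 1004.0      # specific heat of air, J kg^-1 K^-1
C_H = 1.2e-3         # bulk transfer coefficient (dimensionless, indicative)

def sensible_heat_flux(wind_speed, surface_temp, air_temp):
    """Bulk parameterisation: F = rho * c_p * C_H * U * (T_s - T_a), in W m^-2."""
    return RHO_AIR * CP_AIR * C_H * wind_speed * (surface_temp - air_temp)

# e.g. an 8 m/s wind over a sea surface 2 K warmer than the overlying air
print(f"{sensible_heat_flux(8.0, 288.0, 286.0):.1f} W/m^2")
```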

Schemes for dealing with clouds are a relatively recent addition to the models: ‘I would say that, within the last five years, it was realised that these were some of the most significant missing effects in weather simulations, and a great deal of new money has been directed to the scientists who have been working on the parameterisation and the actual numerical simulation of clouds,’ says Blaskovich, also noting that funding in the USA comes primarily from the National Science Foundation (NSF).

Glaciologists at the University of Swansea are modelling the behaviour of the Greenland ice sheet in order to understand its effect on weather and climate systems.

Big iron for big problems

The researchers from IBM and RENCI agree that increased computing power allows more complex interactions to be simulated with more realism at higher resolutions. It’s hardly surprising, therefore, that meteorologists are already being lined up to make use of the NCSA’s ‘Blue Waters’ sustained-petaflop supercomputer. Blue Waters is currently under construction at the University of Illinois at Urbana-Champaign. The machine is expected to be the fastest supercomputer in the world when it comes online in 2011.

The NSF is making funding available to allow researchers to prepare projects for the machine, and two of the ‘Petascale Computing Resource Allocations’ (PRAC awards) have already gone to meteorological groups. One group, led by Benjamin Kirtman of the University of Miami, will test hypotheses about climate prediction at ‘unprecedented’ resolutions. The other, led by Professor Robert Wilhelmson of the University of Illinois, will look at tornadoes.

Professor Wilhelmson explains the aims of the nascent project: ‘Currently we’re able to simulate tornado-like vortices in storms, but the kind of resolutions we’re working to are approximately 100m, and that’s not quite sufficient to capture the details of the low-level inflow into these tornadoes,’ he says, adding that these details exist at length scales of between 20m and 30m.

According to Wilhelmson, tornadoes are most commonly the result of a specific type of storm known as a supercell. A normal storm creates updrafts, which live for between 45 minutes and one hour before decaying. In a supercell storm, however, these updrafts may persist for several hours. The biggest, most destructive tornadoes form where areas of upwards- and downwards-moving air co-exist. A tornado’s funnel may be less than 100m in diameter, and so the resolution at which the supercell system is simulated must be high if these phenomena are to be understood.

Increasing the resolution of a simulation such as this is not the end of the story, however. As the length-scale of the simulation is reduced, the time-step between computations must also be reduced. ‘If we double the resolution in each direction... then, at the minimum, we then have to use a time-step which is less than half the duration of its predecessor,’ says Wilhelmson. For example, scaling from 100m to 10m resolution multiplies the number of grid points by a factor of 1,000, while the time-step must shrink by a factor of at least 10 – so a 10-fold increase in resolution demands at least a 10,000-fold increase in computation.
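
The arithmetic behind that statement is straightforward; the following back-of-the-envelope sketch (an illustration, not a figure from the project) applies Wilhelmson’s rule to a three-dimensional grid.

```python
# Back-of-the-envelope cost of refining a 3D grid (illustrative only).
# Refining by a factor r multiplies the number of grid points by r**3, and the
# time-step must shrink by at least r ('less than half for double the
# resolution'), so the total work grows by at least r**4.
r = 100 / 10                      # refining from 100 m to 10 m resolution
grid_points_factor = r ** 3       # 1,000x more grid points
time_steps_factor = r             # at least 10x more time-steps
print(f"work grows by at least {grid_points_factor * time_steps_factor:,.0f}x")  # 10,000x
```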

Wilhelmson’s group have found that hardware alone is not sufficient to meet these demands, even with the power that the new system offers, and so they’ve had to look carefully at their code.

‘Traditionally in meteorology, we have optimised codes for reading [by programmers], and for ease of changing, but what we’re saying now is that when we use a powerful machine like this, even though it is much faster than machines that we’ve previously been using, we can get more computing done by paying more attention to the code structure,’ he says.

Downsizing

The models being prepared for Blue Waters represent some of the most demanding in the world in terms of compute resource, but other groups are showing that real contributions to atmospheric science can be achieved using relatively modest HPC infrastructure. In Pembrokeshire, Wales, users of the ‘Blue Ice’ cluster aim to improve upon components of climate models, rather than whole systems. The 80 dual-core processors of Blue Ice, installed and maintained in the Pembrokeshire Technium by UK integrator OCF, are being used to model the effects of ice sheets on global climate and weather.

There are currently only two ice sheets on the planet, covering Greenland and Antarctica. Professor Tavi Murray, scientific director of the Mike Barnsley Centre for Climate Research (which makes use of Blue Ice), explains their relevance to climate: ‘Ice sheets are hugely important to climate systems for a whole range of often-overlooked reasons: Firstly, they’re white, so they reflect back a large proportion of the energy that hits the Earth’s surface. Furthermore, a change in the amount of ice cover leads to a kind of positive feedback, so if you take the ice away the surface of the earth gets warmer more quickly than it would otherwise.’

In the same way that better understanding of the oceans increased the accuracy of weather forecasts, a deeper understanding of ice flows in polar ice sheets could improve climate predictions. Murray’s colleagues, also using Blue Ice, are working on similar tweaks, such as better representation of vegetation (which affects albedo and CO2 exchange), and incorporating the effects of cities into models.

While the weather may not be a trillion-dollar question for everyone, atmospheric scientists agree on its importance in our society: ‘Imagine what would happen if the UK Met Office missed tomorrow’s forecast,’ says RENCI’s Brian Etherton. ‘It would be in the paper, and you’d hear about it very quickly.’ For this reason, he says, governments around the world continue to invest in meteorological research, not only to develop and support the next generation of techniques, but also to ensure the sustained performance of current weather prediction, on which many people rely. It looks to be a safe topic of conversation for the British for some time yet.



Topics

Read more about:

HPC

Media Partners