Coming to a resolution

Pier L. Vidale, Professor at the University of Reading, UK, and a National Centre for Atmospheric Science (NCAS) senior scientist

'Our work on the Upscale project began with an opportunity in Japan to use what was at the time the largest and most powerful supercomputer in the world, the Earth Simulator. Since then, our project has continued on different machines worldwide, and 18 months ago we became aware of the availability of Tier-0 (petascale) machines within PRACE (the Partnership for Advanced Computing in Europe). In the Upscale project we made a bid to run a global climate model at a resolution of 25km, which is currently that of weather forecasting models. The idea of being able to run at that level of detail is very attractive, as it means we can represent the phenomena seen in weather forecasting with the same fidelity. This requires a big supercomputer, and there aren’t that many around the world. In the UK, there are systems in Edinburgh (HECToR) and at the Hartree Centre (Daresbury) that come close to that level of capability, but they’re not quite there yet.

'PRACE enables projects like ours that require resources to which they wouldn’t otherwise have access. The separation of the Tiers serves to filter the different applications and ensure that scientists who can only operate on a Tier-0 machine are able to do so. Our bid was for 144 million core hours on the Cray XE6 system, Hermit, installed at HLRS Stuttgart in Germany, which has the same architecture as the HECToR system here in the UK. In an unprecedented allocation, we were awarded everything we asked for, which is the largest number of core hours ever awarded in a single supercomputer allocation worldwide. What this means is that we are now able to do work that will most likely not be possible in the UK for another five years. Hence we can also preview the future scientific possibilities afforded by improved modelling capability and larger computational resources.

'I believe our bid was successful for a number of reasons – a fundamental one is that there aren’t many groups that can effectively exploit these supercomputers. Our application is complete in that it makes use of all aspects of the machine, rather than just the computational power, interconnect speed or visualisation capabilities. Typical climate models use all of these resources very intensively. The difficulty we had was that we needed to optimise our application in a very short amount of time. When we moved to Japan, it took nearly two years to adapt the UK climate model to work on that system, and there were many technical hurdles to overcome. The Upscale project runs for one year and, to use all of the allocated time, we had to port our application within one month. With much help from our technical support teams, we did it, but it was an incredibly challenging thing to do – especially given that we were deploying in another country with technicians and specialists we had never worked with before. Fortunately, our model has developed since our time in Japan in ways that make such work more manageable.

'The project is now underway and we are running studies not only of how the climate system works, but also of hypothetical future climates in which emissions are very high and the atmospheric composition has been radically altered. In order to get the level of detail we want, we will have to increase resolution even beyond what has been possible in Upscale. We use parameterisations, which are ways to bridge the gap in scale between the phenomena we are trying to represent and the computational grid we are able to apply to the problem, and if in the future we can remove the need for some of these parameterisations, the fidelity of the models will be higher. This would require the use of hundreds of thousands or even millions of cores concurrently, which presents a vast technical challenge.
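
To put that jump in cost into perspective, a standard back-of-the-envelope argument (not something stated in the interview) is that refining the horizontal grid multiplies the number of columns in both directions and, through the time-step limit, also shortens the usable time step, so cost grows roughly with the cube of the refinement factor. A minimal sketch under that assumption, ignoring vertical levels, memory and I/O:

```python
# Rough, illustrative scaling only: assumes cost ~ (1/dx)^2 for the extra
# grid columns times a further (1/dx) for the shorter time step.

def relative_cost(dx_old_km: float, dx_new_km: float) -> float:
    """Approximate cost multiplier when refining horizontal resolution."""
    refinement = dx_old_km / dx_new_km
    return refinement ** 3

print(relative_cost(25.0, 12.5))  # ~8x more expensive for half the grid spacing
print(relative_cost(25.0, 5.0))   # ~125x, which is where millions of cores come in
```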

'For some groups, it will require recoding their models entirely, which, while feasible, is a significant effort. A further issue is that it’s hard to predict what computing architectures will look like in the future. In many ways, everyone is currently trying to develop codes that will work on general architectures. The code may not be as efficient, but at least our model development and operation will be more sustainable.'

Robert Jacob, computational climate scientist at Argonne National Laboratory, US

'Led by Dr Warren Washington, a senior scientist at the US National Center for Atmospheric Research (NCAR), the Climate Science Computational End Station (CCES) project is a collaborative effort between the US Department of Energy (DOE), NASA, NCAR and a selection of universities. With the goal of improving climate models on several fronts, such as increasing their accuracy and fidelity, the CCES project has requested a large block of time on the DOE’s computing facilities to accomplish this work. Here at Argonne National Laboratory, our specific focus for the project is on the issues involved in running climate models at higher resolution.

'Improving the resolution is a very important area of research, because if we are to get predictions on a finer scale – at the county level, for example – then we have to increase the number of points used to cover the globe. There are, however, many factors that need to be addressed to get to that stage: models become far more expensive to run, greater computing power is needed, and software and performance issues arise when simulations are run on tens of thousands of processors. Many climate models also have parameters that must be tuned to make the resulting simulation look like the observed climate, and these tuning exercises are very difficult to do at current resolutions because they take up a considerable amount of computer and researcher time.

'We are solving these problems – albeit very slowly. The easiest starting point is to look at the raw performance of the model. The one figure that all climate modellers care about is the number of simulated years that can be done in a day, and there is a long-standing rule of thumb in the climate community that says if you can get five years per day, you can progress the research at a comfortable pace. Of course, it’s great if it can be done faster, but we are limited in the amount of science we can do if the model is slower than five years per day.
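
As a concrete illustration of that figure of merit (the numbers below are invented, not taken from the interview), simulated years per day simply relates wall-clock time to simulation length:

```python
# Illustrative only: simulated-years-per-day (SYPD) bookkeeping.

def sypd(simulated_years: float, wallclock_hours: float) -> float:
    """Simulated years completed per 24 hours of wall-clock time."""
    return simulated_years / (wallclock_hours / 24.0)

def wallclock_days(simulated_years: float, rate_sypd: float) -> float:
    """Wall-clock days needed for a simulation of a given length."""
    return simulated_years / rate_sypd

print(sypd(simulated_years=1.0, wallclock_hours=4.8))    # 5.0 SYPD
print(wallclock_days(simulated_years=100, rate_sypd=5))  # a century in 20 days
```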

'When attempting to increase the performance of the models, we can judge how well we’re doing by running very short simulations which, using the statistics and information gathered from the computer, tell us where there may be room for improvement. By putting timers around sections of code that we think may be slowing the models down, we can figure out exactly where the problem might be and then, possibly, turn our attention to developing a faster algorithm. That is a relatively straightforward process, but it still takes a large amount of time, given that it needs to be done over many short runs. Once we’re satisfied with the raw performance of the model, we can begin to think about the parameters.
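
The instrumentation described here can be as simple as wrapping suspect sections of the model time step in wall-clock timers and accumulating the totals over a short run. A minimal Python sketch of the idea; the section names and sleep calls are hypothetical stand-ins for real model components:

```python
import time
from collections import defaultdict
from contextlib import contextmanager

_totals = defaultdict(float)  # accumulated seconds per named section

@contextmanager
def timed(section):
    """Accumulate the wall-clock time spent inside the with-block."""
    start = time.perf_counter()
    try:
        yield
    finally:
        _totals[section] += time.perf_counter() - start

def model_step():
    # Hypothetical stand-ins for real model components.
    with timed("dynamics"):
        time.sleep(0.02)
    with timed("radiation"):
        time.sleep(0.05)
    with timed("ocean_coupling"):
        time.sleep(0.01)

for _ in range(10):  # a very short run, as described in the text
    model_step()

for name, seconds in sorted(_totals.items(), key=lambda kv: -kv[1]):
    print(f"{name:15s} {seconds:6.3f} s")
```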

'Things are going quite slowly at the moment because, while Intrepid, the Blue Gene/P supercomputer at Argonne, has 200,000 cores, we need almost all of them for a single run at high resolution. Tuning the parameters has become a process of trial and error, but with Mira, the new IBM Blue Gene/Q supercomputer, coming online, we can do several runs at the same time, which means we’ll arrive at a good solution much faster.
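
In practice, ‘several runs at the same time’ means evaluating candidate parameter settings concurrently rather than one after another. A minimal sketch of that pattern, with a toy scoring function standing in for a scored model run; the parameter names and values here are purely hypothetical:

```python
from concurrent.futures import ProcessPoolExecutor
from itertools import product

def score(params):
    """Toy stand-in for running the model with these parameters and
    scoring the result against observations (lower is better)."""
    entrainment, albedo = params
    return (entrainment - 0.12) ** 2 + (albedo - 0.58) ** 2

# A small grid of hypothetical tuning candidates.
candidates = list(product([0.08, 0.10, 0.12, 0.14], [0.55, 0.58, 0.61]))

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:  # concurrent trials instead of sequential ones
        scores = list(pool.map(score, candidates))
    best_score, best_params = min(zip(scores, candidates))
    print("best parameters:", best_params, "score:", best_score)
```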

'Another key area into which the project is putting a great deal of effort is the addition of more realistic processes. Looking at the development history of climate models, there has been an increase not only in resolution, but also in how many of the physical processes occurring in the climate system are included. The earliest models simply considered the atmosphere, with the ocean treated as a fixed-temperature surface. Now we include chemical reactions that occur in the atmosphere and affect cloud and ozone formation. In future models, we’re looking at including many of the chemical species in the atmosphere and the equations for how those species interact. Climate models will also include the behaviour of the biosphere; plants and plankton have a big influence on the carbon cycle, and if we want to understand our own impact on that cycle, our models need to represent the natural carbon cycle.

'Science proceeds through the interplay between theory and observation, and finding climate-quality observations, which means good spatial coverage over a very long period, is incredibly difficult. As a community, we are actually quite concerned that there’s going to be a significant gap in satellite coverage, as many satellites are about to go offline. There may be a drop in the amount of information and data we receive until replacements are deployed, but there aren’t many waiting in the wings. Unfortunately, this is one of those issues that we will have to deal with when and if it happens.'

Donald J. Wuebbles, The Harry E. Preble Professor of Atmospheric Sciences at the University of Illinois, US

'Models that used to be termed ‘climate models’ are now viewed as Earth system models, because of the level of detail they provide. Instead of focusing on the atmosphere and oceans, these models now include representations of all the different processes that affect the climate system, such as atmospheric chemistry and ice interactions. The models don’t always do this successfully, and we will certainly need to improve them further during the next decade, but they are definitely off to a good start. Current models that have completed runs for the next international assessment are grouped together under CMIP5, the Coupled Model Intercomparison Project Phase 5. We have been analysing around 30 models and, while they have improved on a general level since the 2007 United Nations Intergovernmental Panel on Climate Change (IPCC) report, the basic findings are pretty similar.

'The absence of some of the major changes we had hoped to see in the models comes down to the need to get everything completed in time for the current assessment. It is always a little disappointing that we aren’t able to take all of the major steps forward, but time constraints have meant that proposed changes weren’t made or that models weren’t run at as high a resolution as we would have liked. Nevertheless, we are making progress and getting closer to fully representing Earth systems. As we do so, the basic message of the concerns about climate change remains the same – this is one of the biggest issues confronting humanity this century.

'Speaking for the US, the models generally do a great job. The representations of the Earth do indeed look like the Earth, the temperature changes are about right, and the average precipitation changes are fairly accurate. Where the models fall down, however, is in dealing with extremes. Most models, for example, fail to predict enough change in extreme precipitation. That is a little worrisome, but our focus now is on identifying those models that do capture these extremes, as well as the other aspects of the climate. We expect to find four or five models that fall into this category, but most – especially when it comes to events beyond two sigma – aren’t there yet. Of course, the climate community is still trying to determine how to align the models.

'In some cases, the issue can be as simple as the model not treating radiative transfer accurately (the absorption, re-emission and scattering of radiation from the Sun and from the Earth). Five years ago, a comparison was made of how a variety of models treated atmospheric radiative processes, and the results were surprising in that many of the models were not very accurate. We also know that, when it comes to aspects such as sea ice and land ice, current models aren’t capturing the processes correctly. Within the next five years, we hope that the relationships between ice on land and in the oceans, and the effects of atmospheric particles, will all be greatly improved. One thing we are already beginning to see is that the treatment of surface hydrology, how models represent what’s happening with rivers and lakes, is getting better. Beyond that, we want to see models reflect the impact of agriculture on the land and, just as importantly, how urban environments affect the climate system.

'With our research partners at NCAR (the National Center for Atmospheric Research), we are beginning to do runs on the new petascale machines, such as Blue Waters at the University of Illinois, and even without some of the improvements, such as the treatment of ice, fully represented in the model yet, I do believe there will be a much better overall representation of the Earth. Resolutions as fine as 25km will provide us with far more detail on regional and local scales than is currently available from global climate models, but they come with their own set of problems. Initial runs at NCAR found that, while these higher resolutions did improve a lot of aspects, the subgrid parameterisations still needed further improvement.

'The cores of the models are changing, moving away from the linear grids of the past towards grids that are better able to handle the message passing and the relationships between one node and another. In terms of the core structure of the models, this will be a substantial transition. The question is what type of runs we will be doing ten years from now and how we should be designing our codes in preparation. There is a lot of uncertainty when it comes to exascale, and over the next year we want to do a run with the NCAR Earth system model to look at the proposed coding changes needed for running on an exascale machine.'



Interviews by Beth Harlen
