The future of currents
Climate change and global warming are ever-present topics in the media, but how much can the predictions be trusted? David Robson investigates the trials of modelling such complex environmental and meteorological data.
Tuvalu, a group of islands in the Pacific and the second-smallest country in the world, may no longer exist by the end of this century. With virtually no industry and no cars, it produces less carbon pollution than a small American town. Yet a 50cm rise in sea level would almost completely destroy it, displacing 11,000 citizens, along with thousands of other inhabitants from low-lying islands such as the Maldives and Kiribati.
A one-metre rise, and the impact would be even more drastic. An estimated 15 per cent of Egypt’s arable land would be lost, along with 6 per cent of the Netherlands, and 17.5 per cent of Bangladesh. Some 72 million people in China would be affected. That’s not to mention the damage caused by ‘startling trends’ in storms and hurricanes, described by one scientist interviewed for this feature.
The trouble is, scientists don’t know what the change will be, with reasonable estimates ranging everywhere between the two sea level figures. Predictive models must be able to capture the sweeping global cycles of the thermohaline circulation churned by the ocean currents, the dynamics of the landscape, the impact of volcanic eruptions and human activity, right down to the molecular processes that trap a small fraction of the greenhouse gases back in the ocean. Just trying to keep track of the different ingredients can make one’s head swirl, let alone projecting them across the 510m km2 of the Earth’s surface, 100 years into the future.
It’s a Herculean task, even for the world’s most powerful supercomputers, but a necessary one nonetheless. Scientists, activists and politicians are hungry for evidence either for or against climate change, and the hunt has generated some ingenious solutions.
As in almost every computer model, the globe is simplified and split up into a grid, and the relevant equations are solved at each point. Even this creates complications. Most modern supercomputers would buckle under the strain of a resolution finer than 100km, but this is already too coarse for relatively small ocean currents like the Gulf Stream, which measures just 80km in width at some points. In addition, flows in the deep ocean, and in shallow coastal areas, need to be treated differently.
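To get a feel for the scale, here is a rough back-of-the-envelope sketch in Python. The 510m km2 surface area, the 100km resolution limit and the 80km Gulf Stream width are the article’s figures; the arithmetic itself is purely illustrative:

```python
# Rough scale of a global model grid, using the figures quoted in the article.

earth_surface_km2 = 510_000_000   # total surface area of the Earth
cell_size_km = 100                # practical resolution limit for supercomputers

# Number of grid cells needed to tile the surface at this resolution
cells = earth_surface_km2 // (cell_size_km ** 2)
print(f"Grid cells at {cell_size_km} km resolution: {cells:,}")

# The Gulf Stream can be just 80 km wide at some points, so it is
# narrower than a single grid cell and effectively invisible to the model.
gulf_stream_width_km = 80
print(f"Gulf Stream width / cell size: {gulf_stream_width_km / cell_size_km:.2f}")
```

Even at 51,000 cells per surface layer, the equations must be solved in every cell at every time step, which is why features smaller than a cell are lost.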
To solve this, Professor David Marshall and his team of physical oceanographers at the University of Oxford have developed a new generation of models that can adapt the size and shape of the cells used in the model as the fluid evolves.
‘It’s very much blue sky research, and we feel it could have a big impact on ocean modelling,’ says Marshall. His team are studying the large-scale circulation of the oceans, and how it is changing due to global warming. In the past, thermohaline circulation, which moves hot water from the equator to the poles and back again, halted as a result of a rapid transition from cool to mild climates. If this occurred again, it would have catastrophic effects for all of the world’s climates. Some scientists predict it could starve the oceans of oxygen, leading to mass extinction of sea wildlife.
The lack of experimental observations surrounding the research, however, is seriously reducing the power of these simulations. It is an unfortunate paradox: temperatures have not risen in this way since the end of the last Ice Age, which is precisely what makes the predictions necessary in the first place, yet the computer models themselves are only as accurate as the data that is fed into them.
It’s a fact that sceptics have jumped on, often mistakenly. Just last month, scientists at the National Centre for Atmospheric Research (NCAR) in America found that Arctic ice is actually melting even faster than models predict, because the processes involved were far more complex than had ever been imagined.
The NCAR’s models take into account a seemingly endless list of inputs, including a sophisticated method of coupling the separate models for land, ocean, and the atmosphere. They are three-dimensional, and they account for vegetation and different types of land surface. The level of detail is enormous, with solutions for every hour of the predictive time frame, which could last for more than a century.
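The coupling the article describes can be sketched in miniature. In this toy example, the component names and the simple relaxation ‘physics’ are invented for illustration and bear no relation to NCAR’s actual scheme; only the hourly step and the century-long horizon come from the article:

```python
# A toy sketch of two coupled climate components exchanging state each step.
# The relaxation coefficients and temperatures are illustrative only.

def step_atmosphere(atm_temp, ocean_temp, dt):
    """Relax the atmosphere temperature toward the ocean surface temperature."""
    return atm_temp + dt * 0.1 * (ocean_temp - atm_temp)

def step_ocean(ocean_temp, atm_temp, dt):
    """The ocean responds far more slowly to the atmosphere above it."""
    return ocean_temp + dt * 0.01 * (atm_temp - ocean_temp)

def run_coupled(atm_temp, ocean_temp, hours):
    """Advance both components one hour at a time, exchanging state."""
    for _ in range(hours):
        # Each component sees the other's state from the start of the step
        new_atm = step_atmosphere(atm_temp, ocean_temp, dt=1.0)
        new_ocean = step_ocean(ocean_temp, atm_temp, dt=1.0)
        atm_temp, ocean_temp = new_atm, new_ocean
    return atm_temp, ocean_temp

# Hourly steps over a century, as the article describes
atm, ocean = run_coupled(atm_temp=10.0, ocean_temp=15.0, hours=24 * 365 * 100)
print(f"After a century: atmosphere {atm:.2f} C, ocean {ocean:.2f} C")
```

Real coupled models exchange whole fields of fluxes rather than single numbers, but the structure, separate component models advanced in lockstep and fed each other’s output, is the same.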
However, what the scientists failed to predict was the lubricating effect of water melting from the icebergs, which percolates down to the bedrock and allows them to slide out into the warmer water to break and melt. Equally, they had no way of knowing that these icebergs were preventing the flow of glaciers into the warm sea, which are now releasing their water into the oceans at a much quicker rate. With new research in Greenland feeding these factors into the predictive framework, it is hoped the models will more fully explain what is happening in nature.
Says Jerry Meehl, of the NCAR: ‘The pace is more rapid than ever before. We’ve got the difficulty of modelling something that we’ve never observed. We try to reconstruct past ice ages, but the evidence is limited. We can only indirectly infer results, but we are beginning to see a big instability in the ice sheets.’
If it seems like an uphill struggle, recent developments in high-performance computing are providing some relief. ‘The models are of such a high level, and so complex, that the fastest supercomputers are needed,’ continues Meehl. ‘It’s massively faster than it ever has been; we can now model 10 years in one day.’ However, the storage and access of data is still a problem.
With these advances, it would be easy to assume that in time, with a deep enough understanding, and more powerful computers, the models could effectively create a whole world on a desktop. Meehl and Marshall, however, are keen to emphasise that the models are just a tool to provide a deeper understanding of the processes involved; they can’t provide the reliability you may expect when modelling a car, for example.
‘People used to understand the limitations involved with the models,’ says Marshall. ‘But as they become more realistic and multidisciplinary, I worry that people take the models as being the absolute truth. One needs to develop a deep understanding of why they do what they do. It is very easy to give the right answers for the wrong reasons.’
In fact, academics are now spending a significant amount of time deflating the expectations surrounding these simulations. James McWilliams of the University of California, Los Angeles, recently published an article in the Proceedings of the National Academy of Sciences explaining why it is not possible to ‘create a virtual copy of nature’, and giving a more reasonable framework for understanding these models.
His main argument is that most of the physical laws used are empirical rather than fundamental, particularly when it comes to multi-particle interactions such as the formation of clouds. He also argues that in many cases you are coupling several chaotic systems – the atmosphere, the ocean and ecosystems – which are bound to be sensitive to discretisation, and can only produce a limited degree of accuracy.
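The sensitivity McWilliams describes shows up even in the simplest chaotic systems. This sketch uses the logistic map, a standard textbook example rather than any of the climate models discussed, to show how a difference in the eighth decimal place of the starting value grows until the two runs bear no resemblance to each other:

```python
# Two runs of the chaotic logistic map, differing only by 1e-8 in the
# starting value, diverge completely within a few dozen steps.

def logistic(x, r=4.0):
    """One step of the logistic map in its fully chaotic regime (r = 4)."""
    return r * x * (1.0 - x)

a, b = 0.4, 0.4 + 1e-8   # nearly identical initial conditions
max_gap = 0.0
for step in range(50):
    a, b = logistic(a), logistic(b)
    max_gap = max(max_gap, abs(a - b))

print(f"Initial gap: 1e-08, largest gap over 50 steps: {max_gap:.4f}")
```

The gap roughly doubles each step, so after about 25 steps it is of order one: the two trajectories are as different as two random numbers. A discretised climate model inherits exactly this kind of sensitivity to tiny errors in its inputs.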
Sceptics beware, however. All the scientists consulted for this article fiercely maintain that computer models are still invaluable tools to prevent disaster in years to come. ‘It would be a great disservice to me, and the rest of the world, if people decide to forget about global warming,’ says McWilliams. ‘It is not considered possible to forecast the weather a month ahead, but no one would throw that away.’ Computer models are the only view, however undefined, that we have of the future.