
The future of currents

Tuvalu, a group of islands in the Pacific and one of the smallest countries in the world, may no longer exist by the end of this century. With virtually no industry and no cars, it produces less carbon pollution than a small American town. Yet a 50cm rise in sea level would almost completely destroy it, displacing its 11,000 citizens, along with thousands of other inhabitants of low-lying islands such as the Maldives and Kiribati.

With a one-metre rise, the impact would be even more drastic. An estimated 15 per cent of Egypt’s arable land would be lost, along with 6 per cent of the Netherlands and 17.5 per cent of Bangladesh. Some 72 million people in China would be affected. That is before counting the damage from the ‘startling trends’ in storms and hurricanes described by one scientist interviewed for this feature.

The trouble is, scientists don’t know what the change will be, with reasonable estimates ranging anywhere between those two sea-level figures. Predictive models must capture the sweeping global cycles of the thermohaline circulation churned by the ocean currents, the dynamics of the landscape, and the impact of volcanic eruptions and human activity, right down to the molecular processes that trap a small fraction of the greenhouse gases back in the ocean. Just trying to keep track of the different ingredients can make one’s head swirl, let alone projecting them across the 510m km2 of the Earth’s surface, 100 years into the future.

It’s a Herculean task, even for the world’s most powerful supercomputers, but a necessary one nonetheless. Scientists, activists and politicians are hungry for evidence either for or against climate change, and the hunt has generated some ingenious solutions.

As with almost every computer model, the globe is simplified and split into a grid, and the relevant equations are solved at each point. Even this creates complications. Most modern supercomputers would buckle under the strain of a resolution finer than 100km, yet this is already too coarse for relatively narrow ocean currents such as the Gulf Stream, which measures just 80km across at some points. In addition, flows in the deep ocean, and in shallow coastal areas, need to be treated differently.
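
To make the scale problem concrete, here is a minimal sketch in Python – illustrative only, and not any group’s production code – of a globe reduced to a roughly 100km grid, with one toy equation stepped forward at every point:

```python
# A toy global grid: ~1-degree cells, roughly 100km at the equator.
# The "equation" here is simple heat diffusion, standing in for the far
# richer physics a real climate model solves at each point.
import numpy as np

nlat, nlon = 180, 360
temp = 15.0 + 10.0 * np.random.randn(nlat, nlon)   # invented temperature field
kappa, dt = 0.1, 1.0                               # toy diffusivity and time step

def step(t):
    """One explicit finite-difference step of 2-D diffusion."""
    lap = (np.roll(t, 1, axis=1) + np.roll(t, -1, axis=1)    # periodic east-west
           + np.roll(t, 1, axis=0) + np.roll(t, -1, axis=0)  # crude at the poles
           - 4.0 * t)
    return t + kappa * dt * lap

for _ in range(100):
    temp = step(temp)

# An 80km-wide current such as the Gulf Stream spans less than one of
# these cells, so at this resolution it simply cannot be represented.
```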

To solve this, Professor David Marshall and his team of physical oceanographers at the University of Oxford have developed a new generation of models that adapt the size and shape of the grid cells as the fluid evolves.
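
A rough illustration of the adaptive idea is sketched below, using an invented refinement rule rather than the Oxford group’s actual scheme: cells split wherever the flow varies sharply, so resolution concentrates around narrow features as they evolve.

```python
# Adaptive refinement sketch: split any cell whose local change in the
# field exceeds a threshold, down to a minimum cell size.
import numpy as np

def refine(cells, field, threshold=0.5, min_size=10.0):
    """Split each (x, y, size) cell, in km, where the field changes sharply."""
    out = []
    for (x, y, size) in cells:
        gx = abs(field(x + size, y) - field(x, y))
        gy = abs(field(x, y + size) - field(x, y))
        if max(gx, gy) > threshold and size > min_size:
            half = size / 2.0
            out += [(x, y, half), (x + half, y, half),
                    (x, y + half, half), (x + half, y + half, half)]
        else:
            out.append((x, y, size))
    return out

def jet(x, y):
    """Invented field with a narrow, Gulf Stream-like jet near y = 500km."""
    return np.exp(-((y - 500.0) / 40.0) ** 2)

cells = [(i * 100.0, j * 100.0, 100.0) for i in range(10) for j in range(10)]
for _ in range(3):                  # three refinement passes
    cells = refine(cells, jet)
print(len(cells))                   # cells multiply only around the jet
```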

‘It’s very much blue-sky research, and we feel it could have a big impact on ocean modelling,’ says Marshall. His team are studying the large-scale circulation of the oceans, and how it is changing due to global warming. In the past, the thermohaline circulation, which moves warm water from the equator to the poles and back again, has halted during rapid transitions from cool to mild climates. If this occurred again, it would have catastrophic effects on climates across the world. Some scientists predict it could starve the oceans of oxygen, leading to mass extinctions of marine life.

The lack of experimental observations to anchor the research, however, seriously limits the power of these simulations. It is an unfortunate paradox: temperatures have not risen in this way since the end of the last Ice Age – which is precisely what makes the predictions necessary – yet the computer models are only as accurate as the data fed into them.

It’s a fact that sceptics have jumped on, often mistakenly. Just last month, scientists at the National Center for Atmospheric Research (NCAR) in America found that Arctic ice is actually melting even faster than the models predict, because the processes involved are far more complex than had been imagined.

NCAR’s models take into account a seemingly endless list of inputs, including a sophisticated method of coupling the separate models for land, ocean and atmosphere. They are three-dimensional, and they account for vegetation and different types of land surface. The level of detail is enormous, with solutions computed for every hour of a predictive time frame that can stretch over more than a century.
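
Schematically, the coupling works something like the sketch below; the component classes and flux exchanges are illustrative stand-ins, not NCAR’s actual interfaces.

```python
# Coupled-model sketch: each component advances one hour, then hands its
# output to its neighbours as a forcing for the next step.
class Component:
    """Stand-in for an atmosphere, ocean or land model."""
    def __init__(self, name, state):
        self.name, self.state = name, state

    def step(self, forcing, dt_hours=1.0):
        # A real component solves its own 3-D equations here; this toy
        # version just relaxes its state towards the incoming forcing.
        self.state += 0.01 * (forcing - self.state) * dt_hours
        return self.state

atmosphere = Component("atm", 10.0)
ocean = Component("ocn", 4.0)
land = Component("lnd", 8.0)

# A century at hourly resolution is 24 * 365 * 100, roughly 876,000
# steps; we run only a few here.
for hour in range(1000):
    a = atmosphere.step(forcing=(ocean.state + land.state) / 2.0)
    ocean.step(forcing=a)
    land.step(forcing=a)
```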

However, what the scientists failed to predict was the lubricating effect of meltwater on the ice sheets: it percolates down to the bedrock and allows the ice to slide out towards the warmer water, where it breaks up and melts. Equally, they had no way of knowing that floating ice shelves were holding back the flow of glaciers into the warm sea; as the shelves break up, the glaciers are releasing their water into the oceans at a much faster rate. With new research in Greenland feeding these factors into the predictive framework, it is hoped the models will more fully explain what is happening in nature.

Says Jerry Meehl, of NCAR: ‘The pace is more rapid than ever before. We’ve got the difficulty of modelling something that we’ve never observed. We try to reconstruct past ice ages, but the evidence is limited. We can only indirectly infer results, but we are beginning to see a big instability in the ice sheets.’

If it seems like an uphill struggle, recent developments in high-performance computing are providing some relief. ‘The models are of such a high level, and so complex, that the fastest supercomputers are needed,’ continues Meehl. ‘It’s massively faster than it ever has been; we can now model 10 years in one day.’ At that rate, a century-long projection still takes around 10 days of computing. However, storing and accessing the resulting data remains a problem.

With these advances, it would be easy to assume that in time, with a deep enough understanding and more powerful computers, the models could effectively recreate the whole world on a desktop. Meehl and Marshall, however, are keen to emphasise that the models are just tools for developing a deeper understanding of the processes involved; they can’t provide the reliability you might expect when modelling a car, for example.

‘I worry that people used to understand the limitations of the models,’ says Marshall. ‘But as they become more realistic and multidisciplinary, people take them as being the absolute truth. One needs to develop a deep understanding of why they do what they do. It is very easy to give the right answers for the wrong reasons.’

In fact, academics now spend a significant amount of time deflating expectations of these simulations. James McWilliams of the University of California, Los Angeles, recently published an article in the Proceedings of the National Academy of Sciences explaining why it is not possible to ‘create a virtual copy of nature’, and offering a more reasonable framework for understanding such models.

His main argument is that most of the physical laws used are empirical rather than fundamental, particularly when it comes to many-particle processes such as the formation of clouds. He also argues that the models often couple several chaotic systems – the atmosphere, ocean and ecosystems – which are bound to be sensitive to discretisation, and so can only ever achieve a limited degree of accuracy.
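
The classic Lorenz-63 equations make the point in miniature. In this toy example – not taken from McWilliams’s paper – two runs of the same chaotic system start identically but are discretised with different time steps, and end up in visibly different states.

```python
# Sensitivity of a chaotic system to discretisation, using Lorenz-63.
import numpy as np

def lorenz_step(state, dt, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 equations."""
    x, y, z = state
    return state + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

def run(dt, t_end=20.0):
    state = np.array([1.0, 1.0, 1.0])
    for _ in range(int(t_end / dt)):
        state = lorenz_step(state, dt)
    return state

coarse, fine = run(dt=0.01), run(dt=0.005)
print(coarse)   # same equations, same starting point...
print(fine)     # ...visibly different endpoints
```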

Sceptics beware, however. All the scientists consulted for this article fiercely maintain that computer models remain invaluable tools for averting disaster in the years to come. ‘It would be a great disservice to me, and the rest of the world, if people decide to forget about global warming,’ says McWilliams. ‘It is not considered possible to forecast the weather a month ahead, but no one would throw that away.’ Computer models are the only view, however indistinct, that we have of the future.



Wind turbine efficiency

Computational fluid dynamics (CFD) has long helped the environment indirectly, by increasing the efficiency of cars and reducing the amount of fossil fuel burnt. The Star-CCM+ software from CD-adapco has now taken a more direct role, helping to design more effective and stronger wind turbines that can withstand the heavy forces of rugged weather.

The stronger the winds, the more energy can be harvested, but strong winds also wreak havoc on a turbine’s lifespan. The new design features a shield lying outside the blades that funnels air over them, increasing the amount of usable energy while also protecting against other, more destructive forces.
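
The trade-off rests on a simple cube law: the power available in the wind scales with the cube of its speed. The rotor size and efficiency figures in this back-of-envelope sketch are invented for illustration.

```python
import math

rho = 1.225                  # air density, kg/m^3
area = math.pi * 40.0 ** 2   # swept area of an assumed 40m-radius rotor, m^2
cp = 0.4                     # assumed power coefficient, below the Betz limit (~0.59)

def power_mw(v):
    """Mechanical power extracted at wind speed v (m/s), in megawatts."""
    return 0.5 * rho * area * v ** 3 * cp / 1e6

print(power_mw(6.0))    # ~0.27 MW
print(power_mw(12.0))   # ~2.1 MW: doubling the wind gives eight times the power
```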

Previously, only extensive experiments in wind tunnels could have provided such results. ‘CFD often gives more detailed information,’ says Alex Read, manager of sales support and technical marketing at CD-adapco. ‘The experiments can be done in a virtual environment, without expensive physical testing.’

Tackling water supplies

Like many big cities in the developing world, Kampala, the capital of Uganda, has spawned a large slum area, home to the country’s poorest people, who arrive hoping for more money and a better standard of living – hopes far removed from the reality of the situation.

‘It’s very highly populated, based in valleys and wetlands,’ says Robinah Kulabako, a PhD student at Makerere University in Kampala, who is using computational fluid dynamics to tackle the problem. ‘Most of the people earn less than $1 a day. They don’t have a water supply, there’s inadequate waste disposal, and no storm-water management.’ The water table lies only half a metre below the surface, and seeps into the residents’ drinking water with devastating results.

Cholera, typhoid and dysentery are all common, as is ‘blue baby syndrome’, an illness caused by nitrates in the water supply that stop an infant’s blood from carrying enough oxygen.

To help improve conditions, Kulabako, under the guidance of Roger Thunvik of the Royal Institute of Technology in Sweden, is using Comsol’s Multiphysics program to simulate the flow of water in the region and to learn how pollutants are transported through the water table.

She is currently working on the transport of phosphorus, which causes algal blooms in the water and kills fish, one of the region’s most important food sources. Results from the project have so far shown that large pores in the soil accelerate this flow after heavy rain.
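
The kind of transport calculation involved can be sketched as a one-dimensional advection-dispersion model; the parameters below are invented for illustration, not the project’s calibrated values.

```python
# 1-D advection-dispersion of a dissolved pollutant through the subsurface.
import numpy as np

nx, dx, dt = 200, 0.05, 10.0   # a 10m column in 5cm cells; 10-second steps
v = 1e-4                       # pore-water velocity, m/s (fast: macropore flow)
d = 1e-6                       # dispersion coefficient, m^2/s

c = np.zeros(nx)
c[0] = 1.0                     # constant phosphorus source at the surface

for _ in range(5000):
    adv = -v * (c[1:-1] - c[:-2]) / dx                    # upwind advection
    disp = d * (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx**2   # dispersion
    c[1:-1] += dt * (adv + disp)
    c[0] = 1.0                 # hold the boundary concentration

# Doubling v – the macropore effect after heavy rain – pushes the plume
# roughly twice as far down the column in the same time.
print(c[:20].round(3))
```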

‘Once we have a full understanding, the model will allow us to identify preventative measures,’ says Kulabako. ‘We’ll be able to decide what kinds of materials to use in wetland areas, to improve drainage and protect the ground water from contamination.’

The team is now trying to improve the mathematics of its models in Comsol Multiphysics. ‘The beauty of Comsol Multiphysics is that it allows us to change the equations ourselves.’

The Earth Sciences module proved to be particularly useful, as it is specifically designed to track the transport and transformations of chemicals in fluid flow.

The use of commercial software for environmental purposes seems to be on the increase. Wolfram has recently showcased a Mathematica demonstration that uses real meteorological data to predict the movement of air pollution from factories, again tracing the way the chemicals react with one another and the effect of the landscape on the flow.


[Image caption: This model from Wolfram’s demonstration predicts the spread of pollution using real meteorological data.]


The advantages of using ready-made software are manifold. ‘Traditionally, the equations would be broken into finite elements,’ says Schoeller Porter, a senior software engineer at Wolfram. ‘With Mathematica, they can be solved symbolically or numerically.’
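
The distinction Porter draws can be sketched in Python rather than Mathematica: the same simple decay equation solved exactly in symbolic form, then stepped forward numerically, as one would for problems with no closed-form solution.

```python
# Symbolic versus numerical solution of a simple pollutant-decay equation.
import sympy as sp
from scipy.integrate import solve_ivp

# Symbolic: a closed-form solution of dc/dt = -k*c
t, k = sp.symbols("t k", positive=True)
c = sp.Function("c")
exact = sp.dsolve(sp.Eq(c(t).diff(t), -k * c(t)), c(t))
print(exact)                       # c(t) = C1*exp(-k*t)

# Numerical: the same equation integrated step by step, as a
# finite-element or finite-difference code would handle it
sol = solve_ivp(lambda t, y: -0.5 * y, t_span=(0.0, 10.0), y0=[1.0])
print(sol.y[0][-1])                # close to exp(-5)
```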

Ready-made software can also integrate more easily with other programs. In the Wolfram demonstration, gridMathematica provided parallel computing power, while geographic information system (GIS) databases and data from the Environmental Protection Agency added greater depth to the problem.

