
Down-to-earth modelling

The earth, with its mountains, large bodies of water, volcanoes and other natural formations, is a laboratory where all physical effects are coupled. Modelling is all the more necessary here because of the timeframes involved: they can extend to months, years or even decades, which also makes it difficult to verify a model. That isn't stopping geologists from developing and improving models that predict every aspect of our constantly changing earth. This article reviews just a few examples where geologists are using mathematical modelling, including code, to study hydrocarbon resources, glaciers, avalanches, magma flow, earthquakes and more.

Oil prices make it worthwhile

An area where geological modelling has played a key role for decades is oil and gas exploration. However, explains Dirk Smit, head of exploration research for Shell, two factors have now given modelling an extra impetus: the affordability of computing power, and better acquisition systems that high prices for natural resources have made economically worthwhile.

Seismic surveys of the earth's subsurface have long been used to help determine where hydrocarbons might be located and where to drill wells for the safest and most efficient recovery. And while the governing equations of seismic wave propagation are well known, modelling has moved from a relatively crude technology in the 1970s to a much higher degree of sophistication – and we're still not at the end, adds Smit. In earlier times, even creating and using simple models, which assumed you could capture all the important information with just a wavefront, was a very complex job. Those early models relied heavily on two assumptions to simplify the codes: first, that the subsurface is relatively transparent, which simplifies the description of the reflection process; and second, that the relevant information in a reflected seismic wave is its travel time and amplitude. In contrast, today's trend is to build models based on code that honours the full propagation properties of seismic waves, recognising that there are many more effects, such as scattering from multiple subsurface layers. As a result, much more information about the subsurface structure can now be inferred from close inspection of the previously neglected 'tail' of the seismic record.
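
As a rough illustration of the difference between the two approaches, the sketch below (in Python, with assumed layer velocities and grid parameters, and no relation to Shell's proprietary codes) propagates a pulse through a 1D layered model with a simple finite-difference scheme. The first arrivals carry the travel-time and amplitude information the early models relied on; the later part of the recorded trace contains the multiples and scattering that full-waveform approaches now exploit.

```python
# Minimal 1D acoustic finite-difference sketch (illustrative only, not an
# industrial code): a pulse propagates through a layered velocity model and a
# 'geophone' records the full trace, including the late-arriving 'tail'.
import numpy as np

nx, nt = 600, 4000
dx, dt = 5.0, 0.0005               # grid spacing (m) and time step (s); CFL < 1
c = np.full(nx, 1500.0)            # assumed background velocity (m/s)
c[200:350] = 2500.0                # a faster layer
c[350:] = 3500.0                   # basement

src_ix, rec_ix = 50, 60            # source and receiver grid indices
t = np.arange(nt) * dt
f0 = 15.0                          # Ricker wavelet peak frequency (Hz)
arg = (np.pi * f0 * (t - 1.2 / f0)) ** 2
wavelet = (1.0 - 2.0 * arg) * np.exp(-arg)

r2 = (c * dt / dx) ** 2            # squared Courant number per cell
u_prev = np.zeros(nx)
u_curr = np.zeros(nx)
trace = np.zeros(nt)               # the recorded seismogram

for it in range(nt):
    u_next = np.zeros(nx)          # rigid edges; no absorbing boundaries in this sketch
    u_next[1:-1] = (2.0 * u_curr[1:-1] - u_prev[1:-1]
                    + r2[1:-1] * (u_curr[2:] - 2.0 * u_curr[1:-1] + u_curr[:-2]))
    u_next[src_ix] += wavelet[it] * dt ** 2     # inject the source
    trace[it] = u_next[rec_ix]
    u_prev, u_curr = u_curr, u_next

# trace now holds the direct wave, primary reflections and the later multiples.
```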

The seismic industry has seen significant improvements over time, mostly enabled by new acquisition systems, together with increased computer power, such as the move from 2D ‘slice’ models to full 3D volumetric models, and even ‘4D models’, which Smit refers to as a bit of a misnomer. Here a series of 3D models over time can show changes in a hydrocarbon reservoir due to fluid extraction and movements. The technology has evolved such that one can now record several monitor surveys, which can be turned into animations. Here, too, the cost of acquisition and computing to get these complex images is today more acceptable and the return on investment is seen as beneficial.

Another trend, explains Smit, is the use of newer technologies beyond seismic imaging, for instance incorporating electromagnetic survey data. He doesn’t see that there will ever be a single technology that takes away all the uncertainties of exploration, but he believes that multiple technologies will be integrated into an improved comprehensive risk reduction approach – but doing so is not trivial.

Most of the code that companies such as Shell run is proprietary, and while some areas of study speak of man-years of coding, here a better term might be man-generations. The backbone of the code concerns seismic processing and imaging. How best to exploit seismic data has become very competitive among energy companies, and the pace of innovation is much faster than before. Most of the companies at the forefront have their own codes for these key tasks. Smit adds that developing and implementing differentiating technology is very expensive and time-consuming; in the era of low-cost oil, it was therefore confined to large oil and gas R&D centres. Today, though, widely available computing power and the availability of capital make it possible for smaller companies, too, to implement new ideas and bring them to market. Hence it becomes more important for large corporations to decide what to focus their own R&D on, and how to leverage the market optimally.

To run its codes, Shell has had a distributed computing system made up of workstations, with a capacity comparable to that of most large companies, but it is now building up capacity several orders of magnitude larger. For this, it is not simply a matter of stacking up a large number of PCs, and Shell is looking for new insights and developments in algorithm design and mainframe infrastructure.

Searching for minerals with New York Blue

While we have gained much general knowledge about volcanoes and have been able to model the flow of magma for rather uncomplicated situations, there is still much more we need to know about them. Their physics is not well constrained, and we need massive computational power and well-established, robust codes to describe all the physical processes that are happening at once. So says Dr Alan Rice, a research associate at the Earth and Planetary Sciences Department at the American Museum of Natural History and also a lecturer at Stony Brook Southampton.

An artist's conception includes results from an Abaqus model that analyses the Sumatra-Andaman earthquake of December 2004, which caused the famous tsunami. The red area represents the amount of vertical deformation of the ocean floor caused by the earthquake.

Studying magma flows, while scientifically very interesting, can also have significant applications, he adds. For instance, while people are well aware of the depletion of hydrocarbon-based fuels, there is less emphasis on the depletion of hard-rock minerals, even given increasing demand from developing countries and the fact that there have been no major discoveries in the past 30 years. With modelling, Dr Rice is studying how such deposits form, which should allow us to predict better where we can find new ones. For instance, heat from a magma chamber will initiate hydrothermal circulation in the surrounding rock. The convecting hot water dissolves minerals and carries them to new locations, and when the water cools, the deposits precipitate out. With modelling, he wants to study hydrothermal flow so as to locate these enriched areas of highly concentrated metal. Magma itself can be quite difficult to model, considering that its viscosity can change by 10 orders of magnitude, from a liquid as thin as water to something far thicker than molasses. In addition, there are phase changes, as well as the convection of suspended crystal loads within the magma, the interaction of groundwater, and so on. Models have, for instance, revealed that magma flows can have very turbulent eddies, but these do not persist for long before the magma stiffens as it cools.
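
To give a feel for the numbers involved, the short sketch below uses a Vogel-Fulcher-Tammann-type viscosity law of the kind often applied to silicate melts; the coefficients are assumed round values for illustration only, not fitted to any real magma, but they show how cooling by a few hundred degrees stiffens the melt by many orders of magnitude.

```python
# Illustrative sketch only: a temperature-dependent viscosity law spanning many
# orders of magnitude as a melt cools. VFT form; coefficients are assumed.
A, B, T0 = -4.55, 6000.0, 500.0          # log10(Pa.s), K, K  (assumed values)

def log10_viscosity(T_kelvin):
    """VFT law: log10(eta) = A + B / (T - T0)."""
    return A + B / (T_kelvin - T0)

for T in (1500.0, 1300.0, 1100.0, 950.0):            # a cooling lava
    print(f"T = {T:6.0f} K  ->  eta ~ 10^{log10_viscosity(T):5.1f} Pa.s")
```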

Recently, he has studied nickel deposits caused by lava flows. There is good computational evidence that nickel gets bound up in sulfides when the flow passes over a pit and a back eddy forms. Geologists have seen similar structures in the field, and now Dr Rice is advancing his modelling of this effect. Knowing the cooling rate, the changes in viscosity and the segregation of the crystal content, his models have been able to start picking out areas of nickel concentration. Running such models takes enormous computing power, not only because of the many physical processes involved – radiative, convective and conductive heat transfer, variable viscosity, phase changes, suspended loads, topographical changes and the fluid dynamics that continue even when the lava crusts over – but also because of the physical scale; an area of study for a lava flow might be 10 x 100 km, with the time being studied ranging from days to years. To do this, Dr Rice is running Ansys Multiphysics on New York Blue, an IBM Blue Gene computer that is a joint venture between Stony Brook University and Brookhaven National Laboratory.

Studies of the infamous tsunami

Earthquakes beneath the ocean floor can also wreak havoc, as evidenced by the tsunami triggered by the Sumatra-Andaman quake in December 2004. Not everyone realises, however, that there was a second quake, magnitude 8.7, 100 days later and a few kilometres to the south along the same fault line. It caused no tsunami, but it was of great interest to Dr Tim Masterlark, assistant professor in the Department of Geological Sciences at the University of Alabama. 'It gave me a natural laboratory for studying how earthquakes interact with each other. We want to see if we can find causal relationships that could lead to more accurate predictions of quakes and tsunamis.'

Using Abaqus Unified FEA software from Simulia, he is developing dynamic 3D models, and while many papers on the Sumatra-Andaman event have been published, he believes he has the first FEA-based assessment of the event. Inputs to the model include surface-deformation data as well as seismicity, seismic reflection and tomography, gravity measurements and pressure measurements in water wells. Masterlark sets up a million-node multigeometry grid for displacement, plus others for variables such as pore pressure.

Another frequent natural disaster is the forest fire. What if the likely ignition locations could be predicted, and what if the spread of fires could be predicted and thus controlled? Leading R&D in forest-fire management is a team at the US Forest Service's Fire Sciences Laboratory in Missoula, MT. Analyses carried out by the team using Star-CD from CD-adapco involve the study of the flow of large air masses over topographical domains. The landscape surface, taken from DEM data, is read into Star-CD to provide an immediate shell representation of the ground. Once the boundary conditions have been added – actual measured wind velocities in both 3D spatial and temporal reference frames – the analysis can be run. To protect their investment in existing in-house codes, the team links them directly to Star-CD to produce a dedicated framework suited to specific problems.
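
The sketch below is a generic, much-simplified stand-in for that import step (it is not the Star-CD workflow itself): it turns a raster DEM of elevations into a triangulated shell surface of the kind a CFD preprocessor could mesh against. The grid spacing and synthetic terrain are assumptions for illustration.

```python
# Hedged sketch: convert a regular-grid DEM of elevations into a triangulated
# shell surface (vertices plus triangle indices). Not tied to any CFD package.
import numpy as np

def dem_to_shell(z, dx=30.0, dy=30.0):
    """Return (vertices, triangles) for a DEM of elevations z[ny, nx]."""
    ny, nx = z.shape
    X, Y = np.meshgrid(np.arange(nx) * dx, np.arange(ny) * dy)
    vertices = np.column_stack([X.ravel(), Y.ravel(), z.ravel()])

    tris = []
    for j in range(ny - 1):
        for i in range(nx - 1):
            v00 = j * nx + i          # split each grid cell into two triangles
            v10, v01 = v00 + 1, v00 + nx
            v11 = v01 + 1
            tris.append((v00, v10, v11))
            tris.append((v00, v11, v01))
    return vertices, np.array(tris)

# Example with synthetic terrain; a real run would load gridded DEM data instead.
z = 100.0 * np.exp(-((np.arange(50)[:, None] - 25) ** 2
                     + (np.arange(50)[None, :] - 25) ** 2) / 200.0)
verts, tris = dem_to_shell(z)
print(verts.shape, tris.shape)
```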

This plot shows the simulated 3D velocity field for Storglaciären, a small valley glacier in northern Sweden (underlay photo by R. Hock). As the ice flow vectors show, the velocity is greater towards the glacier's upper surface.

To simulate the heterogeneous real-world earth, Masterlark builds his 3D models by assigning different material properties to different regions using the Abaqus soils module (a subset of the materials database). Today such models depend on inverse analysis – working backwards after an event to estimate the characteristics of fault slip at depth, which we cannot observe directly. The promise of forward models driven by the estimated fault slip is that they will predict the location and timing of future events, such as a rupture beneath the ocean floor that might deform the seabed and generate a tsunami.
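
The flavour of such an inverse analysis can be shown with a heavily simplified sketch: assume, purely for illustration, that surface displacements depend linearly on slip over a set of fault patches through a Green's function matrix (here filled with random numbers; in practice it would come from an elastic model such as the FEA mesh), and recover the slip by regularised least squares.

```python
# Minimal illustrative slip inversion: d = G s with Tikhonov regularisation.
# G is a random placeholder for the real Green's functions.
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_patches = 200, 40
G = rng.normal(size=(n_obs, n_patches))           # placeholder Green's functions
s_true = np.zeros(n_patches)
s_true[10:20] = 2.0                                # metres of slip on a few patches
d = G @ s_true + 0.05 * rng.normal(size=n_obs)     # noisy 'observed' displacements

# Minimise ||G s - d||^2 + lam^2 ||s||^2 via an augmented least-squares system.
lam = 1.0
A = np.vstack([G, lam * np.eye(n_patches)])
b = np.concatenate([d, np.zeros(n_patches)])
s_est, *_ = np.linalg.lstsq(A, b, rcond=None)

print("max estimated slip:", s_est.max())
```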

Graphics cards as accelerators

One company that is helping to speed up models of oil and gas fields is Acceleware. This software company addresses vertical markets, including oil and gas, by combining a team of experts in a given application area – here geophysics – with hardware experts who exploit the latest computer technology. In particular, they are creating code for GPUs (graphics processing units) that contain 128 cores and 1.5GB of memory. They have libraries for finite-difference techniques, along with matrix solvers for accelerating finite-element analysis. For oil and gas exploration, the company sells several products: one for accelerating the modelling of interactions between electromagnetic waves and material structures, and two for speeding up the imaging of subsurface geological structures from recorded seismic data. Electromagnetic radiation, notes CTO Ryan Schneider, can illuminate hydrocarbons below the surface, whereas seismic data shows primarily just rock formations; by mapping areas of differing resistivity, EM techniques can reveal pockets of oil and gas.

This software is not only sold on its own; Acceleware also works with major oil companies, who take this core – which covers 80 to 90 per cent of the basic seismic or EM post-processing modelling technology – and then add their proprietary techniques. The software is optimised to run on massively multi-core accelerators; in fact, the graphics card developer Nvidia is a part owner of the company. In performing a pre-stack time migration, a popular method in seismic data processing, the core algorithm runs 15 to 20 times faster on a server with a GPU than on a four-core server; once disk I/O and preprocessing are included, the overall speed increase is still 8x.
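
Those two figures are consistent with a simple Amdahl's-law estimate, sketched below; the assumption that roughly 93 per cent of the original runtime sits in the accelerated kernel is an illustrative guess, not an Acceleware number.

```python
# Back-of-envelope sketch: why a 15-20x kernel speedup becomes roughly 8x once
# disk I/O and preprocessing (which the GPU does not touch) are included.
def overall_speedup(kernel_speedup, accelerated_fraction):
    """Amdahl's law: only the accelerated fraction of the runtime gets faster."""
    return 1.0 / ((1.0 - accelerated_fraction) + accelerated_fraction / kernel_speedup)

for k in (15, 20):
    # assume ~93 per cent of the original runtime is spent in the migration kernel
    print(f"kernel {k}x faster -> {overall_speedup(k, 0.93):.1f}x overall")
```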

From as slow as a glacier…

A good example of geophysical events that take a long time to develop is the movement of glaciers. While such models have existed for a long time, they don't take every effect into account, so researchers are looking at ways to refine them. For instance, explains Andy Aschwanden, a research scientist at the Institute for Atmospheric and Climate Science at ETH Zurich, glaciers can contain two kinds of ice: temperate ice, which is at the pressure melting point, contains a small fraction of liquid water and does not freeze to the glacier bed; and cold ice, which is several degrees below the freezing point and does freeze to the bed. Both types of ice can flow, but cold ice flows much more slowly. In addition, current ice-sheet models cannot capture large changes such as the dramatic increase in flow velocity in outlet glaciers that drain ice into the ocean. Today, most glacier models take the simplified approach that all the ice is cold ice, and they solve for temperature throughout the mass. And while most Arctic ice is cold ice, the climate is changing; Greenland, for instance, is warming, which creates more meltwater, which in turn lubricates the glacier bed.

To account for both cold and temperate ice in a single model, Aschwanden is taking a different approach: using the software Comsol Multiphysics, he treats enthalpy as the underlying field variable, which allows him to model polythermal glaciers containing both temperate and cold ice. The model considers two sources of heat: frictional heating due to strain as the ice flows, and the geothermal heat flux arising from the earth's interior. He treats ice as a fluid, albeit one flowing on a very long time scale, and applies many concepts from fluid dynamics that are comparable to those used for metals near the melting point; in fact, he has borrowed some research from metallurgy and applied it to glaciers. When the water content is below one to two per cent, a single momentum equation suffices; above that, the weight of the water becomes important and the glacier starts to behave like a porous medium. However, it is very difficult to perform experiments on water flow within a glacier, so the modelling takes on even more importance.
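
The core idea can be sketched in a few lines (this is an illustration of the enthalpy formulation in general, not Aschwanden's Comsol model): a single enthalpy value maps either to a temperature below the melting point (cold ice) or to a liquid water fraction at the melting point (temperate ice), so one field covers both regimes. The material constants are standard textbook values and the pressure dependence of the melting point is ignored.

```python
# Minimal sketch of the enthalpy formulation for polythermal ice.
C_ICE = 2009.0      # specific heat capacity of ice, J/(kg K)
L_FUSION = 3.34e5   # latent heat of fusion, J/kg
T_MELT = 273.15     # melting point, K (pressure dependence ignored here)

def enthalpy_to_state(H, H_ref=0.0):
    """Return (temperature in K, liquid water fraction) for enthalpy H (J/kg).

    H_ref is the enthalpy of ice at the melting point; below it the ice is cold,
    above it the surplus goes into melting and shows up as water content."""
    if H < H_ref:                       # cold ice: all enthalpy is sensible heat
        return T_MELT + (H - H_ref) / C_ICE, 0.0
    else:                               # temperate ice: the excess is latent heat
        return T_MELT, (H - H_ref) / L_FUSION

print(enthalpy_to_state(-10000.0))      # a few degrees below melting, no water
print(enthalpy_to_state(5000.0))        # at the melting point, ~1.5% water
```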

To make it easier for the average geologist to set up and evaluate mathematical models, Comsol sells the Earth Sciences Module, an add-on to its core package. This module prepackages equations frequently used in these applications so they are more accessible to end users. It also comes with a Model Library containing ready-to-run examples complete with the full modelling code. The examples fall into five groups: fluid flow, solute transport, flow and deformation, heat transfer, and multiphysics.

…to as fast as an avalanche

The previous section looked at snow and ice moving very slowly, but modelling is also important for the opposite situation – an avalanche, where snow moves at speeds of 200 km/h and more. Avalanches can have tremendous power; snow travelling at these speeds can generate 'projectile-like' impact pressures as high as 100 tons/m2, although the pressures are typically on the order of 30-40 tons/m2. As Dr Perry Bartelt says: 'An avalanche is a very interesting system. Within 60 seconds one can travel several kilometres. It starts from rest and ends at rest, but in between it generates tremendous kinetic energy with considerable random activity. Snow avalanches, unlike debris flows or large rock avalanches, are ideal systems to study because they have almost repeatable conditions – snow gathers every winter and a slope discharges every winter, so modellers have multiple chances to verify their codes.'
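
A back-of-envelope stagnation-pressure estimate, p ≈ ρv², reproduces the order of magnitude of those figures if one assumes a dense-flow density of 200-300 kg/m3 (an assumed typical range, not a figure from Bartelt):

```python
# Rough check of the quoted impact pressures, assuming p ~ rho * v^2 and an
# assumed dense-flow density; purely illustrative arithmetic.
for rho in (200.0, 300.0):                # flow density, kg/m^3 (assumed)
    v = 200.0 / 3.6                        # 200 km/h in m/s
    p = rho * v * v                        # stagnation-pressure estimate, Pa
    print(f"rho = {rho:5.0f} kg/m^3 -> p ~ {p / 9.81 / 1000:.0f} t/m^2")
```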

In previous generations, farmers used knowledge handed down from their forefathers to judge where to build their farms and villages so they wouldn't be affected by avalanches. Today, however, modern society is pushing growth in vacation locales, especially near ski areas, and developers want the most picturesque views possible, sometimes in dangerous locations. In the 1990s the Swiss federal government laid the legal basis that gives local governments the responsibility of considering natural threats when zoning land and issuing building permits. These towns work from a 'hazard map' that, in the case of avalanches, shows where a building may not be erected (red zone), areas where new buildings must have special protection against avalanches (blue zone), and areas deemed to have little risk of avalanches (white zone). These maps are drawn up by consulting engineers, largely with the help of code developed at the Swiss Federal Institute for Snow and Avalanche Research (SLF) in Davos, which is used to predict avalanche runout distances and velocities.

In the past, engineers worked with a 2D program that shows a cross-section of an avalanche, but SLF researchers are nearing completion of a 3D modelling program called RAMMS (Rapid Mass Movements). Inputs to RAMMS start with a DEM (digital elevation model) of the area, plus the possible release zones – areas where avalanches are likely to start. These, for instance, lie at angles between 28 and 55 degrees; any flatter and snow is unlikely to flow, any steeper and snow is unlikely to gather in any volume. The model also accounts for the depth of old snow and of the new snow that is the likelier candidate for forming part of an avalanche. A final input considers the vegetation in the area; stands of trees, for instance, not only prevent large volumes of snow from gathering, but can also help to brake avalanches.
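
As an illustration of that slope criterion (and nothing more – RAMMS itself does far more than this), the sketch below flags the cells of a DEM whose surface slope lies between 28 and 55 degrees as candidate release zones, using a synthetic hillside in place of real terrain.

```python
# Sketch: mark DEM cells whose slope falls in the 28-55 degree window.
import numpy as np

def potential_release_zones(z, cell=25.0, lo=28.0, hi=55.0):
    """Boolean mask of cells with surface slope in [lo, hi] degrees."""
    dzdy, dzdx = np.gradient(z, cell)                 # elevation gradients
    slope_deg = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))
    return (slope_deg >= lo) & (slope_deg <= hi)

# Synthetic hillside: elevation falling at a uniform 35 degrees.
y = np.arange(100) * 25.0
z = np.tile((2500.0 - np.tan(np.radians(35.0)) * y)[:, None], (1, 80))
mask = potential_release_zones(z)
print(mask.mean())                                    # fraction of candidate cells
```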

The algorithms that evaluate these inputs incorporate new ideas with roots in turbulence theory. The turbulence here does not consist of eddies, but rather of the random motion of hard snow granules with the density of solid ice. It appears that the interaction between the granules and the terrain at the front of the avalanche generates the turbulence, while towards the tail of the flow the turbulence decays. The finite-volume code works with the depth-averaged Navier-Stokes equations for the flow, with an additional equation that describes the generation and dissipation of this random energy and also accounts for friction. The user interface for the code, developed by research scientist Marc Christen, uses the iTools component framework for the IDL language, developed by ITT Visual Information Solutions. To make the results easier to interpret, it is also possible to superimpose them on a map from Google Maps.
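
RAMMS solves the full depth-averaged equations, but the role of the friction terms can be illustrated with a far simpler point-mass model of the Voellmy type, sketched below with assumed parameter values: a dry-friction coefficient slows the flow in proportion to its weight, while a velocity-squared term mimics the turbulent drag, and together they set the terminal velocity.

```python
# Much-simplified point-mass sketch with Voellmy-type friction (not RAMMS).
import math

g = 9.81
mu, xi = 0.2, 2000.0           # dry-friction and turbulent-friction parameters (assumed)
h = 1.5                         # flow depth, m (assumed)
theta = math.radians(30.0)      # slope angle (assumed)

v, t, dt = 0.0, 0.0, 0.1
while t < 60.0:                 # the 'within 60 seconds' window quoted above
    a = g * (math.sin(theta) - mu * math.cos(theta)) - g * v * v / (xi * h)
    v = max(v + a * dt, 0.0)    # gravity minus friction; velocity never negative
    t += dt

print(f"velocity after 60 s: {v:.1f} m/s (~{v * 3.6:.0f} km/h)")
```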

This data is used for more than creating hazard maps. With the results, engineers can examine impact pressures to help dimension ski-lift masts, determine the size and location of avalanche dams and snow nets, calculate the contribution of forested areas to stopping avalanches, and locate release zones over a large area such as an entire mountain range.


