Follow the money

Felix Grant remains intrigued with financial modelling

When I wrote ‘Getting fisical with phynance’[1], just over a year ago, I made no secret of the fact that in finance I was a stranger in a strange land, surprised by what I found. In fact, that was my brief: to go into a new field with fresh eyes, and report back on how it looked to me at first sight. I didn’t expect that to be more than a one-off visit but, to my surprise, I found myself intrigued.

Interest, even with a modelling background, doesn’t make an expert; but it has kept me travelling, and knocking on doors. At that time I was more concerned with ideas than with effects, but you can’t spend very long in the field – even as a tourist – without becoming aware of the same trends that are visible everywhere in science.

Human beings are moving out of the computational areas to concentrate on strategic management, while black box systems crunch ever-greater quantities of data at ever-greater speeds at the applied sharp end.

One IBM report in 2006[2] described the development of such systems as an arms race, and these new machine traders as ‘akin to robots with machine guns’; another[3] predicted that only one human trader in 10 will survive the next seven years. London’s Stock Exchange Electronic Trading Service (SETS) is already handling more than 65 per cent of all trades as I write this, and rising, with the trade volume rapidly increasing as well. Aly Kassam, a former broker now working for The MathWorks, mentions tick-based systems passing data every millisecond in half-megabit packages. Science may use less bellicose language than finance, but the parallels are clear. High-volume analysis at high speed is crucial to most areas these days, from mapping galaxies to modelling genomes.

On the human side, demand continues to grow for science graduates to feed this expansion – and so does the requirement for specific and relevant software skills. Steve Wilcockson, financial applications manager at The MathWorks, points to anecdotal evidence from public job advertisements: requests for Matlab experience in finance-sector recruitment quadrupled between 2006 and 2007, and doubled again from last year to this.

Close relationships between the finance sector and software vendors with science or engineering backgrounds tend to go back about 10 or 15 years. Kassam traces strong Mathworks links over that period – with banks in particular. Igor Hlivka, of Mitsubishi UFJ Securities International (MUSI) Quantitative Analytics Group, describes the use of Maple on a similar timescale. In both cases the vendor brings to the table methods developed through experience in other areas, from pure academic mathematical research to mission-critical aerospace applications, ready-made and ready-tested.

Wilcockson points out that the institutions are buying into ‘not just a numerical environment, but a platform, quick to deploy within trading analytic and pricing environments alongside other programming tools. Big drivers are application development speed for adaption to market and deployment.’ Over at MUSI, Hlivka makes similar points, citing the advantages of ‘powerful, robust, flexible, intuitive and user-friendly’ software with the ‘ability to prototype models quickly and effectively … for implementation work on other programming platforms’.

Kassam highlights the flow of expertise which, despite cut-throat competition in the sector, produces cross-fertilisation. Practitioners within banks move out and set up independent operations using innovative methods; the banks move to adapt their in-house systems in response, seeking to import the observed benefits of that innovation back into their own operations. He emphasises the primacy in this field of execution speed, countering talk of limits with examples of how it can be spread to ever greater market footprint (‘10⁸ Monte Carlo simulations instead of 10⁵, for instance, has to improve your reliability’) even if the effectiveness of linear application falls off. He also highlights the extent to which this demand for speed is increasingly technology based: from grid and 64-bit platforms to embedded DSP or field programmable gate array (FPGA) boards that hive off irrelevant load away from core processors.
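Kassam’s point about sample counts is the standard Monte Carlo convergence argument: the error of an estimate shrinks roughly as one over the square root of the number of samples, so a thousand-fold increase in simulations buys only about a thirty-fold tightening. The toy sketch below (estimating π rather than any real pricing quantity, purely for illustration) makes the trade-off concrete:

```python
import math
import random

def mc_pi(n, seed=0):
    """Estimate pi by Monte Carlo: the fraction of random points in the
    unit square that fall inside the quarter-circle approaches pi/4."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * hits / n

# Error shrinks roughly as 1/sqrt(n): a thousand-fold increase in samples
# buys roughly a thirty-fold tighter estimate, at a thousand-fold cost.
for n in (10 ** 3, 10 ** 5):
    est = mc_pi(n)
    print(n, est, abs(est - math.pi))
```

The same economics explains the appeal of grids and FPGAs: if accuracy only improves with the square root of the sample count, raw throughput is the lever that matters.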

Banks and finance houses are obvious examples of quantitative finance applications, but not the only ones; one Matlab case study concerns Monte Carlo-based projection of analytic results, drawn from its databases, onto future estate-investment yield and risk estimates by Investment Property Databank (IPD). Work started in Excel, for its familiar interface, then moved to Matlab using the Optimisation Toolbox, with data exchanged through Excel Link. The final artefact, a COM object generated through Matlab Builder, is accessed from Excel without the need for a Matlab installation.
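IPD’s actual model is not public, but the general shape of such a projection – simulate many plausible futures for a yield, then read off a central estimate and a downside quantile as the risk figure – can be sketched in a few lines. Everything here (the random-walk dynamics, the drift, volatility and horizon parameters) is a hypothetical stand-in, not IPD’s method:

```python
import random
import statistics

def project_yields(current_yield, drift, vol, years, n_paths, seed=1):
    """Crude Monte Carlo projection of an investment yield: each path
    applies a normally distributed annual shock around a fixed drift."""
    rng = random.Random(seed)
    finals = []
    for _ in range(n_paths):
        y = current_yield
        for _ in range(years):
            y += drift + rng.gauss(0.0, vol)
        finals.append(y)
    return finals

# Illustrative parameters only: 5% starting yield, 5-year horizon.
paths = project_yields(current_yield=5.0, drift=0.1, vol=0.5,
                       years=5, n_paths=10_000)
mean = statistics.fmean(paths)
downside = statistics.quantiles(paths, n=20)[0]  # 5th percentile as a risk estimate
```

In the case study the equivalent logic lived in Matlab, compiled to a COM object so that analysts could drive it entirely from the familiar Excel front end.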

The flow of expertise is not all a one-way street from science to finance. Salim Iqbal, for instance, an agronomist working on fracture plane shifts in fertility and toxicity in soil and irrigation under changing patterns of climate and pollution, recently shifted to working with Maple. Justifying the upheaval involved, he cites many of the same characteristics as Hlivka, including the recently rewritten statistics package and the handling of jump diffusion PDEs – both of which have benefited from use in the financial sector. ‘I’m not a mathematician,’ comments Iqbal, ruefully, ‘and I don’t do calculus for fun – but I learnt these first- and second-order derivative methods from a mathematical finance colleague, and ported them over along with the hassle-free implementation.’
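The jump-diffusion processes Iqbal mentions combine continuous Brownian motion with occasional discontinuous shocks – exactly the structure that makes both market crashes and sudden soil-chemistry shifts awkward for plain diffusion models. As a flavour of the underlying process (simulating the stochastic path directly, rather than solving the associated PDE as Maple does), here is a minimal Euler-scheme sketch of a Merton-style jump diffusion; all parameter values are illustrative:

```python
import math
import random

def jump_diffusion_path(s0, mu, sigma, lam, jump_mu, jump_sigma,
                        t, steps, seed=2):
    """Euler scheme for a Merton-style jump diffusion: geometric Brownian
    motion punctuated by Poisson-arriving, lognormally sized jumps."""
    rng = random.Random(seed)
    dt = t / steps
    s = s0
    path = [s]
    for _ in range(steps):
        # Continuous diffusion component.
        s *= math.exp((mu - 0.5 * sigma ** 2) * dt
                      + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0))
        # Jump component: Poisson arrival approximated by one Bernoulli
        # draw per step (valid when lam * dt is small).
        if rng.random() < lam * dt:
            s *= math.exp(rng.gauss(jump_mu, jump_sigma))
        path.append(s)
    return path

# One year of daily steps; on average one downward jump every two years.
path = jump_diffusion_path(s0=100.0, mu=0.05, sigma=0.2,
                           lam=0.5, jump_mu=-0.1, jump_sigma=0.15,
                           t=1.0, steps=252)
```

The first- and second-order derivative methods Iqbal ported are the pricing counterpart: finite-difference stencils applied to the PDE that this process generates.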

Despite the emphasis on speed and pragmatism, conceptual developments have, of course, neither gone away nor become less relevant. In finance as much as in science there is a limit to how far speed and volume can usefully grow without new advances to shape their application. Theoretical work continues apace, alongside all this hectic spiral of capacity. Of particular interest to me, for various reasons, is the hidden Markov modelling of financial time[4].
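A hidden Markov model of this kind assumes the market moves between unobserved regimes – calm and volatile, say – each with its own statistics for what we actually see. The forward algorithm then gives the likelihood of an observed sequence under the model. The sketch below is a generic discrete HMM with made-up regime numbers, not the scenario-generation model of reference [4]:

```python
def hmm_forward(obs, pi, A, B):
    """Forward algorithm: likelihood of an observation sequence under a
    discrete hidden Markov model (pi: initial state probabilities,
    A[r][s]: transition probabilities, B[s][o]: emission probabilities)."""
    alpha = [pi[s] * B[s][obs[0]] for s in range(len(pi))]
    for o in obs[1:]:
        alpha = [B[s][o] * sum(alpha[r] * A[r][s] for r in range(len(pi)))
                 for s in range(len(pi))]
    return sum(alpha)

# Two hidden regimes ('calm', 'volatile') emitting discretised daily moves:
# symbol 0 = small move, 1 = large move. All numbers are illustrative.
pi = [0.8, 0.2]
A = [[0.95, 0.05],      # calm regimes persist
     [0.10, 0.90]]      # volatile regimes persist too, a little less so
B = [[0.9, 0.1],        # calm regime rarely produces large moves
     [0.3, 0.7]]        # volatile regime usually does
likelihood = hmm_forward([0, 0, 1, 1, 1], pi, A, B)
```

Fitting such a model to real return series (via Baum-Welch or similar) is what turns it into a scenario generator: sample regime paths first, then observations within each regime.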

Elsewhere[5], published as I was writing this, a fascinating copula-based, fractionally integrated, autoregressive model for transaction processes is applied to historical trading data drawn from New York Stock Exchange as a means of investigating granular hypotheses. A recent paper[6] proposes fixed-width, simulation-based confidence intervals across multiple risk expectation maxima.

Another[7], in the same journal a month later and of particular topicality in the current recessionary economic climate, develops a queue-based structural risk model for economies dependent upon mutual credit, which it then demonstrates within a specific industry. Also topical (though unfortunately not publishable in any detail) are forensic investigations employing sophisticated quantitative finance derived from biophysics, into money-laundering and suspected sub-prime mortgage delinquency.

But perhaps the most compelling link of all between applications of data analysis, in physical and financial arenas, is not to be found in the methodologies carried over from one to other. The really striking thing that shows through in conversations with graduates working on both sides of the fence is the identical language of intellectual challenge that they use. In the words of one young mathematical physicist working in hedge funds: ‘Euros are just as much fun as quarks, and they are better funded.’


1. Grant, F., Getting fisical with phynance. Scientific Computing World, 2007. Europa Science: Cambridge. ISSN 1356-7853.

2. Bear, K., et al., Tackling latency: the algorithmic arms race. 2006, IBM United Kingdom Limited: London.

3. Dence, S., D. Latimore, and J. White, The trader is dead, long live the trader! A financial markets renaissance. 2006, IBM Institute for Business Value.

4. Messina, E. and D. Toscani, Hidden Markov models for scenario generation. IMA Journal of Management Mathematics, 2007.

5. Nolte, I., Modeling a Multivariate Transaction Process. J. Financial Econometrics, 2008. 6(1): p. 143-170.

6. Lesnevski, V., B.L. Nelson, and J. Staum, Simulation of Coherent Risk Measures Based on Generalized Scenarios. Management Science, 2007. 53(11): p. 1756-1769.

7. Cossin, D. and H. Schellhorn, Credit Risk in a Network Economy. Management Science, 2007. 53(10): p. 1604-1617.


Maplesoft info@maplesoft.com

MathWorks info@mathworks.co.uk

Mitsubishi UFJ Securities International enquiries@int.sc.mufg.jp
