Going with the flow
Peter Coveney, honorary professor of computer science, University College London
At one time there were only two types of scientist: the experimentalists and the theoreticians. Some argue that the use of computational models has led to the emergence of a third class of scientist, somewhere between the two. Peter Coveney was an early pioneer of using computational methods to connect the behaviour of matter on a small scale to the observed macroscopic properties of fluids. This has set him on the road to making simulations so large in scale that they can describe the behaviour of an entire system. From his first simulations of fluid flow for the oil industry he has moved on to simulations of blood flow in the brain and, he hopes, eventually to the whole human body.
Along the way he has had to fight to get the computer power he needs, which has brought him into the political world of resource allocation on an international scale. In this he has proved persuasive and successful. He is prepared to tackle the difficult issues that the new generation of computing grids are throwing up, such as who gets what access when, and who should pay for it. He brings the same level of skill to these problems as he does to his science.
He believes that we are only just beginning to see the potential of modelling in many fields of science and is able to successfully argue for more resources for high performance computing that will benefit all areas of science, not just his discipline.
Coveney is Professor of Physical Chemistry at University College London and founding director of its interdisciplinary Centre for Computational Science, as well as Honorary Professor of Computer Science at UCL.
He is also well known as the joint author of two well-respected popular science books, The Arrow of Time and Frontiers of Complexity, which he hopes will help inspire the next generation of scientists to achieve even more.
Professor Bruce Boghosian, Chair of Mathematics at Tufts University, has worked closely with Coveney since 1993 when Coveney joined Schlumberger Research and Boghosian worked for Thinking Machines. Boghosian is now a visiting fellow at UCL and they still collaborate.
Boghosian says: ‘Peter thinks very deeply about things. He is very good at problem solving. He is good at critically understanding what is preventing a solution to a problem, getting to the core of it, fixing it and getting the results. That depth is true, both scientifically and administratively. All of the judgements that he makes are informed by a very deep understanding of the statistical physics involved and the dynamics involved, at a very fundamental level. He is focused on very large simulations, it’s true, but specifically on running them to get scientific answers to fundamental scientific problems.
‘The same is true administratively. Sometimes people who are very good scientists are not very good administrators and vice versa. But Peter is both. He seems to apply the same problem solving skills in both arenas. If some administrative issue arises, for example the way that time is allocated on a supercomputer, grid or network, he is able to see what he needs and explain why the resources he needs are critical to solving the problem and why the problem is important. He can explain this to the decision makers who are then able to get him those resources.
‘He has a way of cutting down to the problems that need to be solved, but with a sense of humour that means he is listened to.

‘He has been very successful at getting resources and part of it is the instinct he has for explaining things to people and why it’s important. It is the same skill that he has brought to his popular science books. He can turn his research into a story that people can understand. He is as happy talking about supercomputer account allocation policies as he is talking about the fundamentals of the universe.’
Boghosian says that Coveney has made a significant contribution to connecting the small scale world to the large scale world, for example in fluid dynamics. He says: ‘The Lattice Boltzmann models he uses are built on top of fundamental kinetic equations, and the models of fluid dynamics emerge from this. The Navier-Stokes equations are difficult to simulate, so you go one level down, and Navier-Stokes behaviour emerges for the same reasons it emerges in a real fluid. We know from kinetic theory that all you need is particles colliding and conserving momentum, and if you look at them in bulk you get fluid dynamics.
‘He is both a creator of the algorithms and models and he is also running some of the largest simulations in the world using those models.’
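The emergence Boghosian describes can be illustrated with a minimal sketch, which assumes nothing about Coveney’s actual codes: a toy D2Q9 lattice Boltzmann model in which particle populations stream to neighbouring sites and relax toward a local equilibrium, conserving mass and momentum, so that fluid-like behaviour appears in bulk.

```python
import numpy as np

# Toy D2Q9 lattice Boltzmann model (illustrative only, not Coveney's code).
# Nine particle populations per lattice site stream along discrete velocities,
# then a BGK collision relaxes them toward a local equilibrium. Because both
# steps conserve mass and momentum, Navier-Stokes behaviour emerges in bulk.

# D2Q9 velocity set and standard weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

def equilibrium(rho, ux, uy):
    """Second-order equilibrium distribution for density rho, velocity (ux, uy)."""
    cu = c[:, 0, None, None] * ux + c[:, 1, None, None] * uy
    usq = ux**2 + uy**2
    return w[:, None, None] * rho * (1 + 3 * cu + 4.5 * cu**2 - 1.5 * usq)

def step(f, tau=0.8):
    """One streaming + BGK collision step on a periodic grid."""
    # streaming: shift each population along its lattice velocity
    for i, (cx, cy) in enumerate(c):
        f[i] = np.roll(np.roll(f[i], cx, axis=0), cy, axis=1)
    # macroscopic moments (density and momentum) from the populations
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    # collision: relax toward local equilibrium with relaxation time tau
    f += (equilibrium(rho, ux, uy) - f) / tau
    return f, rho

# initialise a small periodic grid with a density bump and let it evolve
nx, ny = 32, 32
rho0 = np.ones((nx, ny))
rho0[16, 16] = 1.2
f = equilibrium(rho0, np.zeros((nx, ny)), np.zeros((nx, ny)))
mass0 = f.sum()
for _ in range(50):
    f, rho = step(f)
# total mass is conserved throughout the run
assert abs(f.sum() - mass0) < 1e-8
```

Because streaming is a pure shift and the collision relaxes toward an equilibrium carrying the same density and momentum, the conservation laws from which fluid dynamics emerges hold at every step; this locality is also why, as Coveney notes below, such models parallelise so well.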
Coveney was born in Ealing, West London. His parents were both linguists and the family travelled a lot; at one stage his father was a translator at the United Nations in New York. His father then took an academic job at the University of Bath and Coveney spent his teenage years in the city. He was very sporting, playing football and cricket (he later played football for Oxford University). He became interested in science at the age of 11 or 12, when he started wondering why things happened, such as why water evaporates. He also had a go at some amateur experimental chemistry in his garden shed. He did well at school, particularly in chemistry, and won a scholarship to Lincoln College, Oxford. One of his teachers there was the theoretical chemist Peter Atkins, who has written many standard textbooks on chemistry and is a well-known character in the field.
He finished his degree with a year of research under Professor Patrick Sanders in the Clarendon Laboratory physics department. His dissertation was on fundamental interactions between elementary particles in diatomic molecules, which he describes as an unconventional research project for a chemistry graduate. It won him a year’s fellowship at Princeton, where he dallied with the idea of getting involved in particle physics. But a scholarship drew him back to Oxford and to chemistry, albeit at a very fundamental level.
He obtained his DPhil degree and stayed at Oxford as a junior research fellow. At this point he started a strand of research that has become very much a theme of his life’s work, which is trying to relate the properties of atoms and molecules to the macroscopic behaviour of matter, particularly the direction of time and statistical mechanics. This took him to the Free University of Brussels, where he had a fellowship to work with Nobel Chemistry Laureate Viscount Ilya Prigogine, who was also interested in the concept of irreversibility.
He was later to co-author a popular science book called The Arrow of Time with a former Oxford colleague, Roger Highfield, who went on to become science editor of The Daily Telegraph, a leading UK national newspaper. This book is regarded as credible by fellow scientists as well as being a commercial success.
He started looking for a permanent academic position. He says: ‘This was Thatcher’s Britain and so there was rather a dearth of openings for academics in the permanent domain.’ He spent three years at the University of Wales, Bangor, during which he also worked on his book. It was a tenured position, though about six months after he joined, tenure was abolished in the UK. He also became interested in industrial research and did some consulting for Schlumberger.
He started thinking about his future and considered moving to the US but his wife did not want to abandon her career working for a large UK oil company. Coveney decided to join Schlumberger Research in Cambridge in 1991 when his wife took a job in the London headquarters and they have lived in London since then.
He says: ‘Schlumberger is always promoting the use of technology in oil exploration and has always spent the same proportion of its profits on research, which goes back to the days of its founders, the Schlumberger brothers. That experience turned me on to the potential of computing, because the company did everything with computers. I learned what was possible with computers.’ The company had bought a massively parallel Thinking Machines Connection Machine (CM-5), and it was then that Coveney met Boghosian, at the time one of Thinking Machines’ senior scientists. The two began using the machine to develop lattice gas models of fluid dynamics, recovering the macroscopic properties of a fluid from the behaviour of microscopic particles on a lattice.
He says: ‘This type of model is particularly amenable to parallel computing; you can build very large scale models of fluids. You define very simple local rules and the emergent properties can be quite organised. We were able to design some quite complex multicomponent fluid models. I was being encouraged to do all this work and given all the resources I needed. I didn’t need to spend my time with grant applications, fighting with other people for resources, and it was a great opportunity to do the sort of things that I wanted to do. It was a very exciting time for me.
‘Schlumberger was also very keen on visualisation and I always want to see the results of these models. I also liked the concept of computational steering, to change the parameters while the simulation is running. I was able to initiate all these things at Schlumberger.’
In 1995 he published another book with Highfield, Frontiers of Complexity, which explored the subject of complexity from mathematics, number theory, game theory and computer systems through to the latest research on complex biological systems.
Coveney still had ambitions to return to academia – he was supervising PhD students based at the Theory of Condensed Matter Group in the adjacent Cavendish Laboratory on wide-ranging topics – and funding was starting to become available. In 1999 he was offered a chair in physical chemistry at Queen Mary College, London, which had become available. It was more convenient than commuting to Cambridge, and the other positions available were outside the capital. He spent three years there, and they proved eventful: he received a large government grant to run ‘RealityGrid’, a project investigating distributed high performance computing.
He says: ‘I have been fortunate enough to get a lot of funding from the EPSRC, and e-Science is what I am interested in, which is advancing science through computers. I do a lot of different things, but the unifying concepts are those of computational science, applied in a lot of different areas. E-Science was about getting access to a large number of computational resources and being able to orchestrate your interactivity with them more easily and quickly, because of the infrastructure in place. Through Bruce Boghosian I have been lucky enough to have access to the US TeraGrid.
‘There is an intoxicating vision of what the Grid should be, but that remains a long way from where we are today. Going from where we are to where we want to be is pretty difficult, because it requires all sorts of people to cooperate to create mechanisms for users to interact with computing resources in ways that they want, which is different from the way these things are traditionally managed.’
Coveney set up a Centre for Computational Science at Queen Mary to look at ways that his approach to modelling could be applied to wider fields, particularly the emerging world of computational biomedicine.
As much as he loved Queen Mary, after three years a position came up at University College London. He says: ‘I moved here because the general level of excellence in academic research and the culture of working across disciplines are very strong, and that culture is more aligned to what I do. It has a close relationship with several hospitals.’
As a bigger institution, UCL had more resources and infrastructure support. It was also a more famous research institution, so he realised his work would pack more punch. His research group upped sticks and transferred to UCL in 2002, where it has grown considerably and become involved in larger and larger modelling projects. The availability of larger grids has made ‘whole system’ modelling feasible for the first time in recent years, and Coveney has been at the forefront of this work.
He has become involved in several major biomedical projects, including one to model the blood flow in a human brain using CT data. He has recently assumed a leadership role in the European Union’s new Virtual Physiological Human initiative (2008-2013).
One outcome is a new class of application called ‘urgent computing’, for areas such as clinical decision support. A doctor may be able to use a computer model to help diagnose a patient who is about to have a stroke and, if the calculations can be done quickly enough, intervene to prevent it. This requires a protocol to pre-emptively monopolise a grid for a short period, but only occasionally. In the US the approach has been proposed for disaster prediction and recovery. It is one of many access and accounting issues that grid computing has thrown up, quite apart from the technical problems of fully utilising the grid structure.
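The scheduling idea behind urgent computing can be sketched as a toy priority queue in which a rare urgent job pre-empts routine batch work. The job names and the policy here are invented for illustration; real grid schedulers negotiate pre-emption through far more elaborate protocols.

```python
import heapq

# Toy sketch of urgent-computing pre-emption (hypothetical names and policy):
# a priority queue in which an urgent job, such as a clinical decision-support
# run, is dispatched ahead of routine batch simulations already waiting.

class ToyScheduler:
    URGENT, ROUTINE = 0, 1  # lower number means higher priority

    def __init__(self):
        self._queue = []
        self._counter = 0  # tie-breaker preserves submission order

    def submit(self, name, priority):
        heapq.heappush(self._queue, (priority, self._counter, name))
        self._counter += 1

    def next_job(self):
        """Return the highest-priority job; urgent jobs always run first."""
        return heapq.heappop(self._queue)[2] if self._queue else None

sched = ToyScheduler()
sched.submit("batch-fluid-sim", ToyScheduler.ROUTINE)
sched.submit("batch-analysis", ToyScheduler.ROUTINE)
sched.submit("stroke-model", ToyScheduler.URGENT)  # arrives last, runs first
order = [sched.next_job() for _ in range(3)]
# order == ["stroke-model", "batch-fluid-sim", "batch-analysis"]
```

The accounting questions the text raises are exactly what this toy omits: who is entitled to submit at the urgent priority, how often, and who pays for the batch work that gets displaced.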
One of the problems of his large-scale modelling is that it requires access to the world’s biggest computers and the field is becoming another kind of Big Science. This means that in recent times he has had to spend more time on politics and less time on research, but that is the price he is prepared to pay. He has become seriously involved in many of the groups and committees that co-ordinate the government resources and support. He says: ‘It is certainly true that I have to spend a significant amount of my time discussing the nature of and access to these resources, but given the benefits that are emerging we have no choice other than to argue our case.’
He is broadly happy with the support given to high performance computing from the UK government; something that researchers in other fields of Big Science, such as astronomy and particle physics, might not be able to say at the moment.
In fact he believes that there is a certain amount of resentment from other fields about the amount of support high performance computing receives. But he believes that modelling and simulation, in particular the new whole system (integrative) approach, have so much to offer all fields of science at the moment that his principal task is persuading scientists in other fields to embrace this approach.
He says: ‘The main thing is that, given the immense computing power available these days, we are able to address problems outside the conventional domains; but the medical research funders, for example, do not yet have a full understanding of what can be done with these resources, so the opportunities are there to be explored. Some fields of research are not as familiar with using modelling and simulation and so we need to make the case to them.’