Software makes mistakes less costly

Tom Wilkie hears about the human side of software modelling and simulation

In the design of cars and aeroplanes, any mistakes can cost lives and huge sums of money. However, if engineers make those same mistakes in software, before the physical prototype has even been built, then the consequences are much less costly.

This image of software as a playground in which engineers are free to make mistakes – provided they learn from them – was a slightly unexpected perspective to emerge from Matlab Expo, a conference and exhibition organised by MathWorks on 8 October at Silverstone, Northamptonshire, UK. About 900 people registered (although not all of them turned up) to hear a day’s worth of talks intended to illustrate the application of the software to engineering and science.

The case for modelling and simulation in engineering design was put simply by Kevin Daffey, head of electrical power and control systems for Rolls-Royce: ‘You couldn’t design an aircraft engine without it,’ he said. But his overview of the range of applications in which software plays an indispensable role perhaps surprised the audience, many of whom were specialists in niche areas, by its breadth. It ranged from familiar topics such as finite element analysis (FEA) and computational fluid dynamics (CFD) to slightly less obvious areas such as materials engineering. Yet it was important, he said, to be able to predict how materials would behave, and he noted that support was needed from both universities and government to develop the codes that could be used by the whole industry.

For Rolls-Royce, he pointed out, other software models were also important – especially thermo-mechanical modelling. The company had to make business-critical decisions that were often supported by modelling scenarios, so any mistake that got through the design stage could be costly. Designs had to be validated in practice, he continued, but because there are so many parameters that could be varied, and because some of them may interact with each other, Design of Experiments (DOE) was high on his list of essential software.

DOE also figured high in the priorities of Bob Lygoe, who specialises in powertrain calibration computer-aided engineering and optimisation at the Ford Motor Company in Essex, UK. He pointed out that there was a trade-off between engine efficiency and limiting emissions: car engine efficiency had decreased by more than 8 per cent over the past couple of decades as a result of pressure to meet increasingly stringent regulations on the emissions of carbon dioxide. External constraints on the efficiency of car engines could change and, by modifying ‘one factor at a time, we can end up with a non-optimal solution’. Using Design of Experiments methodology, however, would allow engineers to find the Pareto front – the set of choices where it is impossible to improve one parameter without worsening at least one other. By restricting attention to the set of Pareto-efficient choices, a designer can make trade-offs within this set, rather than considering the full range of every parameter.
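The Pareto-front idea Lygoe described can be illustrated with a short sketch. The design names and objective values below are invented for illustration: each candidate design is scored on two objectives to be minimised (here, fuel consumption and an emissions measure), and a design sits on the front only if no other design beats it on both counts.

```python
# Illustrative candidate designs: name -> (fuel consumption, emissions).
# Both objectives are to be minimised; the numbers are made up.
designs = {
    "A": (5.2, 0.40),
    "B": (4.8, 0.55),
    "C": (5.0, 0.45),
    "D": (5.6, 0.35),
    "E": (5.1, 0.50),
}

def dominates(p, q):
    """p dominates q if p is no worse on every objective and
    strictly better on at least one."""
    return (all(a <= b for a, b in zip(p, q))
            and any(a < b for a, b in zip(p, q)))

# Keep only the designs that no other design dominates.
pareto = {
    name: obj
    for name, obj in designs.items()
    if not any(dominates(other, obj)
               for other in designs.values() if other != obj)
}

print(sorted(pareto))  # E is dominated by C, so it drops out
```

In this toy set, design E (5.1, 0.50) is dominated by C (5.0, 0.45), which is better on both objectives; the remaining four designs are mutually incomparable and form the front within which the designer makes trade-offs.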

Ford is working with Land Rover, Johnson Matthey, ITM Power, Revolve Technologies, Cambustion, the Universities of Bradford, Liverpool, and Birmingham, on a research and development project called CREO (for CO2 Reduction through Emission Optimisation) to improve the car engine and after-treatment (exhaust) as a complete system, while achieving a 15 per cent reduction in CO2 emissions. The project includes three independent technologies: the on-board generation of hydrogen to improve combustion and after-treatment efficiency; the use of new techniques to allow multi-objective optimisation of the total powertrain; and a new look at catalyst formulations including changes to take advantage of the above.

The project, Dr Lygoe said, was an exercise in model-based optimisation that looked as if it could yield improvements in fuel economy of between 3 and 5 per cent, in contrast to the previous trend where fuel economy worsened as emissions were reduced. It was, he said, compute-intensive work and may need to be migrated to parallel processing.

But a long view is required to discern the full benefits of model-based simulation and engineering, because they become apparent only over time, according to Sanjiv Sharma, an EADS expert in modelling and simulation methods and tools for Airbus. (EADS has some 4,600 users of Matlab, he said.)

He reminded the conference that although modelling and simulation cut costs by reducing the amount of physical testing that has to take place, it still takes time to build reliable models in software. Time is also needed to engineer the model for reuse, but it is this engineering and simulation that ‘creates knowledge, which informs design decision making and reduces time to market’. This was why, he stressed, one of the benefits of software modelling was to ‘have an environment in which making a mistake is not costly’. But he cautioned: ‘The intended impact takes place over time and the benefits may not be realised in the first product.’ This means that the full benefit may not be apparent until the second product is designed and, in the context of the aircraft industry, that could mean a timescale of eight years or more.

Although Sharma was speaking in a different session from Kevin Daffey, both men highlighted some very similar themes. Sharma stressed the need to have the designers working alongside the modellers, because in this way the knowledge became embedded. In an unconscious echo of Sharma’s theme that it takes time to build up expertise, Daffey pointed out that the cost of licences for the software was only part of the total cost of ownership. There was a cost in developing new routines and in training, he said. Students often came out of university having gained familiarity with Matlab informally, through seeing classmates playing with the software. As a result, he said, 'Students come to us with bad habits and have to be retaught.'

