
The optimisation conundrum


Carlo Poloni, president of Esteco and Engineering Professor at University of Trieste, argues in favour of optimisation

To optimise or not to optimise? This is hardly the question anymore. Numerical optimisation has recently gained momentum among engineering and manufacturing companies, where it is now integrated into the product design and development process. The overwhelming question has rather become: does optimisation still make a difference?

In this writer's humble opinion, the answer is 'yes'. After a long career in engineering, both as a professor and as a businessman, I still regard optimisation as a truly revolutionary tool for inspiring innovation in the product design process.

To begin with, it represents a driving force for ‘out of the box’ solutions. By increasing exploration efficiency, advanced optimisation algorithms lead to designs that would otherwise remain hidden and that can save time and money. By moving the simulation phase to the beginning of the product life cycle, optimisation tools can reduce design and development time. The result is better coordination in product strategies and better planning.

The fascinating fact, though, is that it all comes down to the organisation, the team, and ultimately to the individual inclination. In other words, the whole optimisation approach requires a cultural shift in the engineering attitude and the confidence to let simulation-driven design become optimisation-driven.

But let us start at the beginning. Optimisation techniques have been used in engineering for decades – primarily to maximise a performance metric, or to minimise the cost of a product for a given performance. In the early 90s, a younger me, on a bus to the Von Karman Institute for Fluid Dynamics, was discussing the optimisation of a wing profile with a British Aerospace aerodynamics engineer. We reckoned that such a calculation would have required their entire Cray T3D parallel supercomputer. A few weeks later, a feasibility study of the approach was started; within about a month the parallel machine was saturated, but our theories were proven.

More recently, research into powerful algorithms has made it possible to encompass multiple opposing objectives within the same project. Targets such as increasing efficiency and durability while reducing weight and cost can be pursued simultaneously. The best compromises are identified along the so-called Pareto frontier, which represents the trade-offs between the objectives under consideration and gives the decision-maker a useful decision dashboard.
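The Pareto logic can be sketched in a few lines. This is a toy illustration, not any particular vendor's implementation: the two objectives (call them weight and cost, both to be minimised) and the candidate values below are entirely hypothetical.

```python
def dominates(a, b):
    """True if design a is at least as good as b on every objective
    and strictly better on at least one (both objectives minimised)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_frontier(designs):
    """Keep only the designs not dominated by any other design."""
    return [d for d in designs
            if not any(dominates(other, d) for other in designs if other != d)]

# Hypothetical (weight, cost) evaluations of five candidate designs.
candidates = [(2.0, 9.0), (3.0, 6.0), (5.0, 4.0), (6.0, 8.0), (8.0, 3.0)]
front = pareto_frontier(candidates)
```

Here the design (6.0, 8.0) drops out, since (3.0, 6.0) beats it on both objectives; the surviving points form the frontier of trade-offs from which a decision-maker would choose.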

Nowadays, we are in the middle of a multi-disciplinary integration shift. Process complexity is increasing as dispersed, sometimes global, engineering teams work together to improve product performance metrics. Different domain approaches and a large number of variables, constraints and objectives, related to different disciplines, all compete in the hunt for the best result. The solution comes from the integration of several powerful simulation tools and the automation of sophisticated workflows into a software platform, which can satisfy the need for cost-effective and repeatable design processes.

Such scenarios set a demanding challenge for R&D teams, but can turn into a great opportunity for companies willing to embrace the most innovative technologies and a new engineering philosophy. The design process becomes an iterative practice performed efficiently using technology, while the engineer concentrates on the decision making, based on trade-off solutions quantitatively determined or estimated with the software aid.

And that is only the beginning of the advantages arising from leveraging this powerful technology. On the IT side, the increased availability of distributed computational resources, offered by multi-core CPUs, HPC systems and high-speed interconnects, allows for ever more complex optimisation campaigns.

The next step up is moving optimisation to the product concept phase. By intervening at the earliest steps of the design process and evaluating the feasibility of certain configurations sooner, companies have the opportunity to boost their innovation assets and take product development to the next level.

Upfront optimisation becomes a strategic driver and helps shape the new design process: simulation, analysis, decision making, prototyping and testing are optimised to cut costs, but the real competitive advantage starts at the product concept level.

Exploring and evaluating configurations before competitors do is crucial for companies that manufacture complex products, while understanding key factors and variable dependencies ahead of time allows for a dramatic shortening of the design cycle, further lowering development costs.

Is that not enough? And if cost and time are not the only factors at play, can we really be sure we have identified the best possible solution?

Giving free rein to optimisation techniques and embracing them, starting from the product conception stage, opens up another substantial advantage: the capability of pinpointing solutions that are completely innovative and have not been considered before. The most advanced genetic and evolutionary algorithms allow the boundaries of research to be pushed even further by smartly exploring the design space and identifying configurations that a traditional approach would not acknowledge.
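The evolutionary idea can be sketched on a hypothetical one-variable problem (minimise f(x) = (x - 3)^2). Industrial optimisers use far richer encodings, crossover operators and constraint handling, but the basic loop of evaluating, selecting and mutating candidate designs is the same.

```python
import random

def f(x):
    """Hypothetical objective to minimise; the optimum is at x = 3."""
    return (x - 3.0) ** 2

random.seed(0)  # fixed seed so the run is repeatable

# Start from a random population of candidate designs.
population = [random.uniform(-10.0, 10.0) for _ in range(20)]

for generation in range(50):
    # Select the better half of the population (lower f is better)...
    population.sort(key=f)
    parents = population[:10]
    # ...and refill it with mutated copies of those parents.
    population = parents + [p + random.gauss(0.0, 0.5) for p in parents]

best = min(population, key=f)
```

Because the parents are carried over unchanged, the best design never gets worse, while the mutations keep probing the surrounding design space for something better, including regions a manual search might never visit.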

Further methods, like MORDO – multi-objective robust design optimisation – are able to add real-life uncertainties to the equation. Engineering design optimisation problems often have parameters with uncontrollable variations, calling for solutions that are as good as possible in terms of objectives and feasibility while remaining least sensitive to parameter variations. A robust design maintains its performance level or quality even when 'noise', simulating sampled and unpredictable external factors, is added to the process. MORDO keeps such uncertainties under control, ensuring the real-world effectiveness of the optimal solution.
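The robustness idea behind such methods can be sketched as follows. This is not ESTECO's MORDO implementation, only a toy in its spirit, and the response function is hypothetical: a design near x = 2 has higher nominal performance but sits on a narrow peak, while a design near x = 5 sits on a broad plateau. Sampling noise on the design variable reveals which one keeps its performance under real-life variation.

```python
import math
import random
import statistics

def response(x, noise=0.0):
    """Hypothetical performance: a sharp peak at x = 2, a flat plateau at x = 5."""
    x = x + noise
    return (10.0 * math.exp(-(x - 2.0) ** 2 / 0.1)
            + 8.0 * math.exp(-(x - 5.0) ** 2 / 4.0))

def robustness(x, sigma=0.3, samples=500):
    """Mean and spread of the response under Gaussian noise on x."""
    values = [response(x, random.gauss(0.0, sigma)) for _ in range(samples)]
    return statistics.mean(values), statistics.pstdev(values)

random.seed(1)  # fixed seed so the sampling is repeatable
peaky_mean, peaky_std = robustness(2.0)    # high nominal, noise-sensitive
robust_mean, robust_std = robustness(5.0)  # lower nominal, robust
```

In this toy example the nominally better design at x = 2 loses much of its average performance under noise and shows a far larger spread, while the plateau design keeps close to its nominal value; a robust optimiser would trade a little nominal performance for that stability.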

Certainly that younger me of the 1990s would have been surprised to see what simulation is capable of and what is achievable within the field of optimisation – but even now I still believe that optimisation has a long way to go.