Simulation through the years

Bruce Klimpke, technical director of Integrated Engineering Software, looks at 25 years of simulation software

It is 25 years since we produced the first commercial software using the Boundary Element Method (BEM) to simulate electric fields. It was to revolutionise the way Maxwell’s equations are applied in physics and electrical engineering.

Computing power then meant a very expensive PC: an Intel 8088 processor with an 8087 coprocessor, a monitor, 640KB of memory, a 10MB hard disk and a special graphics card. Solving real-world problems was a major challenge.

With the advent of the mouse and colour monitors, it became possible to create two-dimensional geometry interactively, as the information the computer needed could be displayed and verified on screen. Data for finite element or boundary element simulations could now be created and checked with a radical reduction in possible errors. Creating geometry, and the meshes associated with it, no longer required viewing reams of numbers in the hope of spotting where a problem might be.

Extreme care was required in terms of memory management. Within the total memory space, the operating system, graphics libraries, code and all data space had to coexist. To solve practical problems, the executable code and data had to be carefully overlaid to minimise swapping between the hard disk and core memory. Developers would literally count the number of bytes needed for each small piece of data to minimise the memory requirement. With everything optimised, the very first commercial codes were able to solve problems with about 200 boundary elements, or around 10,000 linear finite elements, under DOS.

Even within these limitations, the benefits of simulation using software packages were immediately obvious as designers ‘saw’ the electric fields their design would produce. This was revolutionary, leading to design insights only dreamed of. Many assumptions about how a device worked were proved incorrect. Without a view of the actual electric fields, these invalid assumptions were understandable, because measurable quantities such as terminal voltage or force appeared to substantiate them.

A whole new industry was born and, although limited computing power still required designers to build and test devices, the seed was sown for that reliance to diminish. Soon early adopters would not even consider design without simulation.

Following the success of simulating electric fields using BEM, other solvers followed, such as a 2D magnetic field solver, addressing quite different markets. Thousands of devices use magnetic fields to transfer energy, forces or torques, and to pass information: electric motors, sensors, NMR machines and solenoids among them.

Any comparison of the BEM with the more widely used Finite Element Method (FEM) starts from the fact that, for electric and magnetic fields, the governing equations to be solved are Maxwell’s. These can be expressed in integral or differential form; ultimately, both approaches give the same answer, but the numerical methods required to solve them are radically different. The differential form of Maxwell’s equations is associated with FEM and the Finite Difference (FD) method; the integral form is associated with BEM, or the equivalent Method of Moments (MoM). The best method is the one that gives the most accurate answers with the least computing and interaction time.
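As a concrete illustration of the two forms, take Gauss’s law, one of Maxwell’s equations. The differential form relates the field at a point to the local charge density, while the equivalent integral form relates the flux through a closed surface to the total enclosed charge:

```latex
\nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_0}
\qquad\Longleftrightarrow\qquad
\oint_{S} \mathbf{E} \cdot d\mathbf{A} = \frac{Q_{\text{enc}}}{\varepsilon_0}
```

FEM and FD discretise the left-hand, point-wise statement over the whole volume; BEM and MoM discretise the right-hand, surface-based statement, which is why BEM meshes only the boundaries of a problem.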

By the early 1990s, processor speeds were rapidly increasing, 3D simulation became feasible and we introduced our first 3D boundary element solver. In parallel with the development of CAE software came the rapid development of geometric modelling software. The first truly successful program, AutoCAD, enabled designers to create two-dimensional drawings. Geometry standards such as IGES and STEP followed, and then 3D geometry modelling tools, so geometry data could be passed to everyone in product design and production. This highlighted the importance of the link between CAD modelling tools and FEM/BEM simulation tools: once the geometry has been created, the simulation tools can reuse it. Another major shift was the demise of DOS and its replacement by Windows.

Through the 1990s, clock speeds continued to increase with the Pentium line of processors. For two-dimensional problems, hundreds of solutions could be obtained in 48 hours. Designers now used a significant number of parametric variables in combination with optimisation algorithms to reach an optimal design, significantly reducing the ‘over-engineering’ of a product. They could determine whether removing material would compromise the integrity of a design, enabling them to maximise performance while minimising costs.
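The workflow described here, sweeping a parametric variable and keeping the lightest design that still passes an integrity check, can be sketched in a few lines. Everything below is illustrative: `max_stress` is a stand-in for a real FEM/BEM stress solve, and the 100/t relation and 50 MPa limit are invented for the example.

```python
# Hypothetical parametric study: thin a bracket until the stress limit binds.
def max_stress(thickness_mm):
    """Mock solver: peak stress (MPa) falls as material is added.
    A real design loop would run an FEM/BEM simulation here."""
    return 100.0 / thickness_mm

STRESS_LIMIT = 50.0  # MPa, hypothetical integrity constraint

# Sweep candidate thicknesses (the parametric variable) and keep the
# lightest design that still satisfies the constraint.
candidates = [1.0 + 0.25 * k for k in range(13)]        # 1.00 mm .. 4.00 mm
feasible = [t for t in candidates if max_stress(t) <= STRESS_LIMIT]
best = min(feasible)                                     # thinnest passing design
```

Here the constraint is active at the optimum: any thinner and the mock stress exceeds the limit, which is exactly the ‘remove material until integrity is threatened’ trade-off the text describes.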

As algorithms continued to advance, coupling different areas of physics into one simulation became commonplace.

Now the rapid increase in clock speeds has stopped, and the latest PCs are equipped with multi-core processors such as quad-core chips. This new parallel-processing ability has revolutionised the speed of the boundary element method: provided the code and data structures are organised properly, boundary element code is almost completely scalable, so on a machine with 64 processor cores it can run almost 64 times faster than on a single core. In the future, the level of sophistication available for simulation will surpass anything available today.
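The reason BEM parallelises so well is that each row of its dense influence matrix depends only on one collocation point, so rows can be farmed out across cores with no communication between them. The sketch below illustrates this row-by-row independence with a toy 1/r Laplace-style kernel; the kernel, the regularised self-term and the helper names are assumptions for illustration, not the actual commercial code.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def influence_row(i, points):
    """One row of a dense BEM influence matrix: the effect of every
    boundary element on collocation point i (toy 1/r kernel)."""
    r = np.linalg.norm(points - points[i], axis=1)
    safe = np.where(r > 0, r, 1.0)           # avoid divide-by-zero on the self-term
    return np.where(r > 0, 1.0 / safe, 1.0)  # crude regularised diagonal entry

rng = np.random.default_rng(0)
points = rng.random((200, 3))                # 200 collocation points on a boundary

# Each row is independent of every other, so the map below scales
# across cores with no shared state to synchronise.
with ThreadPoolExecutor() as pool:
    rows = list(pool.map(lambda i: influence_row(i, points), range(len(points))))
A = np.vstack(rows)
```

Because no row reads another row’s results, the same pattern works unchanged with a process pool or an MPI rank per block of rows, which is what makes near-linear speed-up plausible for the assembly phase.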

With simulation software, the design process has been radically altered. The insight gained by seeing the electric, magnetic, thermal, or stress fields cannot be matched. With today’s demands for producing high quality products at affordable prices with a fast time to market, simulation tools are indispensable.

