Technology innovation drives change in automotive engineering
The automotive industry is changing at an unprecedented pace. The production and sale of electric vehicles (EVs), for instance, have grown significantly over the last five to ten years. According to Virta, a provider of EV charging infrastructure, there were a little more than one million EVs on the roads globally in 2015; by 2021, that figure had jumped to approximately 16 million (1). To improve the safety and comfort of their customers, carmakers are also working to automate their vehicles.
Three main factors are driving these changes. First, regulators worldwide are defining more stringent targets for greenhouse-gas emissions. The European Union (EU) has, for instance, unveiled its Fit for 55 programme, which seeks to align climate, energy, land use, transport and taxation policies to reduce net greenhouse-gas emissions by at least 55 per cent by 2030. In the USA, meanwhile, the Biden administration wants sales of EVs to account for 50 per cent of total vehicle sales by 2030. Second, consumers are becoming increasingly open to more sustainable alternatives to traditional car travel. Finally, according to consultant McKinsey & Co, companies working to electrify, connect and automate driving technology have attracted more than $400 billion in investments over the last decade – with about $100 billion of that coming since the beginning of 2020 (2). This investment is accelerating the pace of change.
The Director of Business Development at Altair, Warren Dias, recalls a discussion he recently had with engineers at a European original equipment manufacturer (OEM), who told him that “the level of intensity that they are seeing around the development of EVs in the past five years is probably comparable to the last 50 years of the research and development of cars with internal combustion engines.”
Gilles Gallee, Technology Evangelist for Autonomous Vehicle Simulation Solutions at Ansys, adds that these drivers, and others, are creating significant disruption in the automotive industry. He continues: “New technologies, safety regulations and innovations in energy are making it possible for newcomers, such as Tesla or the Chinese original equipment manufacturers [OEMs], to arrive in the market. It is also creating opportunities for massive mergers in the industry and strong partnerships between automotive and electronics specialists.”
All of this creates significant challenges for engineers working in the automotive industry. They must, for instance, adapt the designs of their vehicles to accommodate electrified powertrains. Dias characterises this change as evolution, rather than revolution. He explains: “Engineers have a lot of historical data regarding the performance of a given vehicle architecture that is baked into their design process. To change completely puts them into unfamiliar territory, so they are not as open to making these dramatic changes.”
Simulation therefore plays a key role in smoothing this transition to new vehicle architectures. Dias explains: “Engineers are asking themselves: ‘If I have a battery pack in there, how can it aid in the torsional or lateral stiffness of the vehicle? How can it contribute to the structural performance of the vehicle?’ There is no legacy data to go on.”
Altair has developed simulation tools that it claims can make this process easier. Dias continues: “We rely very heavily on our optimisation technology, which has been in our portfolio for almost 25 years now, and we have strung together a process that allows engineers to run very early concept studies for vehicle architecture. This enables them to use the same physics to come up with a design, and the validation process therefore becomes much more simplified and streamlined.”
The design of the electric powertrains themselves also throws up a vast number of variables. Altair is working to develop simulation tools to solve the multiphysics interactions that occur between batteries, motors and other power electronics. Using these tools, Dias says, “users can run, for example, an electromagnetic simulation, or map loads onto the structures to look at the stresses, or look at acoustics and see what the powertrain sounds like. A fluid simulation can be conducted to see how well the heat that is generated by the motor is dissipated.”
To aid its development of such technologies, Altair purchased Powersim, a provider of simulation and design tools for power electronics, in March 2022. Among other functions, Powersim’s PSIM software provides a set of design-verification tools, including Monte Carlo, sensitivity and fault analyses, to assist in the design failure mode and effect analysis (DFMEA) of power converters. The company claims that these analyses are easy to set up, and the results can help engineers evaluate and improve the performance and reliability of their designs.
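The Monte Carlo idea behind such design-verification tools can be illustrated in a few lines of generic Python (this is not PSIM's actual workflow or API; the regulator topology, reference voltage and resistor values below are hypothetical):

```python
import random

V_REF = 0.8                     # reference voltage of a hypothetical regulator IC (volts)
R1_NOM, R2_NOM = 52.5e3, 10e3   # nominal feedback-divider values for a 5.0 V output

def regulated_output(r1, r2):
    # Standard feedback-divider relation: V_out = V_ref * (1 + R1/R2)
    return V_REF * (1.0 + r1 / r2)

def monte_carlo_yield(n=100_000, tol=0.01, spec=(4.95, 5.05), seed=1):
    # Draw each resistor uniformly within its +/-1% tolerance band and
    # count how often the output lands inside the spec window.
    rng = random.Random(seed)
    in_spec = 0
    for _ in range(n):
        r1 = R1_NOM * (1 + rng.uniform(-tol, tol))
        r2 = R2_NOM * (1 + rng.uniform(-tol, tol))
        if spec[0] <= regulated_output(r1, r2) <= spec[1]:
            in_spec += 1
    return in_spec / n

print(f"Estimated yield: {monte_carlo_yield():.3f}")
```

Sweeping the tolerance (`tol`) or spec window in the same way gives the kind of sensitivity picture that feeds a DFMEA: it shows which component variations dominate the risk of an out-of-spec output.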
The development of EVs is challenging, but the technology is maturing rapidly. The technical hurdles that must be overcome to create fully automated vehicles (AVs), however, are still significant. Gallee says that AVs are being developed and brought to market in two different ways. The traditional carmakers propose a “progressive deployment of autonomous driving [AD] functions, from Level 2+ to Level 3 and more [see boxout], based on new electronic vehicle platforms. For the OEMs, safety is a fundamental pillar and they must provide an autonomous driving mode for every conceivable condition of use. The second path is more disruptive and addresses robotaxis and smart mobility. These newcomers and information technology [IT] giants are directly targeting self-driving cars at Level 5.”
According to Gallee, the first challenge presented to developers of AVs is technical. High-level AVs will rely on myriad sensors, actuators, complex algorithms, machine-learning systems and powerful processors to execute software. They create and maintain a map of their surroundings based on a variety of sensors situated in different parts of the vehicle. Radar sensors monitor the position of nearby vehicles, while video cameras detect traffic lights, read road signs, track other vehicles and look for pedestrians. Light detection and ranging (LiDAR) sensors emit pulses of light that bounce off the car’s surroundings to measure distances, detect road edges and identify lane markings. Ultrasonic sensors in the wheels detect kerbs and other vehicles when parking.
Sophisticated software then processes all of these inputs, plots a path, and sends instructions to the car’s actuators, which control acceleration, braking and steering. Hard-coded rules, obstacle-avoidance algorithms, predictive modelling and object recognition help the software follow traffic rules and navigate obstacles.
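A toy sketch of one step in that pipeline, fusing range estimates from two sensors before a braking decision, might look like the following (the sensor readings, variances and threshold are all invented for illustration; a real stack uses far more sophisticated filtering, such as Kalman filters over many states):

```python
def fuse_estimates(measurements):
    # measurements: list of (value, variance) pairs from different sensors.
    # Inverse-variance weighting: more certain sensors get more weight.
    weights = [1.0 / var for _, var in measurements]
    fused = sum(w * v for (v, _), w in zip(measurements, weights)) / sum(weights)
    fused_var = 1.0 / sum(weights)
    return fused, fused_var

# Hypothetical readings in metres: radar is noisier on range, lidar tighter.
radar = (24.8, 0.50)
lidar = (25.1, 0.05)
distance, variance = fuse_estimates([radar, lidar])

# A crude rule standing in for the planner's output to the actuators.
BRAKE_DISTANCE_M = 30.0
command = "BRAKE" if distance < BRAKE_DISTANCE_M else "CRUISE"
print(f"fused distance: {distance:.2f} m -> {command}")
```

The fused estimate sits closer to the lidar reading because its variance is ten times smaller, which is exactly why sensor redundancy matters: no single modality has to be trusted on its own.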
The reliability and safety of each of these systems must be evaluated extensively for them to be deployed safely on the road. Toyota Motors President Akio Toyoda has said: “Total autonomy will only be 100% accident-free by testing a minimum of 14.2 billion kilometres, which, in practical terms, would take decades of real-world driving.” Carmakers cannot afford to wait decades for this data, so they are turning to simulations to test and improve their systems. AI-based simulations shorten the testing period by running thousands of different scenarios simultaneously. These simulations also improve safety and highlight issues before they cause problems on a real street.
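The scale of that figure is easy to check with back-of-the-envelope arithmetic (the fleet size and annual mileage below are illustrative assumptions, not industry data):

```python
KM_REQUIRED = 14.2e9            # Toyoda's figure for total test kilometres
FLEET_SIZE = 1_000              # hypothetical dedicated test fleet
KM_PER_CAR_PER_YEAR = 100_000   # hypothetical near-continuous operation

years = KM_REQUIRED / (FLEET_SIZE * KM_PER_CAR_PER_YEAR)
print(f"Real-world driving required: {years:.0f} years")
```

Even a thousand-car fleet driven hard around the clock would need well over a century, which is why virtual kilometres are the only practical route to that volume of testing.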
Gallee says: “Simulation for autonomous driving testing is extremely complex. We need to simulate millions of scenarios with the objective to quickly identify failures and quantify the probability of failure of the overall system.” One obstacle to launching a fully autonomous vehicle, for instance, is achieving 100% confidence in the data-gathering, object-detection and decision-making processes of its systems.
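Quantifying a probability of failure from large batches of simulated scenarios is, at its core, a statistical estimation problem. A minimal sketch of the idea, with a toy stand-in for the driving simulator (the scenario function and its failure rate are invented; real pipelines also use rare-event techniques such as importance sampling because failures are so infrequent):

```python
import math
import random

def estimate_failure_probability(run_scenario, n, seed=0):
    # Run n randomised scenarios and estimate P(failure) with a
    # normal-approximation 95% confidence interval.
    rng = random.Random(seed)
    failures = sum(run_scenario(rng) for _ in range(n))
    p = failures / n
    half = 1.96 * math.sqrt(p * (1 - p) / n) if n else 0.0
    return p, (max(0.0, p - half), min(1.0, p + half))

def toy_scenario(rng, true_p=0.001):
    # Stand-in for a full driving simulation: the scenario "fails"
    # (e.g. a missed detection) with a small true probability.
    return rng.random() < true_p

p_hat, ci = estimate_failure_probability(toy_scenario, n=200_000)
print(f"P(failure) ~= {p_hat:.4f}, 95% CI {ci}")
```

The width of the confidence interval shrinks only with the square root of the scenario count, which is one reason millions of cloud-executed scenarios are needed to make credible statements about very small failure probabilities.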
Regardless of the ambient lighting conditions or the weather, cameras must reliably detect pedestrians and other physical objects, and trigger an appropriate reaction from critical systems such as braking. This is especially challenging in rainy, foggy and snowy conditions, which can confuse visual cameras and can affect the performance of LiDAR, radar and other conventional sensor technologies.
Automotive companies can use weather laboratories to create scenarios in controlled environments and on-road testing to evaluate the impact of weather. While a weather laboratory can provide repeatable weather data, it cannot simulate the effects of other vehicles or dynamic conditions on the road. On-road testing exposes autonomous systems to real weather conditions, but it is painfully slow.
Ansys is working to develop an effective alternative to physical testing by coupling computational fluid dynamics (CFD) and optical simulation techniques. It says that CFD-optical solutions provide an effective way of designing and optimising hardware like sensors, together with the embedded software that controls these sensors. Because simulation can be done early in the design process, it can save time by detecting problems that are harder to solve later, when most of the design has been completed.
Ansys Fluent can be used to perform CFD simulations of various weather conditions, including wind, rain, fog, snow and dust. Additionally, weather-induced sensor soiling, droplet impingement and transition to film flows, fogging and surface condensation, frosting, icing and de-icing phenomena can also be analysed using Fluent. The result of CFD simulations is the high-fidelity, reproducible generation of weather data for optical simulations.
CFD can also help analysts assess the weather’s impact on optical sensors and improve their design, performance, packaging and placement on the vehicle. Companies can study sensor layout virtually, searching for the most efficient sensor mix to improve the performance of autonomous vehicle sensors in adverse weather conditions.
The second challenge for developers of AVs concerns regulation, according to Gallee. The homologation of automated and connected vehicles according to global regulations is essential for their safe and reliable development and deployment around the world. Existing regulatory safety frameworks applicable to conventional vehicles and their components, however, are insufficient to fully assess the operational characteristics of current and future AV technologies. How, for instance, can regulation take into account the frequent software updates that AVs will likely require? Gallee says: “It is not the driver’s job to test the new Beta version of an automatic software pilot when it is released. Simulation and virtual testing will be required to enable the on-the-fly homologation of a new version of automated driving software.”
Ansys is pulling all of this work together through a partnership with the BMW Group to create an end-to-end tool chain specifically guided by safety principles to develop and validate advanced driver-assistance systems (ADAS) and automated/autonomous driving functions. Through this collaboration, the BMW Group hopes to be one of the first automotive manufacturers to offer Level 3 automated driving to consumers. The BMW Group will use Ansys software solutions as part of its tool chain to define test plans, pilot its execution, and gather and compile data-critical system information. Ansys says that, using specific algorithms, the software efficiently and automatically searches for the most robust design configuration to help make critical decisions early in the design process, reducing development time and overall project costs.
Gallee says: “Ansys is helping customers by providing the major building blocks of a persuasive toolchain for the validation of AV software stacks. This includes dealing with millions of simulated scenarios in the cloud and providing the continuous measurement and traceability of safety data throughout the development process, up to the homologation stages.”
As we have seen, the automotive industry is in a period of significant disruption. If there is one constant, however, it will be the reliance of automotive engineers on advanced simulation technologies to help them navigate the road ahead.
There are six distinct levels of automation and, as the levels increase, the extent to which the car can drive itself increases. The levels are:
• Level 0, where the human driver is always in complete control;
• Level 1, where the vehicle’s advanced driver-assistance system (ADAS) can support the human driver with either steering or accelerating and braking;
• Level 2, where the ADAS can oversee steering, and accelerating and braking in some conditions, although the human driver must pay complete attention throughout the journey, while also performing the remainder of the necessary tasks;
• Level 3, where the automated driving system (ADS) can perform all parts of the driving task in certain conditions, but the human driver must be able to regain control when requested to do so. In other conditions, the human driver executes the necessary tasks;
• Level 4, where the vehicle’s ADS can perform all driving tasks independently in certain conditions, during which human attention is not required;
• Level 5, where the vehicle’s ADS can perform all tasks in all conditions, and no assistance is required from the human driver. This full automation will be enabled by the application of 5G technology, allowing vehicles to communicate not just with one another, but also with traffic lights, signage and even the roads themselves.
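The taxonomy above lends itself to a small lookup, sketched here in Python (the class and function names are illustrative, not part of any standard library; the supervision rule follows the level descriptions above):

```python
from enum import IntEnum

class SAELevel(IntEnum):
    # The six SAE levels of driving automation described above.
    NO_AUTOMATION = 0
    DRIVER_ASSISTANCE = 1
    PARTIAL_AUTOMATION = 2
    CONDITIONAL_AUTOMATION = 3
    HIGH_AUTOMATION = 4
    FULL_AUTOMATION = 5

def driver_must_supervise(level: SAELevel) -> bool:
    # Up to Level 2 the human must monitor the road continuously; at
    # Level 3 they need only be ready to retake control on request; at
    # Levels 4-5 no supervision is needed within the system's
    # operating conditions.
    return level <= SAELevel.PARTIAL_AUTOMATION

print(driver_must_supervise(SAELevel.CONDITIONAL_AUTOMATION))
```

The key boundary for carmakers, as the article notes, sits between Levels 2 and 3: it is the point at which responsibility for monitoring the road shifts from the human to the system.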