Making sense of ADAS

Mercedes takes autonomy to the roads of Germany

Jonathan Newell finds out how simulation is providing an answer to the challenge of testing the complex sensor arrangements associated with ADAS.

As vehicles become more connected and aware, with an expectation that they will react to their surroundings and eventually drive themselves, the automotive industry faces a growing challenge: ensuring its test regimes are robust enough to account for the escalating number of variables that such sensor proliferation represents, including the human behind the wheel.

Driver assistance

Advanced Driver Assistance Systems (ADAS) are already commonplace and industry bodies want them to become the norm for all new vehicles on the road. New car assessment bodies such as Euro NCAP are building ADAS into their roadmaps of future safety requirements that will become mandatory in order to achieve the coveted five-star assessment rating. One such system, Automatic Emergency Braking (AEB), has been part of the Euro NCAP test protocol since 2014.

The Australasian New Car Assessment Programme (ANCAP) has had Electronic Stability Control (ESC) as a mandatory requirement since 2011 and AEB since 2013. According to ANCAP requirements, without ESC, any car built since 2016 won’t even achieve a 1-star rating, such is the importance of the technology.

Most people who own a fairly modern car will already be familiar with ESC and AEB without necessarily understanding the complexity of the sensor technology behind them. In the very near future, other ADAS equipment will become similarly commonplace and similarly expected as standard.

Such systems include adaptive cruise control, blind spot monitoring, lane-keeping assist and automatic parking, all of which rely heavily on the reliable combination of multiple sensor systems.

From ADAS to autonomy

As sensors proliferate and ADAS becomes “joined up”, the possibility exists of the car taking more of the decision-making and control tasks away from the driver; that’s the dream. However, the reality demands far more complexity than anything that connected ADAS could possibly offer, and the major car manufacturers are in a frantic battle to be the first to offer something that is truly autonomous.

Amongst the leaders in developing autonomous vehicles are Volvo, Ford and Mercedes-Benz, each of which has access to rich infrastructure testing resources. Ford has the University of Michigan Mobility Transformation Centre, Volvo has Gothenburg city council on its side and Mercedes has similar government cooperation for the use of public infrastructure for testing.

For 2017, Mercedes has received permission from Stuttgart regional council to perform driverless vehicle testing on its roads. Such decisions aren’t made trivially, and the granting of permission to use public roads was based on the outcome of testing that the German car giant had been conducting since 2011, which resulted in the latest software platform for autonomous control, DAVOS (Daimler Autonomous Vehicle Operating System).

This powerful software employs high-powered graphics processing units and deep learning technology, the purpose of which is to make sense of all the inputs from cameras, radar, LiDAR, infrared and other sensors and to make robust decisions from them.

Coping with complexity

As complexity increases, the possible combinations of sensor conditions, failure modes and driver responses multiply alarmingly, placing a heightened burden on designers and test engineers to predict how the system will operate in all possible circumstances. With a potentially infinite set of possibilities, that becomes an impossible task.

Statistical analysis in the early design phases, software simulation and, ultimately, real-world hardware testing all play a role in validating such complex designs, and Hardware-in-the-Loop (HIL) simulation has now become prominent in achieving this with reduced lead times.

To understand more about simulation in ADAS and autonomous vehicle testing, I spoke to Ansible Motion’s Technical Liaison, Phil Morse.

According to an article Morse wrote recently, Driver-in-the-Loop (DIL) simulators provide a means for developers to take account of certain safety aspects of the design and the Human Machine Interface (HMI) requirements. Vehicles at all levels of autonomy still have a human at the wheel, and the HMI is all-important in passing control to and from the on-board computers and for regaining control in critical circumstances such as sensor failures.

Rather than waiting until final hardware testing when a human can be placed at the controls to experience the safety envelope of the car on test circuits, DIL simulation enables both human and vehicle responses to be studied in extreme situations. This offers a safer environment for the tester as well as providing the ability to test a wider scope of failure modes.

I asked Morse how sensor reliability is built into the overall approach to advanced driver assistance testing. He explained that “virtual cars” can be constructed, taking the form of DIL simulators in which large amounts of data are pushed around.

“A whole chain of events requires reliability from measurement collection to the fusion of these sensor outputs with other measurements, to data gathering and monitoring. A number of things can corrupt this process, such as electrical noise or physical faults, and things will inevitably go wrong at times,” Morse said. “Robust systems are developed by gathering and fusing data from multiple, simultaneous sources, using logic to set confidence thresholds, logging and/or alerting when things do not look quite right and, perhaps most importantly, including failsafe redundancies, so that when faults do occur, there are appropriate safety layers in place.”
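
By way of illustration only, the short Python sketch below captures the pattern Morse describes: fusing redundant range readings from multiple sources, applying a confidence threshold, logging when a sensor disagrees with the consensus, and falling back to a failsafe layer when trust collapses. The sensor names, threshold values and toy readings are assumptions made for this example, not details from Ansible Motion or any manufacturer.

```python
# A minimal, hypothetical sketch of multi-sensor fusion with confidence
# thresholds, disagreement logging and a failsafe fallback. All names and
# numbers are illustrative assumptions, not production values.
import logging
from dataclasses import dataclass
from statistics import median
from typing import List, Optional

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("fusion")

@dataclass
class Reading:
    sensor: str        # e.g. "radar", "lidar", "camera"
    range_m: float     # measured distance to the object ahead, in metres
    confidence: float  # 0.0 (no trust) to 1.0 (full trust)

CONFIDENCE_THRESHOLD = 0.5   # assumed cut-off for trusting a sensor
DISAGREEMENT_LIMIT_M = 2.0   # flag readings this far from the consensus

def fuse_range(readings: List[Reading]) -> Optional[float]:
    """Return a confidence-weighted range, or None to trigger the failsafe."""
    trusted = [r for r in readings if r.confidence >= CONFIDENCE_THRESHOLD]
    if not trusted:
        log.error("No trusted sensors - handing over to failsafe layer")
        return None  # the caller applies a safe fallback behaviour

    consensus = median(r.range_m for r in trusted)
    for r in trusted:
        if abs(r.range_m - consensus) > DISAGREEMENT_LIMIT_M:
            log.warning("%s disagrees with consensus: %.1f m vs %.1f m",
                        r.sensor, r.range_m, consensus)

    total_weight = sum(r.confidence for r in trusted)
    return sum(r.range_m * r.confidence for r in trusted) / total_weight

# Example: radar and LiDAR broadly agree, a noisy camera estimate gets flagged.
readings = [Reading("radar", 32.1, 0.9),
            Reading("lidar", 31.8, 0.8),
            Reading("camera", 40.5, 0.6)]
print(f"Fused range: {fuse_range(readings):.1f} m")
```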

The design of the testing regime is complex, and as implementation and inter-system complexities increase, there comes a point where physical testing cannot cover all possible combinations. So, typically, the approach is to employ statistical methodologies such as Design of Experiments (DOE) to predict cause-and-effect relationships for relevant factors and variables, as the sketch below illustrates.
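
To make the statistical idea concrete, here is a small, purely illustrative Python sketch. It builds a two-level full-factorial test matrix from a handful of assumed factors (the factor names and levels are invented for this example) and then keeps only a half fraction, which is the basic DOE trick for shrinking an unmanageable run count while still exposing the main cause-and-effect relationships.

```python
# Illustrative DOE-style test matrix: enumerate two-level combinations of
# assumed factors, then keep a half fraction so the physical programme
# stays tractable. Factors and levels are hypothetical.
from itertools import product

factors = {
    "speed_kph":      (30, 110),
    "rain":           (False, True),
    "sensor_dropout": (False, True),
    "driver_reacts":  (False, True),
}

names = list(factors)
full_factorial = list(product(*factors.values()))   # 2**4 = 16 runs

def coding(level, pair):
    """Map a factor level to the usual -1 (low) / +1 (high) coding."""
    return -1 if level == pair[0] else 1

# Half fraction (defining relation I = ABCD): keep runs whose codings
# multiply to +1, halving the run count while preserving main effects.
half_fraction = []
for run in full_factorial:
    codes = [coding(level, factors[name]) for name, level in zip(names, run)]
    sign = 1
    for c in codes:
        sign *= c
    if sign == 1:
        half_fraction.append(dict(zip(names, run)))

print(f"Full factorial: {len(full_factorial)} runs, "
      f"half fraction: {len(half_fraction)} runs")
for run in half_fraction:
    print(run)
```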

“Off-line simulation is used to create maps and response surfaces that describe complex behaviours. Then a set of carefully selected “checkpoints” on these maps can be confirmed by physical testing. This level of testing is somewhere south of infinity so it can actually be covered in the available time,” continued Morse.
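
As a concrete (and entirely invented) illustration of that approach, the sketch below fits a quadratic response surface to a batch of off-line simulation results for stopping distance against speed and road grip, then evaluates a few selected checkpoints that could be confirmed by physical testing. The toy braking model and the checkpoint values are assumptions for the example only.

```python
# Hypothetical illustration of "maps and response surfaces": fit a quadratic
# surface to off-line simulation output, then pick a handful of checkpoints
# for physical confirmation instead of re-running the whole sweep on track.
import numpy as np

rng = np.random.default_rng(0)

# Pretend off-line simulation output: stopping distance vs speed and grip.
speed = rng.uniform(20, 120, 200)       # km/h
grip = rng.uniform(0.3, 1.0, 200)       # road friction coefficient
stop_dist = speed**2 / (254 * grip) + rng.normal(0, 0.5, 200)  # toy model

# Quadratic response surface: d ~ b0 + b1*v + b2*mu + b3*v^2 + b4*mu^2 + b5*v*mu
X = np.column_stack([np.ones_like(speed), speed, grip,
                     speed**2, grip**2, speed * grip])
coeffs, *_ = np.linalg.lstsq(X, stop_dist, rcond=None)

def predict(v, mu):
    """Evaluate the fitted surface at a chosen speed and grip level."""
    x = np.array([1.0, v, mu, v**2, mu**2, v * mu])
    return float(x @ coeffs)

# A few carefully selected checkpoints to confirm on the proving ground.
checkpoints = [(50, 0.9), (100, 0.9), (100, 0.4)]
for v, mu in checkpoints:
    print(f"{v} km/h, grip {mu}: predicted stopping distance {predict(v, mu):.1f} m")
```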

DIL simulation in context

DIL simulation is one part of a whole series of tests and simulation exercises that manufacturers need to go through to assure their products, and it forms one stage in a progressive series of steps that move ever closer to real-world conditions.

As Morse explained, there are usually three simultaneously occurring layers of testing. The first is to connect actual hardware components and production-deployable computer code with simulations where possible. This is HIL and Software-in-the-Loop (SIL) simulation. The next is to connect real test drivers with off-line, HIL, and/or SIL simulation. This is DIL simulation. The third step is to connect real test drivers with real pre-production full prototype vehicles – everything-in-the-loop.

“The aim of DIL simulation is to get real people connected with imagined systems early and often in the vehicle development process,” he said.

Looking to the future, DIL simulation will bring a vital competitive edge to the manufacturers that use it in the real world of vehicle production.

“DIL simulator experiments can become extremely valuable – not because the virtual test driving eliminates the need to perform the physical testing completely – but because testing in the DIL lab enables its users to eliminate all the usual what-ifs and time-eaters that would otherwise consume tight testing windows,” concluded Morse.

Jonathan Newell

About Jonathan Newell

Jonathan Newell is a graduate of Loughborough University and has three decades of experience in engineering as well as broadcast and technical journalism.