Highly Automated Driving Demands Sensor Fusion

May 25, 2016 // By Uwe Westmeyer, Renesas Electronics Europe
ADAS (Advanced Driver Assistance System) functions are increasingly being implemented in motor vehicles, and some of them, such as adaptive cruise control and lane departure warning, can already be found in sub-premium-class vehicles. But this is just the beginning: over the next decade, highly automated driving will also become a reality.

Current implementations often rely on solutions in which one particular type of sensor provides the data for one specific function. More complex functions such as traffic-jam assistance, however, require the shared use of multiple sensors of different types and the associated links, so the relationship between sensors and functions is becoming multidimensional. Given the growing number of sensors and the ever more complex functions of a highly automated driving mode, a centralised environment model is needed as the common basis for the different ADAS functions. This model is built from large volumes of sensor data, and its assessment forms the basis for actions that intervene in road traffic in real time; it therefore requires hardware and software implementations offering high performance and operational safety combined with low power consumption.
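
What such a centralised environment model could look like in software can be sketched as a shared, time-stamped object list that all ADAS functions read from. The following C++ sketch is purely illustrative; the structure and field names are assumptions, not a description of a particular production stack.

```cpp
// Minimal sketch of a centralised environment model: one shared list of
// tracked objects, each carrying position, velocity, a confidence value
// and a bit mask of the sensor types that currently confirm it.
#include <cstdint>
#include <vector>

enum SensorType : std::uint8_t {
    SENSOR_CAMERA = 1 << 0,
    SENSOR_RADAR  = 1 << 1,
    SENSOR_LIDAR  = 1 << 2
};

struct TrackedObject {
    std::uint32_t id;             // stable track identifier
    float x_m, y_m;               // position in the vehicle frame, metres
    float vx_mps, vy_mps;         // velocity, metres per second
    float confidence;             // 0.0 .. 1.0, grows with confirmations
    std::uint8_t  confirmed_by;   // OR-ed SensorType flags
    std::uint64_t last_update_us; // timestamp of the latest measurement
};

// The centralised model that the different ADAS functions read from,
// instead of each function being wired to its own sensor.
struct EnvironmentModel {
    std::uint64_t model_time_us;        // common time base of the snapshot
    std::vector<TrackedObject> objects; // fused view of the surroundings
};
```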

To create a centralised environment model, different types of sensors are used that provide redundant and supplementary information, thereby reducing the dependence on individual sensors, for example on cameras that cannot achieve the specified detection rate or that report objects where none exist, so-called false-positive detections. Supplementary sensors in this case could be Radar or Lidar.
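
How redundant sensors can suppress camera false positives is illustrated by the following cross-confirmation sketch: a camera detection is only accepted if a radar detection lies within a gating distance. The data layout, function name and 2 m gate are assumptions chosen for illustration; a real fusion stack would derive the gate from the measurement uncertainties rather than use a fixed distance.

```cpp
// Accept a camera detection only if a radar (or lidar) measurement
// supports it; unsupported detections are treated as likely false positives.
#include <cmath>
#include <vector>

struct Detection {
    float x_m, y_m;   // position in the common vehicle frame, metres
};

bool confirmed_by_radar(const Detection& cam,
                        const std::vector<Detection>& radar,
                        float gate_m = 2.0f)
{
    for (const Detection& r : radar) {
        float dx = cam.x_m - r.x_m;
        float dy = cam.y_m - r.y_m;
        if (std::sqrt(dx * dx + dy * dy) <= gate_m)
            return true;   // redundant evidence found: keep the object
    }
    return false;          // no supporting measurement: likely false positive
}
```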

Merging these sensor data into a unified view requires massive computing power. Sensors such as cameras or Radar transfer data at bandwidths of 10 Mb/s to 40 Mb/s per sensor, which quickly adds up to around 500 Mb/s of net data in a comprehensive sensor system. This data must be carried over the vehicle network and collected by an ECU in real time. To do so, the ECU needs access to Gigabit Ethernet, including the corresponding switches, at automotive reliability and quality levels.
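
The arithmetic behind these figures can be reproduced with a short sketch that sums the net rates of an assumed sensor set and compares the total with a Gigabit Ethernet link; the sensor counts and per-sensor rates below are example values chosen only to arrive at roughly the 500 Mb/s mentioned above.

```cpp
// Link-budget check: aggregate net sensor data rate vs. Gigabit Ethernet.
#include <cstdio>

int main()
{
    // assumed sensor set: {count, net rate in Mb/s per sensor}
    struct { int count; double mbps; } sensors[] = {
        { 10, 40.0 },   // e.g. cameras at 40 Mb/s each
        { 10, 10.0 }    // e.g. radar sensors at 10 Mb/s each
    };

    double total_mbps = 0.0;
    for (const auto& s : sensors)
        total_mbps += s.count * s.mbps;

    const double link_mbps = 1000.0;  // Gigabit Ethernet, nominal
    std::printf("aggregate net rate: %.0f Mb/s (%.0f %% of a 1 Gb/s link)\n",
                total_mbps, 100.0 * total_mbps / link_mbps);
    return 0;
}
```

Even the net payload alone occupies half of a nominal 1 Gb/s link, before any protocol overhead or traffic peaks, which illustrates why Gigabit Ethernet and automotive-grade switches are required.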

For a correct representation of the surroundings, the ECU must not only receive the data in real time but also process it. Since the sensors may not detect all objects at the same time, the data must be chronologically synchronised with one another; at the same time, since each sensor is
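
One way to picture the chronological synchronisation is to interpolate each sensor's measurements to a common model timestamp, so that every sensor contributes a value that refers to the same instant. The linear interpolation and data layout in the following sketch are simplifying assumptions, not a description of a specific ECU implementation.

```cpp
// Interpolate a time-stamped sensor signal at the common model time.
#include <cstddef>
#include <cstdint>
#include <vector>

struct Sample {
    std::uint64_t t_us;   // sensor timestamp, microseconds
    float value;          // e.g. measured distance to an object, metres
};

// Assumes a non-empty buffer with strictly increasing timestamps.
float value_at(const std::vector<Sample>& samples, std::uint64_t t_us)
{
    if (t_us <= samples.front().t_us)
        return samples.front().value;         // model time before first sample
    for (std::size_t i = 1; i < samples.size(); ++i) {
        if (samples[i].t_us >= t_us) {
            const Sample& a = samples[i - 1];
            const Sample& b = samples[i];
            float w = float(t_us - a.t_us) / float(b.t_us - a.t_us);
            return a.value + w * (b.value - a.value);   // linear interpolation
        }
    }
    return samples.back().value;              // model time after newest sample
}
```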
