How sensor fusion impacts the automotive ecosystem

June 21, 2016 // By Christoph Hammerschmidt
As an indispensable technique for mastering the challenges of automated driving, sensor fusion requires a capability to process real-time sensor data on a scale hitherto unseen in the automotive industry. This situation offers opportunities for startups to provide suitable technologies – and for established players to acquire such startups, says market researcher IHS Technology.

The automotive ecosystem will soon be driven by high-performance solutions for automated driving, finds IHS senior analyst Akilesh Kona. The reason: to automate driving functions, a vehicle needs reliable information about its surroundings. This need for reliable information, however, translates into a need for different, redundant sensor types.


Typically, Advanced Driver Assistance Systems (ADAS) rely on a dynamic 360-degree live image of the vehicle's surroundings. This image is computed from the data of different sensors – cameras, radar, lidar and, to a lesser extent, ultrasonic sensors. Different sensor types are necessary because each type has its specific limitations. Cameras deliver poor images under low-light and unfavorable weather conditions. Radar sensors are affected far less by weather; however, they produce images of relatively poor quality and low resolution. Lidar sensors deliver much better-defined images, yet they too perform poorly in rain, snow or hail. Blending all these signals together yields reliable, high-definition images that are redundant enough to be used in safety-critical applications such as automated driving.
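
To make the blending idea concrete, the following Python sketch shows the simplest possible form of sensor fusion – inverse-variance weighting of independent range estimates. It is an illustration only, not taken from IHS or any named supplier; the sensor readings and noise figures are hypothetical and merely mirror the strengths and weaknesses listed above, whereas production systems use far more elaborate filters (Kalman or particle filters, for instance).

```python
import math

# Minimal sketch of sensor fusion by inverse-variance weighting.
# All sensor names, readings and noise figures below are hypothetical,
# chosen only to reflect the limitations described in the text.
measurements = {
    # sensor: (range estimate to an obstacle in metres, standard deviation in metres)
    "camera": (49.2, 3.0),   # sharp in good light, unreliable in darkness or bad weather
    "radar":  (50.6, 1.5),   # robust to weather, but coarse resolution
    "lidar":  (50.1, 0.5),   # well-defined returns, degraded by rain, snow or hail
}

def fuse(measurements):
    """Weight each independent estimate by 1/variance and combine."""
    weights = {name: 1.0 / sigma ** 2 for name, (_, sigma) in measurements.items()}
    total_weight = sum(weights.values())
    fused_value = sum(weights[name] * value
                      for name, (value, _) in measurements.items()) / total_weight
    fused_sigma = math.sqrt(1.0 / total_weight)
    return fused_value, fused_sigma

range_m, sigma_m = fuse(measurements)
print(f"fused range: {range_m:.2f} m +/- {sigma_m:.2f} m")
```

The point of the sketch is simply that the fused estimate is both closer to the truth and more confident than any single sensor's reading – which is exactly why redundant, dissimilar sensors are combined for safety-critical driving functions.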


This situation is favorable for startup companies, which can contribute their specific expertise. And since the established players wish to remain competitive, they have started to acquire the desired expertise through takeovers, notes IHS. Examples are General Motors acquiring self-driving technology company Cruise Automation, Delphi taking over Carnegie Mellon spin-off Ottomatika, and Dura Automotive Systems collaborating with Green Hills Software to develop sensor fusion modules for automated driving. The semiconductor industry, too, is striving to provide the high-performance computing solutions necessary for the demanding task of sensor fusion – examples are NXP's BlueBox and Mobileye's EyeQx platforms, in particular the latter's latest iteration, the EyeQ5. To some extent, chip vendors can draw on the expertise they have gained in similar designs for consumer markets.


Deep learning techniques and machine vision are regarded as a good way to solve these problems.
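
As a rough illustration of how machine vision could feed such a fusion pipeline, the sketch below runs a pretrained object detector over a single camera frame; the resulting boxes, classes and confidence scores are the kind of camera-side input a fusion stage would combine with radar and lidar tracks. The library choice (torchvision), model, file name and score threshold are assumptions made for illustration, not anything the article or IHS specifies.

```python
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# Illustrative only: a generic pretrained detector standing in for the
# vision component of a perception stack. Requires a recent torchvision.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

# "camera_frame.jpg" is a hypothetical file name for one camera image.
frame = to_tensor(Image.open("camera_frame.jpg").convert("RGB"))

with torch.no_grad():
    detections = model([frame])[0]   # dict with 'boxes', 'labels', 'scores'

for box, label, score in zip(detections["boxes"],
                             detections["labels"],
                             detections["scores"]):
    if float(score) > 0.5:           # keep confident detections only (threshold assumed)
        print(f"class {int(label)} at {box.tolist()} (score {float(score):.2f})")
```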