Multi-Sensor Data Fusion (MSDF) for Driverless Cars, An Essential Primer

Lance Eliot
20 min read · Apr 1, 2019

Dr. Lance B. Eliot, AI Insider

Multiple sensors on self-driving cars require proper sensor fusion

A crucial element of many AI systems is the capability to undertake Multi-Sensor Data Fusion (MSDF): collecting and then reconciling, harmonizing, integrating, and synthesizing the data about the surroundings and environment in which the AI system is operating. Simply stated, the sensors of the AI system are its eyes, ears, and other sensory inputs, while the AI must somehow interpret and assemble the sensory data into a cohesive and usable interpretation of the real world.

If the sensor fusion does a poor job of discerning what’s out there, the AI is essentially blind or misled toward making life-or-death algorithmic decisions. Furthermore, the sensor fusion needs to be performed on a timely basis.
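To make the reconciliation idea concrete, here is a minimal sketch of one common fusion technique: inverse-variance weighting, which combines several noisy estimates of the same quantity (say, the distance to an obstacle) so that more reliable sensors count for more. The sensor readings, variances, and the `fuse` helper below are illustrative assumptions for this sketch, not something specified in the article.

```python
def fuse(estimates):
    """Fuse a list of (value, variance) pairs from different sensors.

    Each sensor's weight is the inverse of its variance, so a
    low-noise sensor (e.g., radar for range) dominates a noisier
    one (e.g., a monocular camera's depth estimate).
    """
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    fused_value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    fused_variance = 1.0 / total  # fused estimate is tighter than any input
    return fused_value, fused_variance

# Hypothetical example: radar reads 10.0 m (variance 0.5),
# camera reads 10.4 m (variance 2.0).
value, variance = fuse([(10.0, 0.5), (10.4, 2.0)])
print(value, variance)
```

The fused estimate lands nearer the radar's reading because the radar is modeled as less noisy, and its variance is smaller than either sensor's alone, which is the basic payoff of fusing rather than picking one sensor. Full self-driving stacks use far richer machinery (e.g., Kalman filters that also handle timing and motion models), but the weighting intuition is the same.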

Humans do sensor fusion all the time, in our heads, though we often do not overtly put explicit thought towards our doing so. It just happens, naturally.

The other day, I was driving in the downtown Los Angeles area. There is always an abundance of traffic, including cars, bikes, motorcycles, scooters, and pedestrians that are prone to jaywalking. There is a lot to pay attention to. Is that bicyclist going to stay in the bike lane or decide to veer into the street?

Written by Lance Eliot

Dr. Lance B. Eliot is a renowned global expert on AI, a successful startup founder, a global CIO/CTO, and was a top executive at a major Venture Capital (VC) firm.