Pixel Club: Understanding Scene Semantics from Vehicles

Idan Geller and Kobi Bentolila (Mobileye)
Tuesday, 20.11.2018, 11:30
Room 337 Taub Bld.

Going from driving assistance to autonomous driving requires a deeper understanding of the vehicle's surroundings. Driving-assistance systems provide technological solutions that support the driving process; while they usually offer multiple features that enhance driving safety (such as automated lighting, adaptive cruise control, and collision avoidance), they remain simplistic in their level of scene understanding. Autonomous driving raises much harder problems: Is the detected person a police officer signaling us to stop? Is an ambulance signaling our car to move aside? Is the detected person talking on the phone while trying to cross the road? Common tasks, such as vehicle detection, enjoy large datasets that make it easier to achieve top recognition performance. Unfortunately, this is not the case for the recognition challenges mentioned above.

In this lecture, we will present Mobileye's new algorithmic group, the first to operate in Haifa. We will discuss its broad scope of algorithmic problems, and then focus on one aspect of our work: innovative ways to prevent overfitting when training on a moderate amount of data (e.g., structuring and using intermediate classifiers, using heat maps as intermediate layers, and self-performance evaluation by a neural network).
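The abstract only names the overfitting-prevention techniques; it does not describe how they are implemented. One common way to realize intermediate classifiers is deep supervision, where auxiliary classifier heads attached to intermediate layers contribute weighted loss terms alongside the final head. The sketch below is a minimal, framework-free illustration of that loss combination; the function names, the weighting scheme, and the toy probabilities are assumptions for illustration, not Mobileye's actual method.

```python
import numpy as np

def cross_entropy(probs, label):
    # Negative log-likelihood of the true class,
    # given an already-softmaxed probability vector.
    return -np.log(probs[label])

def deeply_supervised_loss(head_probs, label, aux_weight=0.3):
    """Combine the final head's loss with losses from intermediate
    classifier heads (deep supervision).

    head_probs: list of softmax outputs, one per head, final head last.
    aux_weight: hypothetical down-weighting of the auxiliary terms.
    """
    final_loss = cross_entropy(head_probs[-1], label)
    aux_loss = sum(cross_entropy(p, label) for p in head_probs[:-1])
    return final_loss + aux_weight * aux_loss

# Toy example: one intermediate head and one final head on a
# two-class problem, true label 0.
heads = [np.array([0.5, 0.5]),   # intermediate head, less confident
         np.array([0.7, 0.3])]   # final head
loss = deeply_supervised_loss(heads, label=0)
```

The auxiliary terms push intermediate features toward being discriminative on their own, which acts as a regularizer when training data is limited.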
