Powering Driverless Cars with Data Annotation

April 06, 2021

Today, a majority of automobile manufacturers (Tesla, GM, Ford, BMW, Toyota, PSA, Renault-Nissan) and companies such as Waymo, Uber, and NuTonomy envision a future in which robotic vehicles drive around without any human presence required. The installation rate of AI-based systems in new cars is estimated to increase by 109% in 2025. Here are some of the top features of an autonomous car and the data annotation required to power them.

Vehicle In-Car Monitoring

In-cabin monitoring has gained wide popularity and seen significant advancement. In simple terms, in-cabin monitoring is the placement of cameras and vision systems inside the vehicle to monitor the driver and other occupants. In-car sensors can monitor occupants for levels of drowsiness and distraction by observing head and body position as well as eye gaze.

Example – The Mood Detector in Jaguar Land Rover identifies the smallest variations in the driver’s facial expressions and adjusts cabin comfort settings in real time.

Data Annotation: Keypoint annotation for detecting driver distraction and monitoring in-cabin behaviour
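
As a rough illustration, the sketch below shows what a single keypoint annotation record for in-cabin monitoring could look like, loosely following the COCO keypoint convention of (x, y, visibility) triplets. The keypoint names, coordinates, and the eyes_on_road attribute are hypothetical examples, not taken from any particular dataset or tool.

```python
import json

# Minimal sketch of a COCO-style keypoint annotation for in-cabin driver
# monitoring. Keypoint names, coordinates, and attributes are illustrative.
# Each keypoint is stored as (x, y, visibility): 0 = not labeled,
# 1 = labeled but occluded, 2 = labeled and visible.
KEYPOINT_NAMES = ["left_eye", "right_eye", "nose", "left_wrist", "right_wrist"]

annotation = {
    "image_id": 1042,                      # hypothetical cabin-camera frame
    "category": "driver",
    "keypoints": [
        512, 300, 2,                       # left_eye
        560, 298, 2,                       # right_eye
        536, 340, 2,                       # nose
        420, 610, 1,                       # left_wrist (occluded by wheel)
        0,   0,   0,                       # right_wrist (not labeled)
    ],
    "num_keypoints": 4,                    # keypoints with visibility > 0
    "attributes": {"eyes_on_road": False}, # example distraction label
}

with open("driver_keypoints.json", "w") as f:
    json.dump({"keypoint_names": KEYPOINT_NAMES,
               "annotations": [annotation]}, f, indent=2)
```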

Road Damage Detection

A key focus area that has captured the attention of automakers and safety regulators is improving road safety. Previous work on road damage identification tackled the detection and classification of individual damage types and classes, and only recently have some works addressed detecting multiple classes and instances in real time, which usually requires modern Deep Learning (DL)-based detectors or semantic segmentation schemes.

Data Annotation: Semantic segmentation and thermal image annotation for detecting road damage
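
For readers unfamiliar with segmentation labels, the short sketch below builds a per-pixel class mask for one road image and saves it as a PNG. The class IDs (road, crack, pothole), image size, and region coordinates are made-up values for illustration only.

```python
import numpy as np
from PIL import Image

# Minimal sketch of a semantic segmentation label for road damage.
# Class IDs are illustrative: 0 = road surface, 1 = crack, 2 = pothole.
CLASSES = {0: "road", 1: "crack", 2: "pothole"}

height, width = 720, 1280
mask = np.zeros((height, width), dtype=np.uint8)   # everything starts as road

# Hypothetical annotated regions: a thin longitudinal crack and a pothole.
mask[400:410, 200:900] = 1                          # crack region
mask[500:560, 600:680] = 2                          # pothole region

# Store the mask as a single-channel PNG; each pixel value is a class ID.
Image.fromarray(mask).save("road_damage_mask.png")

# Per-class pixel counts, e.g. for checking label balance across a dataset.
ids, counts = np.unique(mask, return_counts=True)
for class_id, count in zip(ids, counts):
    print(f"{CLASSES[int(class_id)]}: {count} px")
```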

In-Car Voice Assistants

Car manufacturers saw the trend of using mobile devices for navigation as an opportunity to improve the passenger experience and began integrating sophisticated voice-enabled assistants into their vehicles. Voicebot’s report revealed 13.7% growth in users of in-car voice assistants from September 2018 to January 2020. As voice assistant adoption continues to climb, so does the need for high-quality training datasets to enable a smooth experience.

Example – The new Lamborghini Huracán Evo leverages Alexa to control environmental settings including air conditioning, heater, fan speed, temperature, seat heaters, defroster and air flow direction, as well as lighting. The voice assistant’s AI can intuit what people mean from less direct requests too, turning on heat or AC when the driver says they are too hot or cold, for instance.

Data Annotation: NLP Services
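
As an illustration of the kind of NLP annotation involved, the sketch below tags one in-car utterance with an intent and a slot. The utterance, intent name, and slot schema are hypothetical and not drawn from Alexa or any other assistant.

```python
import json

# Minimal sketch of an intent/slot annotation for an in-car voice command.
# The intent and slot names are hypothetical labels, not a vendor schema.
utterance = "I'm too cold, turn the heat up to 22 degrees"

annotation = {
    "text": utterance,
    "intent": "SET_CABIN_TEMPERATURE",
    "slots": [
        {
            "name": "target_temperature",
            "value": "22",
            "unit": "celsius",
            # character span of "22" in the utterance, for token alignment
            "start": utterance.index("22"),
            "end": utterance.index("22") + 2,
        }
    ],
    # Indirect requests ("I'm too cold") can also be tagged so the model
    # learns to infer intent without an explicit command.
    "indirect_request": True,
}

print(json.dumps(annotation, indent=2))
```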

In-Vehicle Infotainment Systems

Car infotainment systems can connect with smart technologies such as smartphones, telematics devices, sensors, and more to provide a personalized experience and predictive services to drivers and passengers alike. In 2016, the in-vehicle infotainment systems market generated revenue of 33.78 billion U.S. dollars worldwide, a figure forecast to grow to over 52.2 billion U.S. dollars by 2022.

Example – Nuance Communications has announced that its Dragon Drive artificial intelligence (AI) platform for the connected car now features expanded conversational and cognitive capabilities for its Automotive Assistant, giving everyone in the car – drivers and passengers alike – the ability to ask for navigation, music, content, and other in-car features just by speaking, with no wake-up phrase or button press required.

Data Annotation: NLP Services
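
Building on the previous sketch, infotainment assistants like the one described above also need dialogue-level labels, since several occupants may speak across multiple turns. The example below annotates a short two-speaker exchange; the speaker roles, domains, intents, and slots are hypothetical labels chosen for illustration.

```python
import json

# Minimal sketch of a multi-turn, multi-speaker dialogue annotation for an
# infotainment assistant. All label names here are illustrative.
dialogue = {
    "dialogue_id": "cabin-session-0007",
    "turns": [
        {
            "speaker": "driver",
            "text": "Take us to the nearest charging station",
            "domain": "navigation",
            "intent": "FIND_POI",
            "slots": {"poi_type": "charging_station", "sort_by": "distance"},
        },
        {
            "speaker": "rear_passenger",
            "text": "And play some jazz back here",
            "domain": "media",
            "intent": "PLAY_MUSIC",
            "slots": {"genre": "jazz", "zone": "rear"},
        },
    ],
}

print(json.dumps(dialogue, indent=2))
```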

HD Maps for Localization 

By identifying a vehicle’s exact position in its environment, localization is a critical prerequisite for effective decisions about where and how to navigate. HD mapping uses onboard sensors (including GPS) to compare an AV’s perceived environment with corresponding HD maps. This provides a reference point the vehicle can use to identify, to a very precise level, exactly where it is located (including lane information) and which direction it is heading.

Example – HERE HD Live Map uses machine learning to validate map data against the real world in real time.

Data Annotation: LiDAR and radar annotation for object detection
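
To give a feel for LiDAR annotation, the sketch below represents one cuboid (3D bounding box) label and counts how many points of a point cloud fall inside it. The object values and the randomly generated point cloud are stand-ins for real sweep data, not output from any specific annotation tool.

```python
import numpy as np

# Minimal sketch of a LiDAR cuboid annotation: a 3D box given by its center,
# size, and heading (yaw) in the vehicle frame. Values are illustrative.
cuboid = {
    "label": "car",
    "center": np.array([12.0, -3.5, 0.8]),  # x (forward), y (left), z (up), metres
    "size": np.array([4.5, 1.9, 1.6]),      # length, width, height
    "yaw": 0.15,                             # heading around the z axis, radians
}

def points_in_cuboid(points, box):
    """Return a boolean mask of LiDAR points that fall inside the cuboid."""
    # Translate points into the box frame, then rotate by -yaw so the box
    # becomes axis-aligned.
    shifted = points - box["center"]
    c, s = np.cos(-box["yaw"]), np.sin(-box["yaw"])
    rot = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    local = shifted @ rot.T
    half = box["size"] / 2.0
    return np.all(np.abs(local) <= half, axis=1)

# Fake point cloud (N x 3) standing in for a real LiDAR sweep.
points = np.random.uniform(low=[-50, -20, -2], high=[50, 20, 4], size=(10000, 3))
print("points inside annotated box:", int(points_in_cuboid(points, cuboid).sum()))
```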

Advanced Emergency Braking Systems

Advanced safety systems using AI are being delivered in cars today, whether the customer asks for them or not. This safety feature is essential for fully autonomous vehicles, as it automatically stops the vehicle to avoid a collision. Research conducted by General Motors on Advanced Driver Assistance Systems with the University of Michigan Transportation Research Institute showed that Automatic Emergency Braking (or Forward Automatic Braking) with Forward Collision Alert reduced rear-end striking crashes by 46%.

Data Annotation: Annotation of vehicles, pedestrians, and other objects on the road
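
As a final illustration, the sketch below records 2D bounding boxes for the vehicles and pedestrians such a system must detect, and converts them to the normalized text layout used by YOLO-style training tools. The class list, image size, and pixel coordinates are illustrative rather than taken from a real dataset.

```python
# Minimal sketch of 2D bounding-box annotations for the objects an emergency
# braking system has to detect. Class names and coordinates are illustrative.
IMAGE_W, IMAGE_H = 1920, 1080
CLASS_IDS = {"vehicle": 0, "pedestrian": 1, "cyclist": 2}

boxes = [
    {"label": "vehicle",    "xmin": 820, "ymin": 540, "xmax": 1120, "ymax": 760},
    {"label": "pedestrian", "xmin": 300, "ymin": 500, "xmax": 360,  "ymax": 700},
]

def to_yolo(box):
    """Convert a pixel-space box to a normalized YOLO annotation line."""
    x_c = (box["xmin"] + box["xmax"]) / 2.0 / IMAGE_W
    y_c = (box["ymin"] + box["ymax"]) / 2.0 / IMAGE_H
    w = (box["xmax"] - box["xmin"]) / IMAGE_W
    h = (box["ymax"] - box["ymin"]) / IMAGE_H
    return f"{CLASS_IDS[box['label']]} {x_c:.6f} {y_c:.6f} {w:.6f} {h:.6f}"

# One annotation file per frame, one box per line.
with open("frame_000123.txt", "w") as f:
    f.write("\n".join(to_yolo(b) for b in boxes))
```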