Unlocking the Potential of In-cabin Solutions: Emerging Use Cases & Evolving Data Needs

March 19, 2024

Despite advances in advanced driver-assistance systems (ADAS), human supervision remains indispensable for ensuring the safety of self-driving vehicles. In-cabin monitoring systems equipped with sophisticated cameras and sensors have emerged as crucial tools, complementing ADAS to ensure passenger safety, especially when vulnerable or unpredictable occupants such as children or the elderly are on board.

The global market for in-cabin monitoring AI was valued at USD 112 million in 2022 and is projected to reach USD 3,215 million by 2031, growing at a CAGR of 45.21% from 2023 to 2031. Factors such as safety requirements, regulatory mandates, and changing customer preferences are driving this strong growth.

For example, in 2020, the European New Car Assessment Programme (Euro NCAP) announced that Driver Monitoring Systems would be required for vehicles to achieve a 5-star safety rating starting in 2022. 

Moreover, surveys conducted by automotive manufacturers reveal high satisfaction among early adopters of in-cabin monitoring systems. Lexus, for example, reported a 94% satisfaction rate among customers who purchased vehicles equipped with its Driver Monitoring System.

These statistics underscore the growing significance of in-cabin monitoring systems in enhancing safety, convenience, and regulatory compliance in the automotive industry. Let us delve into in-cabin sensing systems and examine the critical data requirements for optimal functionality.

Types of In-cabin Monitoring Systems

In-cabin monitoring falls into two broad categories, Driver Monitoring Systems (DMS) and Occupant Monitoring Systems (OMS), but within them several types of systems are designed for specific purposes and employ different technologies.

Camera-based Systems

These systems utilize cameras strategically placed within the cabin to capture images and videos of drivers and occupants. They can detect driver attentiveness, monitor passenger behavior, and provide visual data for analysis.

Sensor-based Systems

Sensor-based monitoring systems rely on various sensors, such as pressure sensors in seats or steering wheels, to detect occupant presence and movements. These sensors can also monitor environmental conditions like temperature and air quality.

Biometric Systems

Biometric in-cabin monitoring systems use biometric data such as facial recognition, fingerprint scanning, or voice analysis to identify occupants and personalize their experience. They can also monitor physiological indicators like heart rate and stress levels.

Gesture Recognition Systems

Gesture recognition systems track hand movements and gestures made by occupants to control various functions within the cabin, such as adjusting the audio volume or activating voice commands.

AI-powered Systems

These systems utilize artificial intelligence (AI) algorithms to analyze data collected from cameras, sensors, and other sources. They can interpret complex behavior patterns, predict occupant needs, and provide proactive safety alerts.

Integrated Systems

Integrated in-cabin monitoring systems combine multiple technologies, such as cameras, sensors, and AI, to provide comprehensive monitoring and control capabilities. These systems offer enhanced accuracy and reliability by leveraging the strengths of each component.
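
As a simplified illustration of such fusion, the sketch below combines a camera-based presence confidence with a seat pressure reading to decide whether a seat is occupied. The data structure, thresholds, and function names are assumptions made for illustration, not any vendor's API.

```python
from dataclasses import dataclass

@dataclass
class SeatReading:
    """One seat's inputs from two independent sources (values are illustrative)."""
    camera_presence_conf: float  # 0.0-1.0 confidence from a vision model
    seat_pressure_kg: float      # load measured by a seat pressure sensor

def is_seat_occupied(reading: SeatReading,
                     cam_threshold: float = 0.6,
                     pressure_threshold_kg: float = 5.0) -> bool:
    """Naive sensor fusion: treat the seat as occupied if either source is
    confident, so the pressure sensor can catch occupants the camera misses
    (e.g., a rear-facing child seat blocking the view)."""
    camera_says_yes = reading.camera_presence_conf >= cam_threshold
    pressure_says_yes = reading.seat_pressure_kg >= pressure_threshold_kg
    return camera_says_yes or pressure_says_yes

# Example: camera partially occluded, but the pressure sensor detects a child.
print(is_seat_occupied(SeatReading(camera_presence_conf=0.3, seat_pressure_kg=14.0)))  # True
```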

Privacy-focused Systems

With increasing concerns about data privacy, some in-cabin monitoring systems are designed with privacy-enhancing features such as anonymization techniques, encryption protocols, and user consent mechanisms to protect the personal information of occupants.
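
One common anonymization technique is blurring faces before frames leave the vehicle or enter an annotation pipeline. The minimal sketch below uses OpenCV's bundled Haar cascade face detector purely as an illustration; a production system would likely rely on a stronger detector plus additional safeguards such as encryption and consent management.

```python
import cv2

# Load OpenCV's bundled frontal-face detector (a minimal stand-in for a production model).
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def anonymize_faces(frame):
    """Blur every detected face region so identities are not recoverable downstream."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(frame[y:y + h, x:x + w], (51, 51), 0)
    return frame

# Usage: anonymized = anonymize_faces(cv2.imread("cabin_frame.jpg"))
```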

In-cabin Monitoring Use Cases

To successfully train AI and machine learning models tailored to each use case, overcoming data challenges specific to in-cabin sensing/monitoring systems is essential. We’ll delve into data requirements in the following section, but for now, let’s explore some use cases.

Driver Monitoring – Safety and Driver Assistance

In-cabin monitoring systems track the activities and status of the driver, including:

  • Eye gaze and facial expressions to detect fatigue, tension, drowsiness, distraction, or even potential medical emergencies
  • The driver’s posture, head position, and seatbelt status

Different types of data, such as gaze detection, facial recognition, and head position and movement data, are required to answer questions like the following (a minimal sketch of a fatigue check follows the list):

  • Where is the driver looking?
  • Is the driver on a call, eating, or drinking while driving?
  • Is the driver fixated on a specific position, indicating fatigue or medical emergencies such as a stroke?
  • Is the driver constantly looking away from the road?
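
As a concrete illustration of how facial-landmark data can feed a fatigue check, the sketch below computes the widely used eye aspect ratio (EAR) from six eye landmarks and flags drowsiness when the eyes stay closed across consecutive frames. The landmark coordinates are assumed to come from a face-landmark model, and the threshold values are illustrative assumptions, not calibrated figures.

```python
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """eye: (6, 2) array of landmark coordinates around one eye, ordered
    outer corner, top-left, top-right, inner corner, bottom-right, bottom-left."""
    vertical = np.linalg.norm(eye[1] - eye[5]) + np.linalg.norm(eye[2] - eye[4])
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return vertical / (2.0 * horizontal)

EAR_THRESHOLD = 0.21      # below this, the eye is treated as closed (illustrative)
CLOSED_FRAMES_ALERT = 48  # roughly 1.6 s at 30 fps before raising a drowsiness alert

closed_frames = 0
def update_drowsiness(left_eye: np.ndarray, right_eye: np.ndarray) -> bool:
    """Call once per frame with landmarks from a face-landmark model;
    returns True when a drowsiness alert should fire."""
    global closed_frames
    ear = (eye_aspect_ratio(left_eye) + eye_aspect_ratio(right_eye)) / 2.0
    closed_frames = closed_frames + 1 if ear < EAR_THRESHOLD else 0
    return closed_frames >= CLOSED_FRAMES_ALERT
```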

Occupant Monitoring – Passenger Safety and Convenience

In-cabin monitoring systems serve not only drivers but also passengers, ensuring safety when vulnerable groups such as the elderly, children, or even pets are riding in a self-driving vehicle. Occupant monitoring systems monitor:

  • Passengers’ status, movement, posture, seatbelt status, and other conditions
  • Eye gaze and facial expressions to assess behavior and detect potential medical emergencies

For monitoring, the system requires cabin occupancy data and occupant classification, which are used to train AI and ML models and customize safety responses. Typical questions include the following (a minimal sketch of how per-seat detections might be summarized follows the list):

  • How many passengers are seated inside the car? Does it exceed the seating capacity?
  • Who are the occupants? Are children, elderly passengers, or pets on board?
  • Are any passengers or children sticking their hands or heads out of the window?
  • Is anyone left behind (e.g., a sleeping child), and are occupants in a safe posture during sudden stops?
  • Has everyone buckled up properly?
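
The sketch below shows how answers to these questions might be derived from per-seat detections. The occupant classes and field names are illustrative assumptions rather than a specific product's schema.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class SeatState:
    occupant_class: str      # "adult", "child", "pet", or "empty" (illustrative labels)
    seatbelt_fastened: bool

def occupancy_report(seats: List[SeatState], seating_capacity: int) -> Dict:
    """Summarize cabin occupancy and flag unbuckled or vulnerable occupants."""
    occupied = [s for s in seats if s.occupant_class != "empty"]
    return {
        "occupant_count": len(occupied),
        "over_capacity": len(occupied) > seating_capacity,
        "vulnerable_present": any(s.occupant_class in ("child", "pet") for s in occupied),
        "unbuckled_seats": sum(1 for s in occupied
                               if s.occupant_class != "pet" and not s.seatbelt_fastened),
    }

# Example frame: two buckled adults and one unbuckled child in a five-seat cabin.
print(occupancy_report(
    [SeatState("adult", True), SeatState("adult", True),
     SeatState("child", False), SeatState("empty", False)],
    seating_capacity=5,
))
```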

In-cabin Monitoring – The Road Ahead

In-cabin monitoring systems rely heavily on deep learning models, which require vast data for training to address diverse use cases. Each use case, from driver monitoring to passenger safety, might necessitate a unique data set to overcome specific challenges.

However, generating this data presents significant hurdles. First, creating a massive dataset involving thousands of images raises global privacy concerns. Second, ensuring the data is diverse enough to avoid bias and accurately represent real-world scenarios (ground truth) is crucial.

In-cabin monitoring has the potential to go beyond safety features. Can you imagine a car that adjusts its cabin temperature based on personalized preferences or the comfort of its occupants? In such cases, the system requires data such as the following (a brief sketch follows the list):

  • Cabin temperature and body movement data to adjust climate control for individual passengers
  • Facial recognition or fingerprint data to apply personalized settings as soon as an individual enters the car
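
As a minimal sketch of the personalization idea, the snippet below maps a hypothetical occupant ID, as returned by facial or fingerprint recognition, to stored climate preferences. The profile store and field names are assumptions made for illustration.

```python
# Hypothetical profile store keyed by an ID returned from facial or fingerprint recognition.
CLIMATE_PROFILES = {
    "occupant_042": {"seat_temp_c": 22.0, "fan_level": 2},
    "occupant_117": {"seat_temp_c": 25.5, "fan_level": 1},
}

def apply_personal_settings(occupant_id: str, zone: str) -> dict:
    """Return climate settings for a recognized occupant's seating zone,
    falling back to a neutral default for unknown riders."""
    profile = CLIMATE_PROFILES.get(occupant_id, {"seat_temp_c": 23.0, "fan_level": 2})
    return {"zone": zone, **profile}

print(apply_personal_settings("occupant_042", zone="front_left"))
```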

So the real question is: how do you train your in-cabin monitoring system, overcome data challenges, and fulfill the different data requirements of each use case? We must find ways to create robust datasets for these functionalities while maintaining a balance between functionality and the privacy of drivers and occupants.

Evolving Data Needs of In-cabin Monitoring and Data Annotation

As in-cabin monitoring systems advance, the data needs and requirements for effective operation are also evolving. Beyond simply capturing images or sensor readings, these systems now require more sophisticated data annotation techniques to extract meaningful insights. 

  • Annotating data involves labeling or tagging specific attributes within the collected data, such as identifying occupants, recognizing facial expressions, or classifying behaviors (a sample record is sketched after this list). 
  • This annotated data is the foundation for training AI and machine learning models to accurately interpret and respond to real-time events within the cabin environment. 
  • As in-cabin monitoring systems become more integrated with other vehicle systems and external data sources (such as GPS or weather data), the demand for diverse and high-quality annotated data increases. 
  • Ensuring accuracy, consistency, and diversity of annotated data is critical for robust in-cabin monitoring solutions that enhance safety, comfort, and overall passenger experience in autonomous vehicles.
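
To make these requirements concrete, the snippet below shows what a single annotated cabin frame might look like. The field names and label taxonomy are illustrative assumptions, not a standard annotation schema.

```python
import json

# One hypothetical annotated frame from an in-cabin camera (field names are illustrative).
annotation = {
    "frame_id": "cabin_000123",
    "timestamp_ms": 1710835200000,
    "occupants": [
        {
            "seat": "driver",
            "class": "adult",
            "bbox": [412, 180, 296, 340],  # x, y, width, height in pixels
            "gaze": "road_ahead",
            "expression": "neutral",
            "seatbelt_fastened": True,
        },
        {
            "seat": "rear_left",
            "class": "child",
            "bbox": [120, 220, 180, 240],
            "gaze": "window",
            "expression": "asleep",
            "seatbelt_fastened": False,
        },
    ],
}
print(json.dumps(annotation, indent=2))
```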

In-cabin Monitoring Data Annotation Technology Powered by the iMerit Ango Hub

iMerit tackles a critical challenge in developing Level 3 and Level 4 autonomous vehicles: producing the high-quality data annotations needed to ensure the performance and safety of these systems. Our in-cabin monitoring data annotation technology leverages the iMerit Ango Hub platform, powered by predictive AI models, to accelerate the in-cabin sensing data labeling process. On top of that, a dedicated team of human annotators actively reviews the data, acting as a crucial layer of oversight. This combined approach ensures the data captures even the most challenging scenarios (edge cases) for robust system training.
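
The workflow described above, where model-generated pre-labels are reviewed by human annotators, follows a common human-in-the-loop pattern. The sketch below is a generic illustration of that pattern; the function names and confidence threshold are assumptions, not the Ango Hub API.

```python
from typing import Callable, Dict, List, Tuple

def prelabel_and_route(frames: List[Dict],
                       predict: Callable[[Dict], Dict],
                       confidence_floor: float = 0.85) -> Tuple[List[Dict], List[Dict]]:
    """Generic human-in-the-loop pre-labeling: a model proposes labels, and
    low-confidence frames (likely edge cases) are routed to human reviewers."""
    auto_accepted, needs_review = [], []
    for frame in frames:
        prediction = predict(frame)  # e.g., {"labels": [...], "confidence": 0.62}
        record = {**frame, "prelabels": prediction["labels"]}
        if prediction["confidence"] >= confidence_floor:
            auto_accepted.append(record)
        else:
            needs_review.append(record)  # human annotators correct and approve these
    return auto_accepted, needs_review
```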

Learn more about our in-cabin monitoring data annotation solution: https://imerit.net/in-cabin-monitoring-data-annotation-technology-automation/

Are you looking for data annotation to advance your project? Contact us today.