Embedded Vision Systems to Combat Fatigue & Human Error

The world is watching and waiting for fully autonomous vehicles to start transporting people with no input from a driver. Until then, technology is still working to improve driver safety. In 2020, Euro NCAP (the European New Car Assessment Programme) will require any manufacturer wishing to earn a 5-star safety rating to install a driver monitoring system. These systems integrate deep learning and embedded vision to understand the state of the driver.

Human error and fatigue are often to blame for collisions, and embedded vision systems are being used to help reduce these driver-related risks. Such systems range from a single camera to arrays of vision and biometric sensors that judge whether a driver is ready to take the wheel.

Embedded Vision Systems Adapt

Embedded vision systems detect data such as the driver’s gaze, pupil dilation, eye openness and head position. Then, in real time, the system must use all those factors to gauge drowsiness or attention. If the driver isn’t paying attention, the system can beep or vibrate the steering wheel to bring the driver’s attention back to the road.
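To make the idea concrete, here is a minimal Python sketch of how per-frame measurements such as eye openness and head pose might be fused into a drowsiness decision. The PERCLOS-style scoring, thresholds, and window length are illustrative assumptions, not a description of any particular vendor's system.

```python
from collections import deque

# Illustrative thresholds -- real systems tune these per driver, camera, and mounting position.
EYE_CLOSED_THRESHOLD = 0.2   # eye-openness ratio below which the eye counts as closed
PERCLOS_ALERT_LEVEL = 0.4    # fraction of recent frames with closed eyes that triggers an alert
WINDOW_FRAMES = 90           # roughly 3 seconds of history at 30 fps


class DrowsinessMonitor:
    """Tracks eye openness over a rolling window and flags likely drowsiness (PERCLOS-style)."""

    def __init__(self):
        self.history = deque(maxlen=WINDOW_FRAMES)

    def update(self, eye_openness: float, head_pitch_deg: float) -> bool:
        """eye_openness: 0.0 (closed) to 1.0 (wide open); head_pitch_deg: downward head tilt."""
        self.history.append(eye_openness < EYE_CLOSED_THRESHOLD)
        perclos = sum(self.history) / len(self.history)
        head_dropped = head_pitch_deg > 20.0  # a nodding head is another drowsiness cue
        return perclos > PERCLOS_ALERT_LEVEL or (head_dropped and perclos > 0.2)


# Example: feed per-frame measurements from the camera pipeline.
monitor = DrowsinessMonitor()
if monitor.update(eye_openness=0.1, head_pitch_deg=25.0):
    print("Driver appears drowsy -- trigger audible alert or steering-wheel vibration")
```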

This analysis isn’t just based on a single image. Advanced algorithms detect if a driver is exhibiting specific behaviors that could lead to sleep or error. The computer vision algorithms differentiate between a wide range of faces and expressions. Since drivers also show signs of fatigue differently, this requires extensive deep learning so embedded vision systems can make intelligent predictions and provide accurate feedback to the driver.
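As a rough illustration of the deep-learning side, the sketch below defines a deliberately tiny convolutional network that classifies a single face crop into a driver state. The architecture, input size, and class labels are hypothetical stand-ins; production networks are far larger and trained on extensive, diverse datasets of faces and fatigue behaviors.

```python
import torch
import torch.nn as nn


class DriverStateNet(nn.Module):
    """Toy CNN standing in for a production driver-state classifier."""

    def __init__(self, num_classes: int = 3):  # e.g. alert / distracted / drowsy
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)              # (N, 32, 16, 16) for 64x64 inputs
        return self.classifier(x.flatten(1))


# Inference on a single 64x64 grayscale face crop (random data here, purely for illustration).
model = DriverStateNet().eval()
face_crop = torch.rand(1, 1, 64, 64)
with torch.no_grad():
    probs = torch.softmax(model(face_crop), dim=1)
print(dict(zip(["alert", "distracted", "drowsy"], probs.squeeze().tolist())))
```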

Systems must also adapt on the fly. The emotional state of the driver can change and affect analysis of the images. Also, drivers may put on glasses, hats, or other items that affect facial recognition. The system must be able to adjust to low-light conditions, sometimes rapidly due to changing weather or passing through tunnels. The system also can’t be compromised by bumpy roads or other changing road conditions.
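One piece of that on-the-fly adaptation is illumination normalization before any face analysis runs. The sketch below uses OpenCV's contrast-limited adaptive histogram equalization (CLAHE) on a grayscale frame; the parameters and the gamma heuristic are assumptions for illustration, not a specific product's pipeline.

```python
import cv2
import numpy as np


def normalize_illumination(gray_frame: np.ndarray) -> np.ndarray:
    """Rough per-frame illumination compensation for a grayscale cabin-camera frame.

    CLAHE spreads out local contrast so facial features stay detectable when
    lighting drops suddenly, for example when entering a tunnel.
    """
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    equalized = clahe.apply(gray_frame)

    # Simple gamma correction based on mean brightness, as a stand-in for the
    # camera-level auto-exposure a production system would also rely on.
    mean = equalized.mean() / 255.0
    gamma = 0.6 if mean < 0.3 else 1.0
    table = ((np.arange(256) / 255.0) ** gamma * 255).astype(np.uint8)
    return cv2.LUT(equalized, table)


# Example with a synthetic dark frame.
dark_frame = (np.random.rand(480, 640) * 40).astype(np.uint8)
print(normalize_illumination(dark_frame).mean())
```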

Added Benefits from Computer Vision

Embedded vision technology has benefits beyond alerting drivers to sleepiness. Using the same system, a car’s computer can understand how to deploy an airbag more safely. The features aren’t limited to safety, either: computer vision systems can automatically adjust a driver’s seat and mirrors or select the driver’s favorite playlist.

Similar technology is being developed for aviation to help detect drowsiness among pilots. One safety kit under development combines smart cameras with wearable electronics to keep those aboard the aircraft safe. The addition of physiological parameters provides another layer of safety and gathers data to improve the system.

To meet the demands of low processing power, real-time analysis, and user privacy through edge computing, in-vehicle computer vision systems often rely on 2D infrared sensors. But robust 3D systems with enhanced capabilities are also available and are improving in the areas where 2D systems currently hold an edge.
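The privacy argument for edge computing can be made concrete with a small sketch: frames are analyzed on the device and then discarded, and only a compact alert event is ever passed on. Everything here, including the analyze_driver_state stand-in and the 0.8 confidence threshold, is a hypothetical illustration of the pattern rather than an actual in-vehicle API.

```python
import time
from dataclasses import dataclass
from typing import Optional


@dataclass
class AlertEvent:
    """The only data that leaves the vehicle in this sketch -- raw frames never do."""
    timestamp: float
    kind: str          # e.g. "drowsiness" or "distraction"
    confidence: float


def analyze_driver_state(frame):
    """Hypothetical stand-in for the on-device 2D-infrared (or 3D) inference model."""
    return "drowsiness", 0.9


def process_frame_on_edge(frame) -> Optional[AlertEvent]:
    """Run inference locally, discard the frame, and emit only a compact event."""
    state, confidence = analyze_driver_state(frame)
    if state != "alert" and confidence > 0.8:
        return AlertEvent(time.time(), state, confidence)
    return None


# Example: a frame comes in from the cabin camera; only an event (or nothing) comes out.
print(process_frame_on_edge(frame=None))
```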

Learn more about the transformative role of Embedded Vision in the Automotive Industry by visiting our embedded vision informational section at Vision Online.
