A Quick Update on Vision Systems in Advanced Driver Assistance Systems (ADAS)

Advanced driver assistance systems (ADAS) in vehicles are relatively new but evolving quickly. They deploy a range of sensors that collect information about the vehicle’s surroundings, allowing the vehicle to react to certain situations autonomously. Vision systems in ADAS go beyond LiDAR or radar sensors, providing vehicles with critical contextual clues about the surrounding environment.

Most new vehicles include some form of ADAS, and vision systems play an important role in creating more advanced ADAS, enabling higher degrees of autonomy in navigation and operation.

Components of Modern ADAS Vision Systems

Vision-based ADAS in modern vehicles typically consists of two core components – a camera module, or embedded vision system, and ADAS algorithms. The embedded vision system itself typically consists of an image sensor, some form of image processing, a cable, and a lens module.

These compact cameras aren’t housed like traditional machine vision systems and are built specifically to integrate with other systems within the vehicle. Oftentimes, these vision systems are based on complementary metal oxide semiconductor (CMOS) sensors that achieve high resolution at fast frame rates for accurate capture and transfer of image data. The key feature here is an electronic global shutter, which exposes the entire frame at once and so captures sharp images at high speeds. Other sensors are often too slow, relying on rolling-shutter readout that exposes the frame line by line and distorts fast-moving scenes.
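The shutter distinction above can be made concrete with a small sketch. The simulation below (illustrative only; the readout time and object speed are assumed numbers, not figures from any real sensor) records where a fast-moving vertical edge appears in each image row. A global shutter exposes every row at the same instant, so the edge stays straight; a rolling shutter exposes each row slightly later than the one above it, so the edge skews.

```python
# Illustrative sketch: why rolling-shutter readout skews fast-moving
# objects while a global shutter does not. A vertical edge moving
# horizontally is recorded row by row; with a rolling shutter each row
# is exposed a little later, shifting the edge's apparent x-position.
# ROW_READOUT_MS and EDGE_SPEED_PX_PER_MS are assumed example values.

ROWS = 8                      # image rows, top to bottom
ROW_READOUT_MS = 0.05         # time to read out one row (rolling shutter)
EDGE_SPEED_PX_PER_MS = 40.0   # horizontal speed of the edge in the image

def captured_edge_positions(rolling: bool) -> list:
    """Return the recorded x-position of the edge in each row."""
    positions = []
    for row in range(ROWS):
        # Global shutter: every row exposed at t = 0.
        # Rolling shutter: row r exposed at t = r * ROW_READOUT_MS.
        t = row * ROW_READOUT_MS if rolling else 0.0
        positions.append(EDGE_SPEED_PX_PER_MS * t)
    return positions

# The global shutter records a straight edge (all offsets zero); the
# rolling shutter records a slanted one, about 2 px of skew per row here.
print("global :", captured_edge_positions(rolling=False))
print("rolling:", captured_edge_positions(rolling=True))
```

With these example numbers the rolling shutter shifts the edge roughly 2 pixels per row – exactly the kind of geometric distortion an ADAS algorithm cannot easily correct, which is why global-shutter CMOS sensors are preferred.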

ADAS algorithms are an equally important aspect of embedded vision systems. Most ADAS can not only capture images but also understand, to some degree, the context of the images being captured, which translates into autonomous reactions. These algorithms are developed specifically to recognize traffic lanes, signs, and potential sources of danger, among many other things.
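As a minimal sketch of one such building block – lane recognition – the snippet below thresholds bright lane-marking pixels in a tiny synthetic grayscale image and fits a straight line through them with least squares. This is a simplified illustration, not any production ADAS algorithm: real pipelines add camera calibration, perspective correction, temporal tracking, and robust estimators.

```python
# Simplified lane-detection sketch (assumption: a lane marking shows up
# as bright pixels against a darker road). We fit x = m*y + b through
# those pixels with ordinary least squares, using y (row) as the
# independent variable since lane markings run roughly vertically.

def detect_lane_line(image, threshold=200):
    """Fit x = m*y + b through pixels brighter than `threshold`.

    `image` is a list of rows (lists) of grayscale values 0-255.
    Returns (m, b): the fitted line's slope and intercept in pixels.
    """
    pts = [(x, y) for y, row in enumerate(image)
           for x, val in enumerate(row) if val >= threshold]
    if len(pts) < 2:
        raise ValueError("not enough lane pixels to fit a line")
    n = len(pts)
    sx = sum(x for x, _ in pts)
    sy = sum(y for _, y in pts)
    sxy = sum(x * y for x, y in pts)
    syy = sum(y * y for _, y in pts)
    m = (n * sxy - sx * sy) / (n * syy - sy * sy)
    b = (sx - m * sy) / n
    return m, b

# Synthetic 6x6 frame: a marking sliding one pixel right per row,
# i.e. a 45-degree lane line, so the fit recovers m = 1, b = 0.
frame = [[255 if x == y else 0 for x in range(6)] for y in range(6)]
m, b = detect_lane_line(frame)
print(f"slope={m:.2f} intercept={b:.2f}")
```

In a real vehicle this fitted line, expressed in calibrated road coordinates, is what downstream functions such as lane-departure warning would compare against the vehicle's position.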

Advanced ADAS Applications Enabled by Embedded Vision

As mentioned, vision-based ADAS provides important information that LiDAR and radar sensors cannot. This allows for far more advanced ADAS functionality. Common applications include:

  • Road sign detection
  • Pedestrian detection
  • Vehicle detection
  • Vehicle localization
  • Traffic lane detection
  • Emergency braking systems
  • Forward collision warning systems

These are among the most common ADAS applications enabled by advanced vision systems, though there are many other ways in which ADAS can be deployed.

Embedded vision systems are playing a critical role in the development of advanced ADAS, as well as the end goal of achieving fully autonomous vehicles, by delivering clear and accurate image data for complex ADAS algorithms to understand and respond to.

To learn more on this topic, register for our free webinar, “ADAS Systems & Autonomous Vehicles: The State of the Industry.”
