Autonomous Car Industry Comes Knocking on Machine Vision’s Front Door
by Winn Hardin, Contributing Editor - AIA Posted 08/25/2016
A year ago, many in the machine vision industry didn’t see a viable path for industrial imaging technology in the nascent autonomous vehicle market. Today, phone calls and inquiries from automotive OEMs, tier 1s and tier 2s are on the rise, insiders say, as the automotive industry works to develop low-latency imaging networks along with a slew of other sensing modalities for autonomous cars.
But the machine vision industry may not have to wait 10 or 15 years until autonomous vehicles are expected to truly enter the commercial market. According to Niall Bolster, Regional Sales Manager - EMEA at Pleora Technologies (Kanata, Ont., Canada), the ongoing movement from passive safety systems intended to protect passengers after a crash to active safety systems designed to prevent a crash in the first place could open big opportunities for the machine vision sector. “Automotive Advanced Driver Assistance Systems [ADAS] are finding increased use in the automotive industry,” Bolster says. “It’s the driving force behind what’s happening in today’s auto industry.”
Self-Driving Cars and Machine Vision
When it comes to safe vehicles, it’s hard to think of one safer than a U.S. Army tank.
Engineers at Pleora had already developed video interface solutions to connect imaging sources and display panels for real-time closed-hatch driving systems in military tanks when the autonomous vehicle craze started to gain momentum. Pleora’s Bolster was tasked with putting together a market investigation team. Bolster joined the machine vision industry in 2007, having spent the previous 15 years working on passive safety systems for the auto industry.
By coincidence, it also was around 2007 that the auto industry began to shift from passive safety systems to active systems. These are intended to protect the vehicle and its passengers before the crash happens.
Bolster says the next generation of ADAS will likely use multiple imaging devices, including sensors, cameras, LIDAR, and radar. These devices will be tasked with monitoring and eventually controlling everything from lane departure to parking.
A major challenge to reaching that goal, however, is the power and size constraints that must be addressed, particularly cabling, which can add considerable weight to a vehicle and negatively impact its fuel efficiency.
“We think Ethernet is coming to the automotive market in a more significant way,” Bolster says. But car manufacturers are accustomed to their own systems and ways of doing things. For example, many driver assistance applications today rely on point-to-point systems, such as a camera mounted at the front of the vehicle for lane departure warning. “Part of the challenge for automotive designers is understanding how to migrate from well understood point-to-point systems for basic driver assistance to a more complex multicasting environment that networks multiple imaging sources and processing systems for more autonomous applications,” Bolster says.
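The architectural shift Bolster describes can be sketched in a few lines of Python. This is an illustrative simulation only, not Pleora’s implementation or any automotive API; the class and function names are hypothetical. It contrasts a point-to-point link, where one camera is wired to exactly one consumer, with a multicast-style bus, where one imaging source feeds any number of processing nodes over a shared network:

```python
# Illustrative sketch (hypothetical names, not a real automotive API):
# point-to-point camera link vs. multicast-style imaging network.

class PointToPointLink:
    """One camera wired to exactly one consumer, e.g. a lane-departure ECU."""
    def __init__(self, consumer):
        self.consumer = consumer

    def send(self, frame):
        # Exactly one processing node ever sees the frame.
        return [self.consumer(frame)]


class MulticastBus:
    """One imaging source shared by many processing nodes on a network."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, consumer):
        self.subscribers.append(consumer)

    def send(self, frame):
        # Every subscriber sees the same frame; new ADAS functions can
        # tap the stream without additional point-to-point cabling.
        return [consumer(frame) for consumer in self.subscribers]


# Hypothetical consumers standing in for ADAS functions.
def lane_keeping(frame):
    return f"lane-keeping processed {frame}"

def parking_assist(frame):
    return f"parking-assist processed {frame}"

bus = MulticastBus()
bus.subscribe(lane_keeping)
bus.subscribe(parking_assist)
results = bus.send("frame-001")  # both functions receive the same frame
```

In the point-to-point model, adding a second consumer means adding a second camera and cable run; in the multicast model, it is a single `subscribe` call, which is the cabling-weight argument made above in miniature.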
But more and more, automotive industry players are realizing that a better solution could include a low-latency imaging network in the vehicle, providing a backbone for operational data from multiple sensors and cameras.
“So from that respect, we think ADAS applications will rely on networking multiple imaging sources, with a centralized processing system that can analyze and share data to various points in the vehicle,” he says. And that plays to a core strength of the machine vision industry.
Standards in Development
Advanced driver assistance systems rank among the fastest-growing segments in automotive electronics. Vehicular safety systems are covered in part by the ISO 26262 functional safety standard, while technology-specific standards, such as IEEE P2020 for image quality, and communications protocols, such as the Vehicle Information API, are under development.
“Some standards are being established and we’re participating in their development,” says Pleora’s Bolster. “The P2020 standard is now at a stage in development where it’s getting feedback on drafts from players within the industry and outside.”
While standards drive OEMs to adopt new safety or operational technologies, another driver is the consumer. For example, while standards require certain minimal safety systems for automobiles, consumer demand for higher impact ratings helped convince OEMs to invest more in safety-related system development. “Consumer awareness around impact ratings, for example, really drove OEMs to install passive safety systems,” Bolster says. The next step will be to push safety ratings to include active safety systems.
“Part of our research brings us to machine learning,” he says. And machine learning will play a larger role in ADAS and autonomous driving. For example, many drivers will get used to the vehicle taking over. But in a situation where the vehicle can’t handle all the parameters and hands control back to the driver, will people be able to respond to prevent a crash? “That’s one of the topics that are bandied about.”
The machine vision industry may now see a path forward to becoming integral to the move toward more encompassing driver assistance systems and ultimately self-driving vehicles. One hurdle that remains, however, is that machine vision is not well known within the automotive industry. A next step will be to partner with firms that are well established in automotive and are looking for expertise in machine vision solutions.
Automotive Manufacturing Hungry for More Machine Vision Solutions
While machine vision applications are at last making inroads into autonomous vehicles, the technology already has a “strong presence” in automotive manufacturing through applications such as 3-D bin picking, 2-D guidance, and vision-guided robotics, says David Dechow, Staff Engineer–Intelligent Robotics/Machine Vision, FANUC America Corporation (Detroit, Mich.).
He says the automotive industry is pushing machine vision developers to excel at cutting-edge 3-D applications, including more complex multi-camera systems and other 3-D guidance applications.
And while vision-guided robotics is the bread and butter of the automotive industry, Dechow adds that the industry’s interest in implementing machine vision for inspection hasn’t “waned in the slightest.”
A recent case study from Leoni Engineering, Products & Services (Lake Orion, Michigan) illustrates the point. Struggling with inspection of interlocking injection-molded parts for a high-end luxury vehicle, Fischer Automotive Systems, a manufacturer of injection-molded parts for auto interiors, turned to LEONI’s machine vision integrators for help.
The parts were manufactured on multiple injection molding lines and required exacting dimensional tolerances, repeatable shape and consistency, accurate interlocking features and subcomponents, validation of safety interlocking details, and seamless interaction of moving parts. LEONI built a machine vision inspection system using cameras from Keyence Corp. (Itasca, Ill.), 16 in all, including some with a resolution of 21 megapixels, according to Andy Reed, Technical Sales, Vision for LEONI. The off-line quality assurance bench is expected to eliminate a significant amount of warranty work.