Tips For Handling High-Speed Inspection Applications
by Winn Hardin, Contributing Editor - AIA Posted 11/17/2011
Machine vision suppliers and integrators have made an industry out of solving customers’ high-speed bottling, web inspection, and other applications, but what does high speed mean, and how do system designers solve these applications?
Before we tackle the question of designing a high-speed machine vision system, it helps to start with a definition of what “high speed” means. Most people think of high speed as an object or production line that processes hundreds of units per minute; moving along the line, the products can look like a blur to the human eye. But what really makes these applications “fast” isn’t the speed of the production line, but the processing requirements the production line places on the machine vision system.
First, let’s start with a general description of high speed. Any machine vision system that must run at rates above 30 fps can be considered a high-speed application, according to Luc Nocente, President and Founder of NorPix (Montreal, Quebec, Canada), provider of high-speed video recording software used for monitoring and troubleshooting machine vision applications.
Time is also a major consideration. For Advenovation (Brighton, Michigan) President Adil Shafi, acquiring and processing an image in less than 20 ms translates to high speed, implying frame rates ranging from dozens to hundreds of fps, but he adds that it’s not just about frame rates. “The unique considerations of high-speed machine vision systems include careful design of multiple aspects of the system, from image capture and digitization, through analysis, to reporting and decision-making commands or data for downstream action,” explains Shafi.
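Shafi’s 20 ms acquire-and-process budget translates directly into a minimum sustainable frame rate. A minimal sketch of that conversion (the 20 ms figure is Shafi’s; the function name is our own, for illustration):

```python
def max_frame_rate(budget_ms: float) -> float:
    """Frames per second sustainable when each frame must be
    acquired and processed within `budget_ms` milliseconds."""
    return 1000.0 / budget_ms

# Shafi's 20 ms budget implies a 50 fps floor for "high speed":
print(max_frame_rate(20.0))
```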
Teledyne DALSA’s (Waterloo, Ontario, Canada) Product Manager, Inder Kohli, takes all of Shafi’s design factors and simplifies them by comparing them to the most common computational platform: the PC. “An application’s ‘speed’ is really relative to the computing platform and its ability to process the data stream,” says Kohli. “The definition I propose is to think of high speed not as a static process, but a dynamic process that changes with time. If a normal PC is not able to keep up with the camera frame or data rate, then I refer to that as high speed. Back in the day, a PC couldn't handle 100 MB/s, so anything exceeding that threshold was high speed. Then PCI came around and 120 MB/s was the limit. Now, PCI Express can go 4 GB/s. But transfer isn’t the only definition of high speed. You have to be able to process all that data. Now, with 1-4 GB/s coming in, you’re forced to find other ways to manage the processing that usually involves a frame grabber and careful consideration of the camera.”
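Kohli’s point, that “high speed” is whatever exceeds the platform, can be made concrete by comparing a camera’s raw data rate against a bus ceiling. A hedged sketch: the camera resolution and frame rate below are illustrative, while the bus limits are the ones Kohli cites.

```python
def data_rate_mb_s(width, height, fps, bytes_per_pixel=1):
    """Raw camera output in MB/s (using 1 MB = 1e6 bytes)."""
    return width * height * fps * bytes_per_pixel / 1e6

# Illustrative 1-megapixel mono camera at 300 fps:
rate = data_rate_mb_s(1024, 1024, 300)
print(f"camera output: {rate:.0f} MB/s")

# Compare against the transfer limits Kohli mentions:
for bus, limit_mb_s in [("legacy PC", 100), ("PCI", 120), ("PCI Express", 4000)]:
    verdict = "high speed" if rate > limit_mb_s else "manageable"
    print(f"{bus}: {verdict}")
```

By this test the same camera is “high speed” on PCI but routine on PCI Express, which is exactly the moving-target definition Kohli proposes.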
Bottling and web inspections are two of the most common examples of high-speed applications, and NorPix has worked on both.
“We can record image data at up to 15,000 fps in software and from multiple cameras,” explains Nocente. “But troubleshooting a high-speed application takes more than looking at images. High-speed processes are highly dependent on associated systems in the production environment, which is why our system doesn’t just store time-stamped images, but any relevant sensor or equipment data, such as the camera’s temperature, gain and other common camera functions and settings, trigger signals with time stamps, and anything else the customer needs to solve the problem using their I/O and National Instruments DAQ modules as necessary.”
NorPix’s StreamPix5 and TroublePix have both been used recently to help a paper manufacturer understand why an automatic rolling cutting and replacement device wasn’t working properly during transfers, and why a French bottling line was ejecting bottles when running at maximum speed.
But as our working definition states, dynamic processes can pose the same high-speed challenges as a web or bottling line. “Real-time robot-to-vision coordination is a good example of a system that doesn’t always have to work fast, but does have to process images very quickly at certain points,” explains Advenovation’s Shafi. “One of the most famous examples is the University of Tokyo’s Ishikawa Oku Lab. The vision-guided robot can work with a ball extremely quickly, and the system uses a custom vision chip to do in hardware many of the traditional software functions that would normally be done by a PC host or other CPU. This is what makes the system so fast.”
The flat-panel industry is another application where product design changes are increasing demands on a machine vision processing system.
“Pixels on the flat panels are getting smaller, as are defect sizes, while takt and testing times remain constant or even get shorter to increase the line’s productivity,” says Teledyne DALSA’s Kohli. “Another element that adds to the amount of data that needs to be processed is color. Color was often not used because it taxed data throughput. Now we see very high-speed cameras that are color. Before, a 72 kHz, 4k or 8k line-scan camera would be considered high speed. Now, you can get RGB cameras at those line rates, generating three times the data. Teledyne DALSA sells cameras like these into many different industries, including food processing, flat panels, and electronics. There is a large and growing industrial base driving high-speed machine vision throughput.”
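The three-times multiplier Kohli describes is easy to put in numbers. A sketch assuming an 8k-pixel line-scan camera at a 72 kHz line rate with 8-bit pixels (representative figures, not a specific Teledyne DALSA model):

```python
def line_scan_rate_gb_s(pixels_per_line, line_rate_hz, channels=1):
    """Raw line-scan output in GB/s (1 GB = 1e9 bytes, 8-bit pixels)."""
    return pixels_per_line * line_rate_hz * channels / 1e9

mono = line_scan_rate_gb_s(8192, 72_000)              # ~0.59 GB/s
rgb = line_scan_rate_gb_s(8192, 72_000, channels=3)   # ~1.77 GB/s
print(f"mono: {mono:.2f} GB/s, RGB: {rgb:.2f} GB/s")
```

Moving from mono to RGB at the same line rate pushes the stream from roughly 0.6 GB/s to nearly 1.8 GB/s, squarely into the territory where Kohli says a frame grabber becomes necessary.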
Solving the High-Speed Problem
As Shafi alluded to earlier, solving a high-speed machine vision application requires careful consideration of each hard and soft component, as well as the interfaces in between.
“As the part moves faster, or your defects get smaller, the resolution of all the mechanical systems becomes more complicated and important, particularly encoders, because the accuracy of the image is directly related to the accuracy of the encoder information,” explains Kohli. “You have to consider all the mechanical aspects of the encoder: backlash, motion jitter, and other elements that affect image quality.”
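The encoder’s role can be illustrated with a back-of-the-envelope calculation: the line-trigger rate a web-inspection system needs follows directly from the web speed and the pixel footprint on the material. A sketch with illustrative numbers (not from the article):

```python
def required_line_rate_hz(web_speed_m_s, pixel_size_um):
    """Line-trigger rate needed so each scan line covers one
    pixel-sized strip of web (square pixels, no gap or overlap)."""
    return web_speed_m_s / (pixel_size_um * 1e-6)

# Illustrative: a 2 m/s web imaged at 100 um per pixel needs
# 20,000 encoder-derived line triggers per second:
print(f"{required_line_rate_hz(2.0, 100):.0f} lines/s")
```

Any jitter or backlash in the encoder shifts those triggers, which is why Kohli ties image accuracy directly to encoder accuracy.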
One might expect that separate I/O built just for encoders would be the best answer, but Kohli suggests that frame grabbers, such as Teledyne DALSA’s Xcelera line, are often better solutions because they offer tighter integration with fewer intermediary interfaces between the image data and encoders or trigger signals.
“Cameras have their own challenges at high speeds, such as heat dissipation and noise,” Kohli concludes. “High speed processing in smart cameras requires careful division of processing in hardware and software by streamlining and reducing data with hardware assisted processing.”
While high-speed machine vision challenges are not easy to solve – and often beyond engineers not steeped in vision technology and system design – the good news is that machine vision components, software, and design expertise are available to solve most, if not all, of industry’s high-speed data processing needs.