Physics Knowledge Helps Infrared Vision Be All It Can Be
by Winn Hardin, Contributing Editor - AIA Posted 09/12/2007
This is Part 1 of 2 in a series of articles on Infrared (IR) imaging. This article examines how integrators are using advanced filtering systems and contour geometry to enhance industrial IR inspection systems. Part II will look at the growing relationship between security and machine vision as illustrated by new data fusion, global coordinate registration, and visualization techniques that build on similar techniques used in robot vision.
Many machine vision systems that use infrared (IR) sensitive cameras follow a formula similar to that of a standard visible-light camera. A light source shines on an object, light is reflected, and a camera collects the reflected light. The image is then sent to a host, where image-processing algorithms extract data meaningful to a quality assurance routine.
But that formula only works with silicon and indium gallium arsenide (InGaAs) sensors for the short-wave and near infrared (SWIR, NIR) bands, which, because of their electromagnetic wavelengths, don't need special optics. These NIR and SWIR cameras typically operate much like visible cameras. Three more infrared spectral bands don't work with glass optics: mid-wave, long-wave, and far infrared (MWIR, LWIR, and FIR, respectively). Systems built on these cameras have their own methods of operation and their own advantages for security applications.
And beyond security, there are literally thousands of industrial infrared imaging applications that can reveal objects, defects, and imperfections beyond the capabilities of visible imaging – but only if you know how to set up the system, understand your physics related to spectral bands and material responses, and follow the latest trends in computing technology.
Luckily, the Automated Imaging Association (AIA) is distilling the latest IR vision knowledge, combining information from multiple presentations at its recent International Robots & Vision Show with additional interviews and discussions on thermal imaging. So if you work in glass, plastics, semiconductors, security, thin films, or a dozen other industries that commonly depend on vision systems, read on.
Making the Most of Your IR Band
Any discussion of IR imaging has to start with the spectral basics. Running from approximately 700 nm to 30,000 nm (30 microns) and beyond, the IR spectrum is broken down into five bands: near infrared (NIR) from 700 to 1000 nm (1 micron); short-wave infrared (SWIR) from 1 micron to 2.5 microns; mid-wave infrared (MWIR) from 3 to 5 microns; long-wave infrared (LWIR) from 7 to 14 microns; and far infrared from 15 to 30 microns. Sensors of different materials and structures detect each spectral band. Standard silicon used in CCD cameras is sensitive through the NIR. InGaAs is sensitive through the SWIR, while amorphous silicon and vanadium oxide are among the materials used in microbolometer arrays sensitive in the MWIR and LWIR. Mercury cadmium telluride (mercad) is sensitive in the LWIR and lead salt sensors are sensitive in the FIR.
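The band boundaries above can be captured in a small lookup table. The sketch below is an illustrative helper, not part of any camera API, that classifies a wavelength (in microns) into its nominal IR band using the approximate limits quoted in this article; note the gaps between bands (e.g., 2.5 to 3 microns) where atmospheric absorption makes imaging impractical.

```python
# Approximate IR band boundaries from the article, in microns.
IR_BANDS = [
    ("NIR",  0.7,  1.0),   # near infrared
    ("SWIR", 1.0,  2.5),   # short-wave infrared
    ("MWIR", 3.0,  5.0),   # mid-wave infrared
    ("LWIR", 7.0, 14.0),   # long-wave infrared
    ("FIR", 15.0, 30.0),   # far infrared
]

def ir_band(wavelength_um):
    """Return the band name for a wavelength in microns, or None if it
    falls in one of the gaps between the nominal bands."""
    for name, lo, hi in IR_BANDS:
        if lo <= wavelength_um <= hi:
            return name
    return None

print(ir_band(1.55))   # SWIR
print(ir_band(10.0))   # LWIR
print(ir_band(2.7))    # None (gap between SWIR and MWIR)
```

A table like this also makes a convenient first check when matching a sensor material to an application's wavelength of interest.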
The differences between pixel structures are beyond the scope of this article; suffice it to say that each sensor type captures infrared light differently from its neighbors. What we'll focus on instead are the vision applications served by each type of sensor and the ways to extract more meaningful data from each.
NIR, SWIR and Optics
One of the great differentiators in IR imaging is the absorption properties of optical glass. Silica-based glass, used to create visible optics, absorbs light beyond the NIR and SWIR. Systems operating in the MWIR and LWIR therefore use optics based on more expensive germanium, which is sensitive to moisture and must be coated to protect it from the pitting and damage caused by humidity, fog, rain, and the like. Germanium optics add significant cost to the imaging system.
InGaAs cameras in the NIR and SWIR work with less expensive glass optics, but their sensitivity to longer wavelengths also means that they respond to materials differently than visible cameras, according to Doug Malchow, Business Development for Commercial Products at SUI Goodrich Corporation (formerly Sensors Unlimited Inc.). Food processing applications use this ability to detect bruises below the surface of fruit, as do paint, thin film and coating operations for similar reasons. Recyclers also use NIR and SWIR to differentiate PET from PVC and PDM plastics, while bottlers use it to check glass containers coming out of a furnace. And there are many others.
In each case, the IR imaging system has to be spectrally calibrated to the application. As SUI’s Malchow suggests, one way to optimize a NIR or SWIR imaging system to an application is to use an imaging spectrometer to quickly divide the spectral band in question into narrow bands, and then measure the output from an IR camera to see which bands yield the most contrast for the application.
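Malchow's band-selection step can be sketched in a few lines. The code below is an illustrative assumption, not SUI's actual procedure: given a hyperspectral cube of shape (rows, cols, bands) from an imaging spectrometer, it scores each narrow band by RMS contrast (standard deviation over mean) and keeps the band that best separates the features of interest.

```python
import numpy as np

def best_band(cube):
    """Return the index of the band with the highest RMS contrast.

    cube: array of shape (rows, cols, n_bands), one image per narrow band.
    RMS contrast is an illustrative figure of merit; a real system might
    score contrast between known good and defective sample regions instead.
    """
    scores = []
    for b in range(cube.shape[2]):
        band = cube[:, :, b].astype(float)
        mean = band.mean()
        scores.append(band.std() / mean if mean > 0 else 0.0)
    return int(np.argmax(scores))

# Toy cube: band 1 carries far more spatial variation than bands 0 and 2.
rng = np.random.default_rng(0)
cube = np.stack([
    np.full((64, 64), 100.0),                        # flat, no contrast
    100.0 + 50.0 * rng.standard_normal((64, 64)),    # strong variation
    100.0 + rng.standard_normal((64, 64)),           # weak variation
], axis=2)
print(best_band(cube))   # band index 1 shows the most contrast
```

Once the most discriminating band is known, a production camera can be fitted with a fixed bandpass filter centered on it rather than carrying the full spectrometer.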
Malchow recommends paying particular attention to the lighting during spectral calibration. "We want the light to penetrate into the bulk of the materials we are trying to discriminate between," says Malchow. "The more the photons interact with the molecules of the substance, the more molecular absorbance will result. If light reflects just off the surface, the path length where light interacts with the substance is short and it has little effect on the reflected light. This is referred to as specular reflectance; it will be apparent when it occurs, as you will also observe mirror-like reflectance in the image. It shows up as extremely bright 'hot' spots or 'glints' of the light source and usually dominates the image. So our aim is to position the light sources at angles that do not reflect directly back into the camera. This can be quite challenging, especially if the surface being monitored is shiny and has many surfaces at a variety of angles. Using multiple light sources with diffusers to yield a general glow from many different points can help overcome the effects of specular reflectance."
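The glints Malchow describes are easy to flag programmatically during lighting setup. This is a minimal sketch, assuming an 8-bit camera and a 98%-of-full-scale threshold (both illustrative choices, tuned per camera): it reports what fraction of the frame sits near saturation, so the integrator can reposition lights until the number drops.

```python
import numpy as np

def glint_fraction(frame, full_scale=255, threshold=0.98):
    """Fraction of pixels at or above threshold * full_scale.

    A high value suggests specular reflectance is dominating the image
    and the light sources should be repositioned or diffused.
    """
    glints = frame >= threshold * full_scale
    return float(glints.mean())

# Synthetic frame: dim background with a 5x5 saturated specular hot spot.
frame = np.full((100, 100), 80, dtype=np.uint8)
frame[10:15, 10:15] = 255
print(glint_fraction(frame))   # 0.0025 -> 25 of 10,000 pixels are glints
```

In practice this check would run live while the lights and diffusers are adjusted, as part of the spectral calibration described above.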
Take Geometry into Consideration
Using IR light to penetrate an object's surface complicates the path of reflected light, which means the image depends on more than just the surface finish or shape. The depth of the coating and the distribution of material in the penetrated surface also affect the light ray paths, and therefore the image acquired by the sensor.
Dr. Mohammed A. Omar, Assistant Professor of Mechanical Engineering at the Clemson University-International Center for Automotive Research (CU-ICAR), illustrates how 3D geometry affects IR imaging through thermal inertia, diffusion, and emissivity, using a sample application that inspects a coated steel automobile fuel tank.
Curves and varying coating thicknesses on the surface of the fuel tank produced poor image contrast during thermal inspection; depending on the coating, visible light was even less helpful. To filter out the effects of object geometry, Omar found his system was more effective at isolating pinhole defects when it normalized emissivity across the image, based on geometric contours and environmental thermal contributions.
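The core of the normalization idea can be sketched briefly. The example below is a simplified illustration, not CU-ICAR's actual registration pipeline: the raw thermal signal is modeled as a per-pixel emissivity map (derived from the part's geometric contours) multiplying the true radiometric signal, so dividing the raw frame by the emissivity map flattens the background and lets a simple threshold isolate the defect.

```python
import numpy as np

def normalize_thermal(raw, emissivity, ambient=0.0):
    """Correct a raw thermal frame for per-pixel emissivity variation."""
    emissivity = np.clip(emissivity, 1e-3, 1.0)   # guard against divide-by-zero
    return (raw - ambient) / emissivity

# Synthetic example: a uniform part at 70 units with a pinhole defect at 100.
true_signal = np.full((50, 50), 70.0)
true_signal[20:22, 20:22] = 100.0        # pinhole-like hot spot

# Curved regions emit less, distorting the raw image the camera sees.
emissivity = np.full((50, 50), 0.9)
emissivity[:, 25:] = 0.6
raw = emissivity * true_signal           # background varies 63 vs. 42

corrected = normalize_thermal(raw, emissivity)
# Background is now flat at 70, so simple thresholding isolates the defect.
```

The hard part in practice, and the subject of Omar's work, is building the emissivity and contour map in the first place and registering it to the live image.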
The first image is a raw thermal image; the second shows the thermal image after traditional thresholding; and the third shows the benefit of registering the thermal image to an emissivity coefficient and geometric contour map. Images courtesy of CU-ICAR.
Security Visualization Takes Cues From Industrial Vision
Normalizing an image or set of images to a global coordinate system has been standard practice in visible imaging for many years, in web scanning, in high-resolution/wide-area semiconductor applications, and in vision-guided robotics, but the technique is also changing how IR security applications perform.
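At its simplest, registration to a global frame means each camera carries a calibration that maps its pixel coordinates into a shared world coordinate system, so detections from different sensors can be fused. The sketch below uses a 2D affine calibration as an illustrative stand-in for a real camera calibration (the scale and offset values are invented for the example).

```python
def to_global(pixel_xy, affine):
    """Map a pixel coordinate to world coordinates via a 2x3 affine matrix.

    affine rows are [a, b, tx] and [c, d, ty], so:
        gx = a*x + b*y + tx
        gy = c*x + d*y + ty
    """
    x, y = pixel_xy
    gx = affine[0][0] * x + affine[0][1] * y + affine[0][2]
    gy = affine[1][0] * x + affine[1][1] * y + affine[1][2]
    return (gx, gy)

# Hypothetical camera: 0.5 mm per pixel, mounted at world offset (100, 200) mm.
cam_a = [[0.5, 0.0, 100.0],
         [0.0, 0.5, 200.0]]
print(to_global((40, 60), cam_a))   # (120.0, 230.0)
```

A defect or target reported by two cameras then lands at the same world coordinate, which is the property that both wide-area inspection and multi-sensor security systems rely on.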
Coming soon, Part II of this series, "Security and Machine Vision Merge in the Infrared," will explore how the security industry is adopting and expanding machine vision techniques.