Beginning Optics for Machine Vision
Part 1


Part of the AIA Certified Vision Professional-Basic program. Greg Hollows, Director of Machine Vision for Edmund Optics, teaches how to define the fundamental parameters of optical layout and how to balance a system's field of view, resolution, working distance, and depth of field.

Transcript

I am Greg Hollows, the Director of Machine Vision Solutions at Edmund Optics. I have degrees in chemistry and physics and fourteen years of experience in the industry, and I will be going over basic optics for the CVP courses today. Hello and welcome to the fundamentals of optics for machine vision for the AIA CVP courses. We will be covering a variety of topics here today, going into the area of optics. Optics is one of the interesting areas that sometimes gets overlooked when putting a machine vision system together. The reason is that it is considered the accessory part: in many cases people think the lens just comes with the camera, as it does in general consumer use. What you will find today is that the lens is actually doing signal processing; it links the lighting, which you are probably looking at as part of this course as well, and the sensor technology together, and using the right optics can go a long way toward greatly improving the performance you can get out of your system.

We will start off with the fundamental parameters of the system and then move into some more advanced topics that will let us understand how to actually specify components. We will start with the things you really need to think about the most when you are trying to specify a system. Then, in the second part of the discussion, we will move into things having to do with measurement accuracy, which revolve around distortion and concepts like telecentricity, which we will define. There are also overriding things relating back to resolution, and we will essentially talk about resolution in two different ways today: one is the limiting resolution an optical system can give you; in the middle section we will talk about depth of field, resolution, and contrast together; and at the end we will talk about measurement accuracy.

There are subtle differences between these when you are putting a system together, and there are different concepts and parameters that get in the way when you are choosing your optical components and sensor technologies. As we work our way through, I am going to give you lots of images and practical examples to help you understand the differences you can see in systems, and to separate marketing nomenclature from the real-world technology and physics involved.

So let's jump right in. Here is the list of topics we will cover today, and our goal is to go through each one of these areas. Each one of them, resolution, contrast, distortion, perspective error, and depth of field, relates back to some level of image quality in the system. The level of image quality you need for your particular application depends entirely on the requirements you have, so you will have varying amounts of each at play. If you are looking at something very close with very little movement, depth of field may have very little impact, while resolution and contrast have a lot to do with making the application work. If instead you are in a traffic or security application, you may need tremendous amounts of depth of field to see what is going on at different positions, and that might be the most important parameter. So we may be pushing and pulling on each one of these concepts to hit the image quality levels required for your application. We are going to step through each one independently, understand the things that push and pull on it, and see how it affects the others. You will find that sometimes, when you want higher and higher resolution, it starts to impact your ability to get good depth of field in the system, because the same lever you pull to get the resolution you want pulls the opposite way on depth of field. These are some of the concepts we are trying to push through today.

So, starting off: when we put an imaging system together, we think, from the optical perspective, about certain fundamental parameters that we want to understand and fold into the system so that we can effectively choose the right optical components. We will work through each of these separately: the sensor size in the camera, the working distance of the system, the resolution we are trying to hit, and the field of view we are trying to see. All of these play into what we should choose in terms of the optics and the sensor we require. Off to the side here is another quantity, what we call PMAG, the primary magnification. We will talk about it more later, but it tells us how the size of the sensor relates to the field of view that we actually see, and it gives us some straightforward calculations for the type of resolution we can extract from the system. It is fairly intuitive: as I make my field of view bigger and bigger, my ability to see fine details goes down, and PMAG directly captures this. Any fixed lens has a magnification for a given field of view and working distance; we will see those calculations in a few slides.

Let's start with probably the most basic parameter you come up against when putting a system together, called field of view. This is the actual area that you want to see. A couple of things to think about here: you might have a very large object with two areas of interest separated by a long distance. You can constrain your system to a large field of view that sees both of them with one camera, or, if you are only looking at those two areas of interest, you can use two systems with smaller fields of view. The benefit of the latter is that it will probably get you higher resolution and more capable results out of the system, since you see more detail in a given area; the downside of using two systems is the added cost. Ultimately the choice is in your hands.
What you really need to decide is: what is the most important area of interest I need to extract from the system to do this effectively and well? It is usually the one thing that, from a customer perspective, you have readily available: I need to look at an object of a given size, whether that is a remote control, some buttons on the remote, a car part, or anything like that. You usually come into an application knowing that detail, and understanding exactly what you need to see is hugely important.

Moving on to working distance: this is the distance from the object itself, at best focus, to the front face of the lens. It is not to be confused with total track, which would be the distance from the object to the back end of the system. In many applications working distance is not that strict a requirement; often there is a variable working distance that can be used in the system, so it is not an overriding constraint. But in some systems this can be one you really have to think about. Some things come to mind immediately. You might have illumination technologies that have to be employed at different angles; they might have to be very large, and they might actually have to sit between the object and the lens system, so you have to accommodate for that when choosing a lens and make sure you have the right working distance. In other cases you have to stand off due to the environment: it could be very hot or very caustic for the system, or the object could be inside some sort of chamber, so you need a very extended working distance. On the other end, people sometimes want the optics very close to the object because there is not enough room to fit anything else, so they take a small camera and a small lens, and inevitably they want to look at a very large field of view. This can be problematic from an optical perspective, because designs that hold the resolutions you may want are hard to do with very short working distances. A good rule of thumb is to be about 2 to 5 times away from the object you are looking at, in relation to its size; that gives you flexibility and room for a reasonable optical design that can hold the performance you expect. Again, that is a rule of thumb, not an absolute, but as you move in to very short working distances you start compromising on lighting and resolution, and the cost can go up very rapidly for optics that still perform at the levels you desire.

Let's look at a real-world example; I want to use a lot of images today to get these points across. What we are looking at here is the top of a beverage can, and it is the same can in both pictures. The customer was inspecting the printing on the top. The problem was that the line sometimes ran faster or slower, and the press work could stretch the printing or move it across the can's top. Their system used a standard 12 mm lens, a very normal, very popular focal length; these lenses give you a lot of variation in field of view with small changes in working distance, so they get used very often. With the lighting technique that was used, they were getting dark areas in portions of the illumination relative to the flatter background where the printing sits. When the printing slid aside, the print moved into those dark areas and could not be read. The result was that cans were being rejected when the system could not read and verify the date and lot coding: they were catching the bad ones, but good ones were getting tossed aside as well, which is not what was desired. By simply changing the working distance, we changed the angles at which the light entered the lens system and got a much flatter illumination profile. We only changed the working distance and the lens; the camera, the resolution, and the lighting are identical in both setups. So as the printing was stretched or compressed or moved around, everything in view could be read. It was a nice, easy solution; in this factory environment there was a tremendous amount of room to move the system back, so we could easily get the desired results.

If you take a close-up look at that printing, one thing to notice, though it is difficult to see on this screen, is the subtle detail: the scratches, watermarks, and so on that could not be tagged with the 12 mm setup because of the limited resolution it could achieve over that roughly 75 mm field of view. The longer focal length lens achieves higher levels of resolution. For this application that was not critical, but it is a good example of how making those variations, when you have the opportunity, gets you more detail. These lenses are both from the same family of products, at very similar price points; it is just a matter of changing some fundamental parameters to get a better level of performance, and you do not have to rip out the whole system, just make a slight modification.

Probably the most important parameter above all, at the end of the day, is resolution, and in most cases we mean the minimum detail that can be resolved by the system. We will find later that resolution is directly linked to contrast, but to begin, our goal is simply to ask: can I see one detail on its own? We will start there and work our way up to levels of gray, which is the more interesting area, where very low changes in an object's detail have to be picked out. Another parameter that you want to define when you specify a system is called depth of field: the ability to hold the desired resolution above and below best focus.
Where does that matter? In some applications, as I mentioned earlier, say a security application, you probably want a tremendous depth of field over a large range, to pick out details of something occurring at different distances away. In other applications, such as in the biomedical world with things like confocal microscopy, very low levels of depth of field are desired: you are trying to look at different slices of biological media, and the information right above and right below the slice can distort or remove information you are trying to see at best focus and reduce your image quality. Having very shallow depth of field is important in applications like that. So it all depends on the application, but you really need to define depth of field together with some sort of resolution requirement. Just saying "I want good depth of field in the system" sometimes is not enough; tying down both the depth of field and the resolution you are trying to hold is really important, and we will see later how these things interact with resolution at best focus.

The last fundamental parameter here is sensor size. Moving one step forward, sensor size relates to field of view through this concept of PMAG, the primary magnification, again the ratio of the sensor size to the field of view on the object. This lets us calculate the limiting resolution the camera can get off of the object. That is not the end of the story, but it lets us see where the limitations in the system are at a theoretical level before we start applying optics. You are going to get some amount of degradation: it is not automatic, but every time you add another piece to the system it can introduce some loss, and it is all about how well you can control that and get back as close as you can to your limiting factor, the Nyquist limit, which drives how well the system can work. One of the common mistakes people make is using the sensor alone to calculate their resolution; we will see images later showing different lenses on the same object with the same sensor yielding wildly different resolutions, so the number calculated from the sensor alone goes out the window if you do not have the right optics.

Here is a nice little summary that quickly recaps field of view, working distance, resolution, depth of field, and sensor size; refer back to it as necessary.

Now let's work through each of these in a bit more detail, starting with how we determine field of view and how we start choosing a lens: how do I get into the sweet spot of my choices and make sure I can get close to what the product needs? There are a couple of ways to do this: we can use the focal length and the angular field of view of the lens, or we can use the magnification. Each of those will be a different parameter on the specification sheet, but they are all essentially intertwined combinations of the same thing. If you have a focal length and an angular field of view tied together, the lens always has a magnification when you use it: there is always some ratio of the field of view you are seeing to the sensor size you are using. It is not always listed as a magnification number, but I will show you how to use these different quantities to calculate things in the system.

Here is one approach that covers probably 70 to 80% of applications and is fairly straightforward. When you look at a lens specification sheet, most lenses used in vision are fixed focal length in nature, and they will list something called the angular field of view: the angle that the lens sees out to infinity. These lenses project a cone from the front end of the lens, tracking the angular field of view outward; at some distance you get a circle, and the field of view is the size of that circle, which you can calculate in a fairly straightforward manner.

To go through the equation here: the things that you, as an end user, usually have are the field of view (FOV) and the working distance you want to be at; those are usually what you bring to the table. In many cases you might have a variable working distance, which gives you more flexibility. With these two pieces of information, you can use this equation to solve for the angular field of view needed from the lens, and then go through the datasheets saying, "I need a lens with this angular field of view." Or, if you already have a lens with a known angular field of view, you ask, "What is my field of view at this working distance?" You can run the math either way. The critical part is that, since we have a cone, we divide it in half to get a right triangle, which makes the math easy. The actual calculation is: the tangent of half the angular field of view equals half the field of view divided by the working distance. Given the field of view and the working distance, that yields the half angle, and the equation can be run in either direction.

Let's use an example: a 12 mm lens on a half-inch sensor has an angular field of view of 30.5° and we are at a working distance of 300 mm. We can plug everything into the equation and push out the answer. One of the critical things to remember when you do this is that at some point you cut the field of view and the angle in half; you have to double back up to get the full field, or everything will be off by a factor of two when you pick a product. Now, here is one more thing, at the bottom: if we actually take this from the design, the design says the field of view will be about 168 mm instead of the number the simple equation gives when the working distance is measured from the front of the lens.
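The right-triangle calculation above is compact enough to sketch in code. Here is a minimal Python illustration using the 12 mm lens example from the talk; the function name is my own, not from the course:

```python
import math

def fov_from_afov(afov_deg, working_distance_mm):
    """Full field of view from the lens's angular field of view.

    tan(AFOV / 2) = (FOV / 2) / working distance
    => FOV = 2 * WD * tan(AFOV / 2)
    The half-angle result must be doubled back up to the full field,
    or everything ends up off by a factor of two.
    """
    return 2.0 * working_distance_mm * math.tan(math.radians(afov_deg) / 2.0)

# Talk example: 12 mm lens, 30.5 degree angular field of view, 300 mm
# working distance. The simple equation lands near 164 mm; the
# as-designed value quoted in the talk differs by a few millimetres
# because working distance is really referenced to the principal plane,
# not the front face of the lens.
print(round(fov_from_afov(30.5, 300.0), 1))
```

Note that the equation runs both ways: given a required field of view and working distance, you can invert it to find the angular field of view a candidate lens must provide.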
That is because the actual optics where the work is being done sit at some point around the principal plane of the lens. That location is not easily extracted from a specification sheet; it is much easier to measure off the front housing of the lens. Also, the listed numbers are basically for infinity, as far away as we can get; as you move up close, they become only an approximation, so you want to leave yourself some space and wiggle room to make the solution work.

The other effective way of choosing a lens is actually much more straightforward: using the magnification. This is where, instead of an angular field of view, the specification sheet gives you a magnification, as with a microscope objective at 5x, 10x, 40x, whatever it may be. Higher-magnification optics, and things like telecentric lenses, will usually have a magnification associated with them as well, which comes down to the PMAG we have been discussing. The formula is a ratio, so it is very easy and very fast: the field of view equals the sensor size divided by the PMAG. Usually you have already chosen the sensor for your system and you know the field of view you need, so you can very quickly get to the magnification associated with a certain lens. Lenses that list a magnification generally will not give you an angular field of view, because they work at essentially one spot, so the angular field of view is not really one of the critical parameters for them.

Let's take an example: the camera has a half-inch sensor, and we have a lens with a 0.5x primary magnification. Our field of view equals the horizontal dimension of the half-inch sensor, which is 6.4 mm, divided by the magnification. The first thing you will notice is that 6.4 mm is not half an inch; we will discuss that in a moment. It is important to go with the actual data on the sensor, not the nominal dimension. Dividing 6.4 mm by the 0.5x, we get a field of view of 12.8 millimeters. Basically, the field of view is twice the sensor size because of the 0.5x magnification. If instead we had a 1x magnification, the sensor and field of view would be exactly the same size. This matters because it also lets us map pixel sizes back onto the object: if you are looking for resolution in the system, you are taking the field of view you are getting and mapping the pixels onto the object itself, and that is one of the critical steps in figuring out resolution. Keep this in the back of your mind; you will see how it all comes in as we determine the limiting resolution of the system.

Now, about the half-inch sensor not actually having anything half an inch about it: look at the array here. Half an inch is 12.7 mm, and nothing about the sensor measures 12.7 mm anywhere on this "half-inch" designation. This goes back to nomenclature applied during the days when vidicon tubes were being used, as people were switching over to newer sensor technology. It allowed people to put lenses on the new systems and get the same sort of fields of view they had with their old tubes. So it is pure nomenclature: half inch, two-thirds inch, one inch, whatever the number is, it has nothing to do with the actual dimensions of the sensor, and it can be very confusing. It is very important not to assume the nomenclature gives you the sensor size when you are calculating anything involving the sensor and the field of view. It gets really dicey: it is not listed here, but there are some designations that mix fractions and decimals at the same time, and it is hard to work out the size from them.
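The sensor-size-over-PMAG ratio from the example can be sketched in a couple of lines of Python (the helper name is my own):

```python
def fov_from_pmag(sensor_size_mm, pmag):
    """Field of view = sensor size / primary magnification (PMAG)."""
    return sensor_size_mm / pmag

# A "half-inch" sensor is 6.4 mm horizontally (the name is nomenclature,
# not a physical half inch). At 0.5x the field of view is double the
# sensor size; at 1x it matches it; at 2x it is half the sensor.
print(fov_from_pmag(6.4, 0.5))  # 12.8
print(fov_from_pmag(6.4, 1.0))  # 6.4
print(fov_from_pmag(6.4, 2.0))  # 3.2
```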
It is very good to refer back to manufacturer data sheets, where the actual dimensions behind the nomenclature are spelled out, like we show here, so you can figure out exactly how big the sensors are. This is a word of caution when choosing optics and lenses.

Let's take this a step further and compare two sensors: the one in the upper right is a one-third-inch sensor, and the other is a two-thirds-inch sensor. Here are some things to be aware of when choosing lenses for an imaging system. For the one-third-inch sensor, we might choose a lens, going along with our earlier calculations, that creates an image circle like the one shown. As you can see, the entire sensor is covered by that image circle, so it works really well. Now my camera manufacturer comes along and, as sensor technology changes, puts a new sensor in that camera, one that might even have the same resolution, and something happens that can be problematic: the first thing you notice is that we are not getting an image all the way into the corners. This is what a lens datasheet describes as the maximum sensor coverage for that lens. What is happening is a darkening of the corners, called vignetting, and it is a problem if you need resolution in the corners of the image. In the second picture you can see the image circle is now much larger to cover that bigger sensor.

Another thing to keep in mind is that we will see different sizes of field of view: we see more with the same lens on the bigger sensor. This goes back to the magnification being a ratio of sensor size to field of view: the lens has the magnification it has at the distance you are using it, so as the sensor gets bigger, the field of view gets bigger accordingly. These are the kinds of things where, if a manufacturer changes something a little bit, you can see slight differences when you are maintaining a system over the long term. Back to those darkened corners: what usually has to happen is that we move to a lens with a much larger image circle capability, so that it covers the corners and we do not get vignetting. That might not seem like a big deal, but sensors range anywhere from about 3 mm on a side up to 15, 20, even 30 mm, so there is a wide range of lenses to choose from; they might have similar parameters on paper but very different capabilities in terms of the image circle coverage they provide. The other thing you will find is that it is not easy to hold, across that bigger circle, all the resolution the sensor can deliver; it gets difficult because you are covering a larger and larger area, and that generally drives prices higher. So be aware, as you move to different sensors, that this can be a problem. The good part is that going to bigger sensors usually gets you better image quality. We discuss this more in the camera section, but if you have the same number of pixels and each pixel is bigger, as a rule you get better signal-to-noise capability, which gives better image quality and consistency. There is a balancing act with small pixels: they run you into something called the diffraction limit, which we will get into a little; it is an absolute limit on the resolution you can get, and if the pixels get too small, the optics probably cannot match them. Those are the gotchas to be aware of.

Having covered the fundamental parameters, let's move directly into resolution. At this point we are still living in a binary, black-and-white world where we can calculate what we can see, but we want to start understanding what the limiting resolution means in practical cases. In this example we have two different objects, both with the same size details; these are squares. In the left-hand example there is a little bit of space between them, but they are so close together on the object that when they image through the system they land on two adjacent pixels. Looking at the output of the camera, we will see one object that is two pixels in size, not two objects that are one pixel in size each: we cannot see the space between them. Over here we have a case where the squares are spaced a pixel apart, and we can see the empty space between them: theoretically we get an on-off signal pattern. In the optical world we call this a line pair. It lets us describe the on-off frequency of the cycling that is going on and do some practical analysis: at a certain frequency of line pairs per millimeter, we either can or cannot resolve a detail. We will use that much more later on, so keep it in the back of your mind for this first part of the discussion.

Let's take this a step further into the limiting resolution of the system. If I have a camera array with a thousand pixels across, and I image a target with a thousand lines across those thousand pixels, I will see one gray blur, not individual lines. To see lines, I need on and off: the limiting resolution of my system is only 500 line pairs. This is one of the gotchas: I bought this 1300-pixel-on-a-side, 1.3-megapixel sensor, and I think I can break my field of view into 1300 parts and see 1300 things, but in terms of resolvable line pairs the imager actually gives you half that. That is why in some cases you need more pixels than the naive math suggests; we will see later where this can be a problem. For now we are in a perfect black-and-white world, where the background is completely white and the foreground is completely black; we will start at this level and work our way out. The basic equations to keep in mind are straightforward: the frequency in line pairs per millimeter equals one divided by the line-pair spacing in millimeters, and a line pair equals two times the pixel spacing.
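Those two relationships can be sketched in a few lines of Python (function names are mine, not from the course):

```python
def line_pair_spacing_mm(pixel_size_mm):
    """A line pair (one on/off cycle) spans two pixels."""
    return 2.0 * pixel_size_mm

def frequency_lp_per_mm(spacing_mm):
    """Frequency in line pairs per millimetre is 1 / spacing."""
    return 1.0 / spacing_mm

# A 1000-pixel line resolves at most 500 line pairs: the Nyquist limit.
pixels = 1000
print(pixels // 2)                   # 500 line pairs

# With 0.01 mm (10 micron) pixels, a line pair spans 0.02 mm,
# giving a limiting frequency of 50 lp/mm.
spacing = line_pair_spacing_mm(0.01)
print(frequency_lp_per_mm(spacing))
```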
With those equations we can start mapping out and calculating what you can actually see, so let's go to an example that pulls together the things we have talked about. There are two different resolutions we talk about in a system: the object space resolution and the image space resolution. The image space resolution is what I can get directly off my sensor: I know the pixel size, I know how many pixels I have, and I can calculate everything from that. But in most applications what we are really interested in is the object space resolution: I have an object of some size, whether it is a remote control or something else, and I want to see some detail on it. How do I relate those two? It is actually very straightforward: the object space resolution in microns equals the image space resolution divided by the PMAG. Remember, the primary magnification is just the ratio of sensor size to field of view. The other way to look at it is that the object space resolution in line pairs per millimeter equals the PMAG times the image space resolution in line pairs per millimeter. At the end of the day we want to get back to the detail on the object: how big is the feature I need to see? It could be in microns, millimeters, or miles. And the related field of view, again, is the sensor size divided by the PMAG.

So let's walk through it. I will take a fictitious sensor; it does not exist, but it makes for straightforward base-10 math. The sensor size is 10 x 10 mm and the pixel count is 1000 x 1000. The number of line pairs across the sensor (remember, we need two pixels to make a line pair) is 500. That lets us express the system as a frequency: 500 line pairs over 10 mm of sensor, or 50 line pairs per millimeter, which is what we in the optical world get very interested in. The image space resolution we are able to get is 20 µm, the inverse of that 50 line pairs per millimeter. The other quick way to get there: you can tell right away these pixels are 10 µm in size (a 10 mm chip with a thousand pixels across), and the line-pair resolution is two times the pixel, so 20 µm. I keep coming back to line pairs because later on we will actually show you how to pair lenses with a required frequency; it is very important.

Now let's take that example into object space and see how these things push back and forth. The image space resolution on the same sensor is 20 µm. If I have a primary magnification of 2x, I will get a field of view half the size of my sensor; I am magnifying my object, so I will see more detail. Theoretically, the object space resolution is the 20 µm divided by the 2: 10 µm. That makes sense: I cut my field of view in half, so I get higher resolution; the math says I can see 10 µm detail, theoretically. The field of view is the sensor size divided by the PMAG: 5 mm. Flipping it around into frequency, the object space resolution comes out at 100 line pairs per millimeter. Now let's go the other way, with 0.5x: instead of making my field of view smaller, I have doubled it, so the image on the sensor is half the size of the scene. The image space resolution is still 20 µm, and the 0.5x gives me an object space resolution of 40 µm; my field of view has doubled up to 20 mm from the 10 mm sensor size. It is somewhat counterintuitive at first: as the field of view gets bigger, the resolution goes down; as the field of view shrinks, the resolution goes up.

This is the last thing we can directly calculate in these systems. There are many, many more slides we are going to go through, but this is the last number you can simply compute for the optical portion of an imaging system, and the reason is that optics cannot give you this exact number: there is going to be a level of contrast associated with it.
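The object-space trade-off just described can be sketched as a short Python example using the fictitious 10 mm, 1000-pixel sensor (function names are my own):

```python
def object_space_resolution_um(image_space_res_um, pmag):
    """Object-space detail = image-space resolution / PMAG."""
    return image_space_res_um / pmag

def field_of_view_mm(sensor_size_mm, pmag):
    """Field of view = sensor size / PMAG."""
    return sensor_size_mm / pmag

# The 10 mm, 1000-pixel sensor resolves 20 microns in image space.
# At 2x the field of view halves and the resolvable detail shrinks;
# at 0.5x the field of view doubles and the detail coarsens.
for pmag in (2.0, 0.5):
    print(pmag,
          object_space_resolution_um(20.0, pmag),
          field_of_view_mm(10.0, pmag))
# 2x   -> 10 micron detail over a 5 mm field
# 0.5x -> 40 micron detail over a 20 mm field
```

The same lever moves both numbers: shrinking the field of view through magnification buys finer object-space detail, and widening it gives that detail back.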
And you can't keep drilling that number down forever. Look at this one: the object space resolution is 10 µm, and you say, "I need to see ten times better than that — I need to see 1 µm detail." Theoretically you do the math, you go out and buy something with ten times the resolution on paper — those are available on the market — but it doesn't mean it will actually see 1 µm details effectively. You start running into walls: physics limits that you have to consider, and that's what the rest of this presentation ties together. The calculations themselves are as simple as what we just did, and putting an imaging system together at that level is very straightforward and easy; the refinements and subtleties are in everything else. So let's start moving forward a bit and look at field of view, resolution, and overall image quality, and at what's been happening with sensor technology as all these things come together. For the longest time, the standard for imaging was the 640×480 sensor — a very popular sensor size, readily available. That's roughly what we saw from fifteen years ago until maybe five years ago as the heavily used sensor, and it's still used a lot today. But we're seeing a ramp-up: one and two megapixel, five megapixel, six, sixteen, twenty-nine — all these sizes are out there, with lots of imaging capability. The whole goal, as we see here, is to take a given area and break it into smaller and smaller parts — a lot more detail — or to see a larger object at the same level of detail. The critical thing you'll find, though, is that a lot of optical components will limit that resolution, so the performance isn't fully maximized. Going from 640×480 — 0.3 megapixel — to two megapixel is a six-times increase in resolution in the system. A very popular sensor to sell at this point is the five megapixel sensor; that's roughly a sixteen-fold jump in
resolution over the original 640×480. What's happening is that as sensors have jumped up the food chain over the last four or five years, the optics that worked well in the old space — and held up fairly well at two megapixels — have seen this incredible jump go past their capabilities. You're not getting a match-up with the expectations set by the sensor, and not because of any limitation on the sensor side. Let's look at how these things happen. One real recommendation I'll make to anybody: if you have a system to put together, get some test targets. They allow you to benchmark your system, and there's a variety of targets I'll show today to help you understand how to use them to probe resolution in different directions effectively and get the desired results. A very, very popular one is the U.S. Air Force 1951 target. What it lets us do is see different frequencies in both the horizontal and the vertical, and you'll notice something nice about it: it spirals in toward the middle to higher and higher frequencies. The benefit of that is that with a zoom lens system you can zoom in and zoom out without having to move everything around, which is helpful. One of the drawbacks of this target, as you'll see, is that you can't check resolution in the center and in the corner at the same time, so it has its limits — but it's fairly straightforward to use, to read the information off of, and to run software against to see how well your contrast and resolution are rolling off. One other comment on that: you should always use software to tell you what your resolution and contrast levels are. Our brains are these wonderful image processors, and one of the worst parts about imaging is that our brains can see things that maybe aren't there. If we see a repetitive pattern going on and on and on, our brain knows to fill in the gaps — that's why those of us who wear glasses can still read things even when the resolution really isn't there, because our brains know the general
shapes of the words and numbers that we see every hour of every day. You never want your eyes and your brain pulling tricks on you, telling you the system works at a certain level — let the software be the judge. So the first thing we want to look at here: we'll start off using the same lens on two different sensors, and this lens is not limited by either of the sensors below it. What you see here is a 640×480 sensor — a 0.3 megapixel sensor. When the newer sensor came on the market, the nice thing about it was that all the pixels had been cut to half the size they were here, while the sensors themselves stayed the same size. The importance of the sensors being the same size is that we keep about the same PMAG, or primary magnification, using the same lens on both: I'm getting the same field of view, and I'm getting the same theoretical resolution from the optics. We're simply putting more pixels across the same field of view. So, pretty intuitively, you'd expect what you see: I can resolve everything on this target better here than there. It makes a lot of sense — I've got twice as many pixels horizontally and vertically here than I have there, so I'd expect to see more detail. Now, as you look at these targets and the numbers next to the lines, there are a variety of things going on here, and I'll take a brief aside on one of them in a moment. First off, the horizontal and vertical resolutions on this target are not holding up the same way — I can see the horizontal better than the vertical — because the horizontal lines are resolved by the vertical spacing of the pixels, and the vertical lines by the horizontal spacing. You'll notice that here the lines are all blurring together, as opposed to over here, and that's because this camera was an analog camera — this is an older setup. Analog cameras designed for North America use rectangular pixels, and one of the delightful things about them
is the difference in spatial resolution: the pixel has different dimensions in one direction versus the other. That creates problems. If you use something with a rectangular pixel for some reason — and they still exist on the market — and you calculate your resolution for details in one direction but you need to see a detail in the other direction, you have a problem: it can blur out on you. Something else to keep in mind: these were both color cameras, and there are color artifacts induced into the system. Most single-chip color cameras will show color artifacts when you do black-and-white imaging — at an edge, going from white to black, the color interpolation algorithms struggle a little bit and you get these color artifacts in the system. Keep that in mind; it also reduces your resolution a little for other reasons that we'll discuss in the camera sections of this program. So we can see the resolution gets better from here to here — that makes a lot of sense. What we want to do now, though, is take it a step further. That was a system where the lens was not the limiting factor — the sensor was, in both cases. Let's push further and use cameras of the same resolution — a two megapixel camera on both sides — looking at this generic UPS shipping label. What we want to do is look at a window close-up of the upper corner with each of these and see how well I can actually get the resolution I want out of the system. As we zoom in and look at them, you'll see a distinct difference in the image quality. Now, the two lenses that we used were both the same focal length, producing the same field of view at the same working distance, the same PMAG; everything else was controlled, and the lighting was identical. They are simply two different designs, two different products. What's intriguing about this is that the image on the left is from a lens listed as a "megapixel" lens on the market, and the sharper one is not. The
expectation would be that if you choose a megapixel lens, you get something that holds up to the resolution of that camera. The fact is, the megapixel lens's resolution does theoretically hold up to that camera; what we're getting into is contrast levels — contrast reproduction at a given resolution. If I've got 5 to 10% contrast over here and 30 or 40% contrast over here at the same resolution, I'll see the detail on one side and not the other. The detail is inherent in both; you simply have a different level of reproduction of that same resolution detail. And that's where things start converging: when we talk about resolution, we have to talk about resolution and contrast together at the same time, or we start to really struggle. Now, there's a variety of other things to think about. Both of these labels could be read by most software packages — the bars are fairly large objects, and the separation between light and dark is still workable — but there's a lower level of reliability on this side when you put a vision system together. And when you get down to the end of the day, one of the drivers here that I left out is cost: the difference in cost between these two lenses was about a hundred dollars, on what is basically a $15,000 system. For that you get a whole different level of reliability and repeatability — you can get much better value out of your system by doing it the right way. The hardest part about this, though, is that if you looked at the two datasheets next to each other, it would be very, very hard to tell the products apart, because all the general specifications — dimensions, angular field of view, distortion, resolution — look very similar, even the same. And that's the point to come back to: when you're starting to look at higher-resolution systems especially, go back and talk to your optical providers about how well a lens actually performs at those levels, not just nominally. We can all give out nominal information, but when tolerances come into play in the real world, in
some cases lenses won't hold up as well, and you get large amounts of variation in image quality due to lack of tolerance control. That's the kind of thing inherent in optics that can really get you, especially as you go to higher and higher resolutions. So, having kicked off with resolution and contrast together, let's go into more detail and understand what's going on. Go back to our main chart, where we talked about resolution — resolving lines — and contrast. What we want to do is tie the two things together into something called MTF, the modulation transfer function. Remembering the acronym is not really as important as understanding how it works; at the end of the day, that's really what we want to get you to here. The other part of it is that you can ask for an MTF curve on a lens, and where we're walking toward is understanding how resolution and contrast work together in the field, so you actually have some piece of data you can utilize to do that comparison — "is my lens going to perform as well as I expect it to?" — and an overall assessment of everything that factors into it. We'll see examples where two lenses give wildly different results, and where using one lens in two different positions flips the performance. That's one of those other things to keep in mind: how lenses are designed, and what they were designed for, drives a lot of the performance capability. So let's go back to a more simplified example and understand contrast for a little bit, to the extent of giving a real-world example. This is looking at gel capsules — pain medication, popular things you'd have in your home. What we're looking at is these four pills; in this scenario they are red, green, green, and red, and we're viewing them under different filtering techniques. The real-world application was this: as you look at the image, the grayscale levels are hard to distinguish visually, but from the line profile drawn across here, in this area, you can see
the subtle difference between the reds and the greens — about 20 grayscale levels, which is not unreasonable for some decision-making in a vision system. But here's how this went in practice. It was set up on a factory floor, the system was working fairly well, and every day the boss would walk over and stand over the line, and all of a sudden the measurements would start failing — everything failed in front of the boss. These things always work that way: the minute the head man or head woman shows up, whatever you're working on doesn't work the way you want it to, and it's as if it's the first time it's ever happened. That person would leave, and every day they'd come back to the same problem: "why does this thing only not work when I'm watching?" What it came down to: everyone in the clean room was in blue bunny suits, but the boss would come in and put on a white lab coat — he didn't gown all the way up in the suit — and when he came over, the subtle amount of lighting reflecting off the white coat he was wearing was enough to move all the threshold levels for what they were looking at. The system didn't have a high enough level of contrast. Now, there are ways to get around that — obviously an enclosure around the system to control the light, and all those other obvious fixes — but in this example, simply adding a filter to the system greatly increased the contrast levels between the things being inspected and eliminated the problem; variations in the light over time would also no longer affect what they were doing, giving them control inside the system. This is the kind of real-world case where, if object contrast is marginal and you don't take it into account, you get burned; if you build in big separations, you guarantee much better repeatability in the long run. And there was a simple fix in a system like this, short of building a full enclosure.

Adding a $30 filter made everything work. So that's using contrast at the big level; how about as we go after finer and finer details? Getting an image with a much better level of contrast matters there too — in this case, fingerprint analysis. For an application where getting a good read on whether this person is the bad guy is really, really important, the contrast level of very fine detail becomes critical as well. What we'll look at is how to keep as much as we can of the percentage of contrast that is in the scene. How we define it is the separation between the maximum and minimum levels of gray in the system. Very rarely, if ever, will you see a true black-to-white scenario, and in a sense you can't: if you push all the way to the white side in an imaging system, you bloom the sensor — too many photons hit it — and you're not getting accurate information around your edges, because charge overflows from those saturated pixels; and at the dark end you sink into the sensor's noise floor. Either way you lose the detail you want. Going to the absolute limits is a problem — the camera section discusses this in more detail — so you stay at some level inside them.
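That "separation between maximum and minimum gray levels" is usually written as Michelson contrast. A quick sketch — the gray values below are invented for illustration, not taken from the pill example's actual data:

```python
def contrast_percent(i_max, i_min):
    """Michelson contrast as a percentage: (Imax - Imin) / (Imax + Imin)."""
    return 100.0 * (i_max - i_min) / (i_max + i_min)

# A strong edge vs. a ~20-gray-level separation like the capsule example.
print(round(contrast_percent(240, 20), 1))    # -> 84.6
print(round(contrast_percent(130, 110), 1))   # -> 8.3
```

A 20-level separation that looks adequate on the bench leaves very little margin once lighting shifts or noise creeps in — which is exactly what the white lab coat did.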
The top trace — a perfect edge — is very hard to produce, and at the bottom, the inherent noise in the sensor is never going to let you get there either; something else to keep in mind is that it's not really a black-and-white world. Our goal from the optics perspective is to give you the highest level of edge transition possible, as close to that theoretical edge as the sensor system allows — the closest we can reproduce it. It gets capped at some point: sometimes the laws of physics come into play, sometimes it's the quality of the design, sometimes it's the manufacturing, and in many cases all three come together. You'll see how far an image deviates from those gray levels — from the perfect black-and-white you're really trying to get in most applications. Some things to keep in mind: resolution and contrast are very closely linked. The other thing is that in a typical imaging system, 20% contrast is already on the high side of usable, and you usually have upwards of 10% noise in the cameras and sensors. There are a lot of reasons for that, but if you ever bring up a low-light image and look at it up close, you'll see the flickering of the pixels — that's noise, it's there. At any given time that flicker can give you as much as 10% variation from pixel to pixel as you keep snapping images and looking at them, and that can be a problem. So when we start getting to the limiting level — say at a given resolution my system gives me 10% contrast — at any given time the camera can
flicker 10% noise into the system, and I can lose that detail. That matters for image placement and accuracy, and it matters when you're looking for very small details on an object. Keep it in mind, and you'll see that if you're looking at a test target with a camera and lens and there's blurring in an area which, when tested on another piece of equipment, actually shows 10% contrast, that's what you're up against. And this is the beautiful, tricky thing about being a human being: our eyes can see 1% or less contrast without really any problem. So one of the questions that comes up, and I hear it often, is "I can see that defect — why can't the camera see it?" The first part is that you know what you're looking for; the second part is that your eyes can pick up details the camera can't necessarily see. If you have to squint through a microscope to see it, in all likelihood a high-magnification inspection system with a camera on it will struggle; if you can only see a detail by constantly shifting and moving the light around until you barely catch it, a camera system will probably struggle too — probably one of the best rules of thumb to keep in mind for setting expectations correctly. So let's talk about what goes on as light passes through a lens, or an aperture of any type. A lens has an aperture to it, usually characterized by the f-number or the iris setting, and the light goes through it. We have these object spots out here in the real world with real hard edges to them, but in going through an aperture, each one gets spread into a blurred function — this Gaussian-like profile you see in the image here: bright in the middle, getting softer toward the edges. And no matter what we do — no matter how good the optics are — as the spots move closer and closer together, eventually they blur into one another. Go back to the example we had earlier with the boxes on the paper that were so close
together that they imaged onto adjacent pixels, and we had to separate them enough to resolve the detail. In the scenario here, let's assume the object detail is one pixel in size, as calculated. But when it goes through the optics — whether it's the aperture setting or the quality of the components — the blur can spread those two object lines into the pixel that was between them. I could end up with an object that looks three pixels in size, and I can't see the detail. This is where we start walking away from what I can theoretically calculate with my camera system — the limiting resolution — and actually losing detail out of the system because of the optical performance, before the signal-processing portion on the sensor side ever sees it. Everything I can do up front, ahead of the sensor, makes the image look better, and the software has an easier time going through and making things work out to get the details that we want to see. That's one of the critical things: we're going to look at all the different factors that drive this sort of problem, and how to get away from that extra blurring. So here's an example as we move toward this concept of MTF. We have these large black-and-white lines and these narrow ones. At the bigger spacing, in a lot of cases, the contrast between them can be very great indeed — this is at about 90% contrast. But if you notice, the response isn't purely black, white, black: in the middle of the white there's a shade of gray, and yet the separation is still 90% contrast. As you roll off to the finer and finer frequencies, that starts dropping down until you get to the 20% contrast lines, and you'll notice — this is fairly accurate — we go from basic black and white to subtle shades of gray. 20% contrast might sound like a lot, but it's not as much as you might be thinking, and once you're there it starts getting hard to reach and track things at that level. Keeping contrast as high as possible in the system is really important.
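The way two blurred lines merge can be sketched numerically. Assuming, as the discussion does loosely, that each line spreads into a Gaussian (the widths and spacing below are arbitrary illustration values, not measured data):

```python
import math

def gaussian(x, mu, sigma):
    return math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

def valley_depth(separation, sigma):
    """How distinct two blurred lines stay: ~1 = resolved, <= 0 = merged."""
    half = separation / 2.0
    # Intensity at one line's center vs. at the midpoint between the lines.
    peak = gaussian(-half, -half, sigma) + gaussian(-half, half, sigma)
    mid = 2.0 * gaussian(0.0, half, sigma)
    return (peak - mid) / peak

print(valley_depth(20.0, 4.0))    # mild blur: the valley survives (close to 1)
print(valley_depth(20.0, 12.0))   # heavy blur: valley gone, lines merge
```

With the blur width comparable to the line spacing, the midpoint comes out brighter than the line centers — the two details have become one blob, no matter what the pixel grid could theoretically resolve.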
So now let's step through to another target: a Ronchi ruling. What this target gives you is one and the same frequency across the entire field of view.
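Because a Ronchi ruling puts the same frequency everywhere, you can compare contrast region by region in a single capture. A minimal sketch, assuming the image is a list of rows of 8-bit gray values — the synthetic "image" and the region positions here are made up for illustration:

```python
def region_contrast(image, top, left, height, width):
    """Michelson contrast (%) over a rectangular region of a grayscale image."""
    pixels = [image[r][c]
              for r in range(top, top + height)
              for c in range(left, left + width)]
    i_max, i_min = max(pixels), min(pixels)
    return 100.0 * (i_max - i_min) / (i_max + i_min)

# Tiny synthetic capture: crisp bars on the left, washed-out bars on the right.
img = [[230, 10, 230, 10, 150, 90, 150, 90] for _ in range(4)]

print(round(region_contrast(img, 0, 0, 4, 4), 1))  # "center" region -> 91.7
print(round(region_contrast(img, 0, 4, 4, 4), 1))  # "corner" region -> 25.0
```

Real measurement tools fit the bar profile rather than taking a raw max/min, which is noise-sensitive, but the region-by-region comparison is the same idea as the center, bottom-middle, and corner numbers quoted below.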

We talked about how the U.S. Air Force target only shows resolution at discrete spots — we can't check the center and the corner at the same time. This one says: if there's a specific resolution I want to verify, I can see it everywhere at once — the whole field of view at one time — and check that I get the resolution and contrast levels I want to see. So what we'll do is walk through a couple of different lenses and see how they perform across the field of view. That's one of the other things we talked about with resolution: when you buy these cameras with high-resolution sensors, you want to get that resolution everywhere in the field, and understanding what's going on there is really important. As we go through this, you'll see some subtle things happen — some of the classic trade-offs of lens design, made to control for price and performance among other things — and we'll see some of those come out here as we go through. So we'll review and analyze each system in the center, in a close-up of the bottom middle, and then in the corner. What we did here is pick a frequency that was four pixels on and four pixels off for these lines, so we're at a fairly low resolution — we're not approaching the limiting resolution of the camera, which would be one pixel on, one pixel off. By the way, if you get a chance to see this in the original PowerPoint, don't rely on the enlargements — thank you, Microsoft, for the "enhancements" that make images look cleaner when you blow them up — the actual mathematical content of the raw images is four on and four off. You'll notice some of the effects in the peaks and valleys of the graphs, and we'll also look at the contrast levels for each one as the image artifacts come out. We'll look at three different lenses. On this first one, with the four-on, four-off target, what we see is about 59% contrast in the middle, 56% in the bottom middle, and 62% in the corner — it actually bounced up a little bit
more in the corner. What we'll also see here is a shift in intensity level from the center, which is the purple trace, to the light purple, which is the bottom middle, to the corner. A couple of things go on there. As you go out across the sensor, you see roll-off: the lighting levels can't be maintained across the sensor unless you're using a very special telecentric lens, and there are some other things that optical designers can do to help bring up relative illumination, or that reduce it — we'll talk about that a little bit. But understand the different light levels across your system: if you have low levels of contrast, go back to the software. Usually you're thresholding everything out above a certain level and below a certain level. Yes, you can change the threshold across your field of view, but if you have wild changes in lighting together with low levels of contrast, it's very hard to make those thresholds hold all the way across. If you're just cutting at fixed levels above and below, and you get wild changes with a delicate overlap, you can cause yourself problems — something to keep in mind along with the contrast levels themselves. Now look at lens number two — same focal length, same f-number, same field of view. In this lens, with those same specifications, we're down to 47%, 42%, and 37% going across the sensor. I also want to draw your attention to the flatness of the lines as we look across the graph. This is the same target all the way across, so we're actually seeing how parallel the lines are — the distortion in the system; we'll talk about distortion later. You can start to see, going through here, that we're getting distortion in the system: these lines are being bent more and more. One of the interesting things between these two lenses, just as a side note: the first one's datasheet says it has 3% distortion, and this one says it has 1%, even though the lines are more bent on this one. There are some subtle things in the way distortion gets specified on
products, which we'll get into later, and they're why you can be seeing some of these effects that you might not be predicting. The final lens we want to look at here comes in at 52% contrast in the middle, 22% in the bottom middle, and 36% in the corner — a very interesting thing is going on here. When we design a lens system, we try to put all the rays in the right place and control for them so that things land where they should. So here's my lens and here's my object, and we're looking at points here and here. Rays come off a point and land back on my sensor; take the point in the center of my system, zoom in on that spot, and say this is a pixel. What I actually get is rays that do this — they don't form a perfectly solid image point. One of the problems we have is that we're not actually getting all the light of the signal into the spot it belongs in: the spot might not land entirely on that pixel, and what ends up happening is some of it makes it onto the next pixel, which causes blurring in the system — detail that was supposed to be in one spot lands in the wrong one. What I can do about that is artificially put in an aperture to block out those errant rays. What I gain is that the light now goes to the right spot — I haven't necessarily taken a reduction in resolution. What I have done is reduce the rays making it to the back end of the system, so the illumination drops. And that's what we're actually seeing here in this scenario: the resolution drops from the middle to the bottom-middle spot, but it comes back up in the corner. What's occurred is that the design starts clipping rays toward the corner — my resolution comes back up because I'm getting rid of the rays going to the wrong location, but I'm literally losing light. In this scenario, the degradation starts in the middle zone of the image before you get to the corner, which is why it's very important to look at a variety of spots across the image to guarantee resolution. This is a great example of where we get good resolution
and contrast in the middle, and decent in the corner, but we actually drop in between — and if that region is too low for what you're trying to do, and your objects land there, it can be a big problem. So using your targets, hardware, and software correctly to evaluate the whole system effectively can be highly important to making your system work the way you want. This is something called selective vignetting. Vignetting is the clipping of rays as they get out toward the corner of the system; by selectively clipping them off, the designer controls resolution. What that allows is maybe a more cost-effective design, or one with fewer elements, that still gets the contrast where it's needed — but you're making a trade-off in terms of the overall illumination profile and, in this case, resolution in one portion of the image, to get the job done. As I said, it's about balancing everything you see in the system. You might not notice unless you're driving to these high levels of detail and resolution, but with the high-resolution cameras now on the market, and rising resolution requirements, this can be something you really have to pay attention to, and it can be problematic if not addressed correctly. Here's an example looking at all three of those at one time and seeing all the different details; you can start to pick out the subtle differences you'd see in a system. Going back to that last lens: the difference between 22% contrast and 36% is a big difference — it looks like much more than the step down from 52% — and you can see how quickly things get problematic with the details you want to see. And again, this is a fairly low frequency relative to the limiting resolution of your camera system; imagine what it would look like at that limiting resolution. At this point, this is the end of part one of our hour today; we'll pick it back up with part two, where we'll start moving into MTF curves.
