Machine Vision Application Analysis and Implementation – Part 4
by Nello Zuech, Contributing Editor - AIA Posted 02/27/2001
This is the fourth in a series of articles designed to provide the framework for a successful machine vision system installation. The process described targets companies planning to adopt a machine vision system for the first time, or those with a unique application that no one has previously attempted to implement.
As observed in Part 1, today one can find many application-specific machine vision systems for somewhat generic applications in many manufacturing industries. Purchasing these 'off-the-shelf' solutions poses little risk to any first-time buyer. In some cases, one can find application-specific software developed by suppliers of general-purpose machine vision systems, imaging board/frame grabber suppliers, software suppliers, or merchant system integrators. While these are not turnkey packages, the embedded vision experience makes them less risky. Examples of these packages include alignment, OCR/OCV, LCD/LED inspection, BGA inspection, etc.
Even less risky are the turnkey machine vision systems that are industry specific; e.g. bareboard or assembled board inspection for the electronic industry, overlay registration/critical dimension inspection for the semiconductor industry, various industry specific web scanners, systems targeted at food sorting, etc. Virtually every manufacturing industry has these systems, many of which can be identified through the resources of the Automated Imaging Association.
Where these 'solutions' are not the answer and a machine vision application has been identified, success requires proceeding systematically and not treating the purchase as if one is purchasing a commodity item. It is not sufficient to send good and bad parts to various vendors and ask them if they can tell the difference.
Table 1 - Application Analysis and Implementation
- Know your company
- Develop a team
- Develop a project profile
- Develop a specification
- Get information on machine vision
- Determine project responsibility
- Write a project plan
- Issue request for proposal
- Conduct vendor conference
- Conduct vendor site visits
- Issue purchase order
- Monitor project progress
- Conduct systematic buy-off
The above table depicts the process that should be used as one proceeds with the deployment of a machine vision system that is uniquely defined for a company. In Part 2 we covered how to assess an application's requirements, with ideas of what should be included in a functional specification. In Part 3 we covered how to get information on machine vision, defining project responsibility, writing a project plan, completing a bid document, and conducting a vendor conference. In this article we cover the remaining steps: evaluating proposals, conducting vendor site visits, issuing the purchase order, and monitoring project progress.
Upon receipt and review of proposals, where necessary, clarification of details, especially disparities, should be obtained from the vendors. A technical evaluation should be conducted to narrow the field to two, possibly three, vendors. The evaluation should include an assessment of the feasibility of the proposed approach. This should include an assessment that the application and the environment are understood.
From the perspective of machine vision equipment, the following are considerations:
- Is the system's resolution adequate?
- Are the hardware and software suitable to conduct the analysis required?
- Is processing speed adequate for throughput?
- How is the application developed?
- Are the principles behind the algorithms consistent with the requirements of the application? Do they make sense for the application?
- How robust are the proposed algorithms?
- What is involved in changeover from batch to batch?
  - Train by showing
  - Panel buttons
  - Pendant (hand-held terminal)
  - Light pen, touch screen, or cursor menu
  - CRT terminal, off-line programming
  - Is language knowledge (C++, Visual Basic, etc.) required, or is programming transparent to the user?
- How do operators, engineering, and maintenance each interface with the system?
  - Panel buttons
  - Light pen or touch screen
  - CRT terminal or cursor commands
- Is the system secure? How is access restricted?
- Application program storage, and provision for data storage in the event of a power outage
- Adequacy of interfaces, I/O, etc.
- System diagnostics
- Model changeover times
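Two of the questions above, resolution and processing speed, lend themselves to quick back-of-envelope checks before any vendor discussion. The sketch below illustrates the arithmetic; the field-of-view, defect size, line rate, and processing time figures are purely hypothetical, and the two-pixels-per-feature rule is a common rule of thumb, not a value from this article:

```python
# Back-of-envelope feasibility checks for sensor resolution and throughput.
# All numeric inputs below are illustrative assumptions.

def required_pixels(field_of_view_mm, smallest_feature_mm, pixels_per_feature=2):
    """Pixels needed across one axis so the smallest feature of interest
    spans at least `pixels_per_feature` pixels (a common rule of thumb)."""
    return field_of_view_mm / smallest_feature_mm * pixels_per_feature

def throughput_ok(parts_per_minute, processing_ms_per_part):
    """True if the per-part processing time fits within the line rate."""
    budget_ms = 60_000 / parts_per_minute   # time available per part
    return processing_ms_per_part <= budget_ms

# Hypothetical example: 100 mm field of view, 0.2 mm smallest defect,
# 120 parts per minute, 400 ms of processing per part.
print(required_pixels(100, 0.2))   # pixels needed across the field of view
print(throughput_ok(120, 400))     # does processing fit the cycle time?
```

If the required pixel count exceeds what a proposed camera offers, or the processing time exceeds the cycle budget, the vendor's approach needs revisiting before the evaluation goes further.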
In addition to cost and delivery, other considerations in the vendor evaluation should revolve around acceptance testing and policies: training, maintenance, spare parts, field service, warranty, and documentation. What terms and conditions for acceptance should a vendor be prepared to adhere to? Demonstration and prove out testing at the vendor facility? At plant site? Off-line? On-line? The actual acceptance test plan should be mutually developed and should include as comprehensive a test as possible, including a way of addressing variables such as part positioning, lighting, and environmental variations.
Who is responsible for installation? Are there specific maintenance requirements - hardware, software? What level of sophistication is required to replace lights, adjust cameras, calibrate, reconfigure I/O, or do basic troubleshooting? How much training is included with the purchase order and at what level? What material is covered in the training? How much is hands-on? How long is the training? Where will it be conducted? How often and what type of training classes are available? At what cost for additional training?
What spares should be inventoried, even delivered, with the system? What are the costs? Availability? Is there a basis for meantime between failures? Meantime to repair? How 'local' is field service? Spares? Is 24-hr service provided? What are the normal service response times? Service costs? Maintenance contracts available? What is the standard warranty? (Parts and labor?) Is an extended warranty available? At what cost and terms?
What documentation is provided? Is application program source code included? Schematics? What is the quality of the documentation and how complete is it? How many copies are provided with the system? How application-specific is the documentation? How well is the software documented within the vendor's facility?
The company itself should be evaluated. What is the company's overall experience? How many systems have they installed? In your industry? Related to the specific application? What is the experience of staff in terms of installations and application? Company data should include the following: time in business, time in machine vision business, number of people by skills, and financial situation.
A decision matrix (Table 1) is a useful tool for evaluating proposals because it reduces subjectivity in the decision-making process. A typical matrix lists the vendors across the top, with the criteria in the left column. The criteria include pricing, system properties, and vendor characteristics as determined from the proposals. On a scale of 0-10, each criterion is rated as to its relative importance, 0 being the least important and 10 the most important. These relative weights for each of the factors are independent of the vendors' performance and should be established even before the proposals are received.
In the columns under each vendor, on a scale of 1-10, evaluate how the performance to the criteria relates to the need or to each other, whichever is appropriate. Multiply this value by the criteria weight and place this value in the appropriate column.
Sum the scores of the criteria for each vendor and subtotal the results.
These subtotals give an indication of the relative capabilities of the vendors evaluated. They do not, however, indicate whether a vendor can perform the specific application.
The decision matrix should include specification criteria. In the case of wants, the rating of 1-10 can be assigned in accordance with how closely a vendor satisfies the want. A subtotal associated with nonspecification criteria and wants should be determined as the addition for each vendor of all of the factored ratings in the column.
Below the subtotal, all specification criteria that are needs should be listed. In these cases the issue is not degree of responsiveness but rather yes-no or can-cannot perform factors. The ratings, therefore, should be either 0 or 1 (1 for can/yes answers). Each of these factors would then be multiplied by the subtotal. If a vendor cannot satisfy a need specification and receives a 0, its total score will be 0, and the vendor should be disqualified from the application.
Technology criteria may also be included in the decision matrix. A value might be assigned to binary, intensity-contrast, or gray scale-edge systems, correlation, geometric, etc. There are many technologies that can be quoted. Some may just do the job; some may be an overkill but may be more flexible (a consideration possibly for future requirements), and some may be just perfect for the requirement at hand. Most often the technology should be rated relative to the specific requirements.
Having concluded this procedure, the next step is to assign a confidence factor reflecting the confidence that the vendor can meet each of the specification needs and perform as specified. Confidence may be dictated by the extent of experience the vendor indicates in the proposal, especially relevant experience. This confidence is then multiplied by the previous subtotal to achieve a figure of merit for each vendor for the application being quoted. This figure should narrow the field to the two or three vendors that should be visited.
Table 1 is an example of a decision matrix for the evaluation of three vendor offerings. The first part evaluates the company, policies, and so on, generally. The second part is an example of the technical aspects that could be important. These will vary depending on the application. Needs and wants are evaluated differently. Needs must be satisfied so they either have a 0 or a 1 weighting. This is simply multiplied by the cumulative score so far. The confidence factor is based on the cumulative 'gut feeling' for a vendor based on dealings with representatives, thoroughness of response, and so on. In our example, the result of this analysis would lead us to favor vendor A with a contract.
Before this is cast in concrete, however, it would be in order to review with vendor B why it was not responsive to the need to satisfy our escape rate criteria (in this example). Similarly, where vendors A and B do not compare favorably, it is appropriate to review those issues with the respective vendors to make certain there were no inadvertent oversights in the preparation of the proposals. In any event, using such a tool will reduce the subjectivity associated with evaluating bids and improve the probability of selecting the most qualified vendor.
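The scoring procedure described above can be sketched in a few lines of code. The vendor names, criteria, weights, ratings, the escape-rate need, and the confidence values below are all hypothetical placeholders, chosen only to mirror the narrative in which vendor B fails a need criterion:

```python
# Minimal sketch of the decision-matrix scoring described above.
# All names, weights, ratings, and confidence values are hypothetical.

criteria_weights = {"price": 8, "experience": 6, "documentation": 4}  # importance, 0-10
ratings = {  # vendor performance per criterion, 1-10
    "Vendor A": {"price": 7, "experience": 9, "documentation": 8},
    "Vendor B": {"price": 9, "experience": 6, "documentation": 7},
}
needs = {  # specification needs: 1 = can satisfy, 0 = cannot
    "Vendor A": {"escape_rate": 1},
    "Vendor B": {"escape_rate": 0},
}
confidence = {"Vendor A": 0.9, "Vendor B": 0.8}  # 0-1 'gut feeling' factor

def figure_of_merit(vendor):
    # Weighted sum of ratings gives the subtotal for wants.
    subtotal = sum(criteria_weights[c] * ratings[vendor][c] for c in criteria_weights)
    # Any unmet need (rated 0) zeroes the score and disqualifies the vendor.
    for satisfied in needs[vendor].values():
        subtotal *= satisfied
    # Multiply by the confidence factor to get the final figure of merit.
    return subtotal * confidence[vendor]

for vendor in ratings:
    print(vendor, figure_of_merit(vendor))
```

Note how the multiplicative treatment of needs enforces the disqualification rule: vendor B scores 0 here regardless of how well it rated on the wants, which is exactly the behavior the text calls for.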
Vendor Site Visits
These visits provide an opportunity to view any concept-proving demonstrations possible on existing equipment. In addition, these visits allow one to assess the following about a vendor:
- Technical resources (optics, television, computer, mechanical)
- Technology, design, philosophy, product line breadth
- Understanding of application
- Capital and human resources to support and service installation
- Physical facilities
- Business philosophy with respect to:
  - Installation support
  - Project schedules and possible conflicts with other projects
  - Financial stability and staying power
- Review experience and obtain references associated with similar installations:
  - Quality of work
  - Knowledge of business
- Review typical documentation
- Quality control practices, product burn-in procedures, etc.
Another decision matrix might be in order to evaluate systematically the vendors based on the site visit. From this analysis a vendor of choice should be determined. At that time the references should be contacted. If possible, a visit should be scheduled to the installation site. In any event it is important to assess the reference's opinion of the vendor:
- Quality of work
- Ability to meet schedules
The leading question should be: If they had to do it all over, what would they do differently?
The final consideration with respect to a vendor revolves around the number of different machine vision systems that may ultimately be installed at a facility. Will they come from so many different vendors that the facility will experience difficulty servicing and maintaining spares?
Having made a decision on a vendor, it is now possible to fully assess the project's cost and to conduct a return-on-investment (ROI) analysis. An example of such an analysis can be found in an earlier article posted on machinevisiononline.org.
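A simple payback-period calculation conveys the shape of such an ROI analysis. The cost and savings figures below are hypothetical placeholders, not values from the article or the referenced example:

```python
# Simple payback-period sketch for a machine vision ROI analysis.
# All dollar figures are hypothetical placeholders.

system_cost = 120_000            # purchase plus installation, USD
annual_savings = 80_000          # scrap reduction, labor, avoided escape costs
annual_operating_cost = 15_000   # maintenance, spares, training refreshers

net_annual_benefit = annual_savings - annual_operating_cost
payback_years = system_cost / net_annual_benefit
print(f"Payback period: {payback_years:.2f} years")
```

A fuller analysis would discount future cash flows and account for model changeover and downtime costs, but even this level of arithmetic is enough to sanity-check a project before the purchase order is issued.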
Issuing Purchase Orders
A contract should be issued that includes all the details associated with the project reviewed in the RFP and should include a 'buy-off' procedure.
A payment schedule should be considered with a reasonable deposit up front (up to 40%) to cover application engineering costs that are unique to the project. Payment based on well-defined milestones could be a reasonable alternative. However, enough should be retained to make sure the vendor's interest is sustained so the project is completed in a timely manner. For example, a comprehensive concept-proving demonstration could be the first milestone.
Where an application solution is not 'off the shelf,' a phased study plan or development contract should be considered. In this case, one should avoid developing a new technique or at least be aware of the risks. It should be understood that there is a distinction between 'knowledge' and 'application.' Moving knowledge into applications involves risk that both the buyer and seller must understand. In other words, phased procurements are appropriate in these cases.
If dealing with proven techniques, however, a more conventional contract is appropriate. In this case the responsibilities of both the seller and buyer should be spelled out, especially with respect to installation and support: training, service, etc.
Monitor Project Progress
Do not just anticipate issuing a purchase order with specifications. One should be prepared for sustained guidance, recognizing that the adoption of new technology requires a 'partnership,' a cooperative arrangement between vendor and user. Assure that project benchmarks are being met by vendor site visits and project reports. Establish a realistic schedule. See that the vendor assigns someone with project responsibility. Avoid scheduling invention. In a vendor, bigger is not necessarily better. Responsiveness is a better criterion for vendor selection. Check vendor references.
Make certain both user and vendors understand the differences associated with the requirement - if development is required or application can be satisfied by a standard product. Similarly, the extent of the custom content or applications engineering should be understood.
Where risk is involved, consider a phased procurement cycle with the first-phase concept proving demonstrations and, subsequently, breadboard and first-piece demonstrations.
Avoid 'creeping' expectations as the project evolves, and when they occur, recognize the impact on the vendor in terms of design changes and schedule.
This article has been adapted from the chapter of the same title in the book 'Understanding and Applying Machine Vision,' published by Marcel Dekker, January 2000.