STIHL sought a new solution, one that would replace the human element with machine vision based on deep learning, automating the quality-assurance assessment to cut costs and save time. “When we began the process of considering a machine vision solution, every gasoline suction head was being checked by a human,” says Fromm. “However, the parts are very small and the error features are quite hard to detect, so we determined a need to deploy machine vision in the inspection process.” In this context, a slip is a defective part erroneously classified as good, so the slip rate measures how often bad parts get through; the hit rate is the system’s overall classification accuracy.
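To make those two metrics concrete, the short sketch below (illustrative only, not STIHL’s code; the function name and labels are assumptions) computes a hit rate and a slip rate for a batch of inspection results.

    # Illustrative only: computing hit rate and slip rate for a batch of inspected parts.
    def inspection_metrics(true_labels, predicted_labels):
        """Return (hit_rate, slip_rate); labels are the strings "good" or "bad"."""
        total = len(true_labels)
        hits = sum(t == p for t, p in zip(true_labels, predicted_labels))
        # A "slip" is a bad part that the system wrongly accepts as good.
        slips = sum(t == "bad" and p == "good"
                    for t, p in zip(true_labels, predicted_labels))
        bad_total = sum(t == "bad" for t in true_labels)
        hit_rate = hits / total if total else 0.0
        slip_rate = slips / bad_total if bad_total else 0.0
        return hit_rate, slip_rate

    # Example: 10 parts, two defective, one defect slips through.
    truth = ["good"] * 8 + ["bad", "bad"]
    preds = ["good"] * 8 + ["good", "bad"]
    print(inspection_metrics(truth, preds))  # -> (0.9, 0.5)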
“We have been working with Matrox® Imaging since 2016,” Fromm continues, “when STIHL forged a relationship with Rauscher GmbH, a key Matrox Imaging and machine vision component provider in Germany, after meeting at a trade fair. We appreciate being able to get both hardware and software from a single supplier, as that has been instrumental in getting our systems up and running quickly. It’s because of our positive history with Rauscher GmbH that STIHL sought the expertise of Matrox Imaging for the development of this new system.”
Going deeper with deep learning
“Inspecting each part involves examining four distinct footbridges, and the machine processes 60 parts per minute. The inspection therefore occurs at a rate of 240 images per minute,” Fromm outlines. Conventional image-processing tools had been used to evaluate the parts; deep learning extends those capabilities in cases where conventional processing produces inconclusive results due to high natural variability. “STIHL determined that rule-based image processing is not appropriate, because the component images vary too much and the error rate is too high, even at hit rates ranging from 80% to 95%,” Fromm concludes. “The new system would thus be required to yield fewer slips and a higher hit rate. Using Matrox Imaging’s classification steps yielded a hit rate of 99.5%, a tremendous improvement.”
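As a rough plausibility check of those figures, the following sketch (plain arithmetic, not any Matrox tool) works out the image throughput and the number of misclassified images per hour implied by each hit rate.

    # Back-of-the-envelope arithmetic based on the figures quoted above.
    PARTS_PER_MINUTE = 60
    IMAGES_PER_PART = 4                                      # four footbridges per part
    images_per_minute = PARTS_PER_MINUTE * IMAGES_PER_PART   # 240 images/minute
    images_per_hour = images_per_minute * 60                 # 14,400 images/hour

    for hit_rate in (0.80, 0.95, 0.995):
        misclassified = images_per_hour * (1 - hit_rate)
        print(f"hit rate {hit_rate:.1%}: ~{misclassified:,.0f} misclassified images/hour")
    # hit rate 80.0%: ~2,880 misclassified images/hour
    # hit rate 95.0%: ~720 misclassified images/hour
    # hit rate 99.5%: ~72 misclassified images/hour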
STIHL’s new vision system comprises Matrox Design Assistant X vision software running on a Matrox 4Sight GPm vision controller, selected for its I/O capabilities, PROFINET® connectivity, and Power-over-Ethernet (PoE) support. The system also includes a PoE line-scan camera, a rotary table, an encoder, and ultra-high-intensity line lights (LL230 Series) from Advanced illumination.
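For reference, the hardware and its roles can be summarized as a simple inventory. The sketch below is a hypothetical Python structure whose field names and role comments are inferred from the article; it is not an actual Matrox Design Assistant configuration.

    # Hypothetical inventory of the inspection cell; names and roles are illustrative.
    VISION_CELL = {
        "controller": {
            "model": "Matrox 4Sight GPm",
            "software": "Matrox Design Assistant X",
            "interfaces": ["digital I/O", "PROFINET", "Power-over-Ethernet"],
        },
        "camera": {"type": "line-scan", "link": "PoE"},      # powered over the data cable
        "motion": {
            "rotary_table": True,                            # presents each part to the camera
            "encoder": True,                                 # paces line-scan acquisition
        },
        "lighting": "Advanced illumination LL230 Series ultra-high-intensity line lights",
    }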
Development and deployment of STIHL’s new vision system brought together vision experts from STIHL’s team, members of Rauscher GmbH’s applications team, and several machine-vision specialists from the Matrox Vision Squad.
Good, not good, and where the differences lie
Effective training of a neural network is not a trivial task: images must be adequate in number, appropriately labeled, and representative of the expected application variation, and they must be captured on a setup that yields repeatable imaging conditions. With this in mind, the team at STIHL engaged Matrox Imaging’s vision experts to undertake the training of the convolutional neural network (CNN) on their behalf.
Fromm describes the collection of images as representing “a plastic part with a fabric seam, photographed from the inside. In the images, only the footbridge itself contains important information; everything else is irrelevant. To prepare the dataset, therefore, each footbridge is extracted from the overall image and sorted into folders classed as ‘good (IO)’ and ‘not good (NIO)’. The team at STIHL manually labeled 2,000 representative parts, each with four images, for a total dataset of 8,000 images. Without the guidance of Matrox Imaging’s engineering team, this level of intricacy would have been exceptionally challenging.”
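The folder structure Fromm describes maps naturally onto a small preprocessing script. The sketch below, assuming the Pillow library is available, shows one possible way to crop each footbridge region out of a part image and file it under an ‘IO’ or ‘NIO’ class folder; the ROI coordinates, file paths, and label source are hypothetical, and the actual work was carried out with Matrox Imaging’s tools and guidance.

    # Illustrative dataset-preparation sketch; ROIs, paths, and labels are hypothetical.
    from pathlib import Path
    from PIL import Image

    # Hypothetical bounding boxes (left, upper, right, lower) of the four footbridges.
    FOOTBRIDGE_ROIS = [
        (100, 50, 300, 250),
        (400, 50, 600, 250),
        (100, 350, 300, 550),
        (400, 350, 600, 550),
    ]

    def prepare_dataset(raw_dir: Path, out_dir: Path, labels: dict) -> None:
        """Crop every footbridge out of each part image and sort the crops into
        'IO' (good) and 'NIO' (not good) class folders, as described above."""
        for part_image in sorted(raw_dir.glob("*.png")):
            part_labels = labels[part_image.stem]      # one "good"/"bad" label per footbridge
            image = Image.open(part_image)
            for idx, (roi, label) in enumerate(zip(FOOTBRIDGE_ROIS, part_labels)):
                class_dir = out_dir / ("IO" if label == "good" else "NIO")
                class_dir.mkdir(parents=True, exist_ok=True)
                image.crop(roi).save(class_dir / f"{part_image.stem}_footbridge{idx}.png")

    # Run over 2,000 labeled parts, 4 crops each: the 8,000-image dataset mentioned above.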