Abstract
Product quality inspection is an important stage in every production route, in which the quality of the produced goods is estimated and compared with the desired specifications. Traditional inspection relies on manual methods that generate considerable costs and consume large amounts of time. In contrast, today’s inspection systems that use modern techniques such as computer vision are more accurate and efficient. However, the amount of work needed to build a computer vision system based on classic techniques is relatively large, because features must be manually selected and extracted from digital images, which also produces labor costs for the system engineers.
In this research, we present an approach based on convolutional neural networks to design a quality inspection system with a high level of accuracy and low cost. The system is designed using transfer learning to reuse layers from a previously trained model, followed by a fully connected neural network that classifies the product’s condition as healthy or damaged. Helical gears were used as the inspected object, and three cameras with differing resolutions were used to evaluate the system on color and grayscale images. Experimental results showed high accuracy with color images and even higher accuracy with grayscale images at every resolution, demonstrating that an inspection system can be built at low cost, with ease of construction and automatic extraction of image features.
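To make the described architecture concrete, the following is a minimal sketch of the general transfer-learning setup the abstract outlines: a pre-trained convolutional base with its layers frozen, topped by a fully connected head for binary (healthy/damaged) classification. It assumes TensorFlow/Keras; the choice of base model (VGG16), input size, and layer widths are illustrative assumptions and not necessarily the authors’ exact configuration.

```python
# Sketch only: frozen pre-trained convolutional base + fully connected classifier
# for binary healthy/damaged inspection. Base model, input size, and layer sizes
# are assumptions; the paper does not fix them in the abstract.
import tensorflow as tf
from tensorflow.keras import layers, models

base = tf.keras.applications.VGG16(
    weights="imagenet",        # layers transferred from a previously trained model
    include_top=False,         # drop the original ImageNet classifier
    input_shape=(224, 224, 3), # grayscale frames would be replicated to 3 channels
)
base.trainable = False         # keep the transferred convolutional layers fixed

model = models.Sequential([
    base,
    layers.Flatten(),
    layers.Dense(256, activation="relu"),   # fully connected classification head
    layers.Dropout(0.5),
    layers.Dense(1, activation="sigmoid"),  # healthy vs. damaged
])

model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
# model.fit(train_images, train_labels, validation_data=(val_images, val_labels))
```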
(Received 5 October 2020; accepted 6 December 2020)