sensors | Article

A Bevel Gear Quality Inspection System Based on Multi-Camera Vision Technology

Ruiling Liu *, Dexing Zhong, Hongqiang Lyu and Jiuqiang Han
School of Electronic and Information Engineering, Xi'an Jiaotong University, Xi'an 710049, China; bell@xjtu.edu.cn (D.Z.); hongqianglv@xjtu.edu.cn (H.L.); jqhan@xjtu.edu.cn (J.H.)
* Correspondence: meggie@xjtu.edu.cn; Tel.: +86-29-8266-8665 (ext. 175)
Academic Editor: Gonzalo Pajares Martinsanz
Received: 6 July 2016; Accepted: 23 August 2016; Published: 25 August 2016

Abstract: Manual surface defect detection and dimension measurement of automotive bevel gears are costly, inefficient, slow and inaccurate. To solve these problems, a synthetic bevel gear quality inspection system based on multi-camera vision technology is developed. The system detects surface defects and measures gear dimensions simultaneously. Three efficient algorithms, named Neighborhood Average Difference (NAD), Circle Approximation Method (CAM) and Fast Rotation-Position (FRP), are proposed. The system can detect knock damage, cracks, scratches, dents, gibbosity, repeated cutting of the spline, etc. The smallest detectable defect is 0.4 mm × 0.4 mm and the precision of dimension measurement is about 40–50 µm. One inspection process takes no more than 1.3 s. Both the precision and the speed meet the requirements of real-time online inspection in bevel gear production.

Keywords: multi-camera vision; bevel gear; defect detection; dimension measurement

1. Introduction

The bevel gear, also called the automobile shift gear, is an important part of an automobile transmission. Its quality directly affects the transmission system and the running state of the whole vehicle [1]. According to statistics, about one third of automobile chassis faults are transmission faults, in which gear problems account for the largest proportion [2].
Although the bevel gear's design, material and manufacturing technique are quite mature and standard, in today's high-speed production lines there are still inevitable defects such as dimension deviations, surface scratches, crushing, dents or insufficient filling. Defective gears not only reduce the performance of the car, but also cause security risks and potentially huge losses to manufacturers [3]. Figure 1a shows several kinds of bevel gears. The shape and structure of bevel gears are complicated and the types of defects are various, so bevel gear quality inspection is very difficult. At present it is mainly completed manually, which is tedious and inefficient, and makes it hard to guarantee quality stability and consistency. Recently the Osaka Seimitsu Precision Machinery Corporation in Japan developed a high-precision gear measuring instrument, shown in Figure 1b, but its inspection speed is very slow and cannot meet the requirements of high-speed production lines. Gear defect detection technology based on machine vision exists, but it is mainly used for inspecting broken teeth and flat surfaces of general gears [4]; there is no precedent research for bevel gears with complicated shape and structure [5]. Because of the multiple measurement tasks and diverse defects, a traditional single-vision system cannot obtain enough information. To measure the multiple dimensions quickly and accurately, and to detect and classify the defects completely and precisely, a more efficient vision system is needed. Machine vision is such an advanced quality inspection technology, which uses intelligent cameras and computers to detect, identify, judge and measure objects [6–14].
To solve the above problems, this paper develops a multi-vision bevel gear quality inspection device which combines defect detection and dimension measurement in one system. Three efficient image processing algorithms named Neighborhood Average Difference (NAD), Circle Approximation Method (CAM) and Fast Rotation-Position (FRP) are proposed. The system successfully solves the problem of bevel gear defect detection and measurement at high speed and accuracy. One can find a short Chinese article describing this system in [15].

Sensors 2016, 16, 1364; doi:10.3390/s16091364; www.mdpi.com/journal/sensors

Figure 1. (a) Samples of bevel gear; (b) Gear measuring instrument of Osaka Seimitsu Precision Machinery Corporation.

2. System Design
2.1. Inspection Project and Requirement Analysis

The defect detection and dimension measurement system developed in this paper is designed for two specific types of bevel gears, A4 and F6, as shown in Figure 2.

Figure 2. Samples of gear A4 and F6: (a) Top surface of gear A4; (b) Bottom surface of gear A4; (c) Top surface of gear F6; (d) Bottom surface of gear F6.

The requirements of defect detection and measurement are as follows:

(1) Defect detection of the tooth end surface, spherical surface and mounting surface, including knock damage, cracks, insufficient filling, folding, scratches, dents or gibbosity, repeated spline broaching, etc., as shown in Figure 3. Defects greater than 0.5 mm × 0.5 mm should be detected.
(2) Measurement of the bevel gear's height, the diameters of two circumcircles, the spline's big and small diameters, and the end rabbet's diameter. Measurement accuracy is 80–100 µm.
(3) Defect detection speed is about 1 s for each surface.

2.2. Hardware Design

Bevel gears have many teeth and each tooth has different lateral surfaces. For example, gear A4 has 13 teeth and 26 surfaces; gear F6 has 14 teeth and 28 surfaces. The defects are diverse. How to realize real-time online inspection of all the teeth and surfaces is a big challenge. One approach is using a single camera and a rotational shooting mode, but it requires a high-precision rotary positioning system, and the inspection speed is too low to meet the requirements.
Because the bevel gear's teeth surfaces are uniformly distributed and rotationally symmetric, we propose a solution which completes image acquisition and processing synchronously by using multiple high-speed cameras. The speed is an order of magnitude faster than a single-camera rotary positioning system.

Figure 3. Examples of defects on bevel gear surface.

Figure 4 shows the total plan of our inspection system. Nine high-speed USB cameras are used to capture different parts of the bevel gear simultaneously. The top camera is the core of the whole image acquisition system. It undertakes the positioning of the bevel gear, and the defect detection and measurement of the top and bottom surfaces.

Figure 4. Diagram of bevel gear inspection system.

Seven side-view cameras are emplaced evenly around the bevel gear, with a 45° tilt angle with respect to the horizontal plane and perpendicular to the gear tooth groove, detecting the lateral teeth surfaces omni-directionally. In addition, one more camera with a horizontal view is located on the platform to measure the bevel gear's height. After the gear is loaded, the operator gives a command and all of the cameras acquire images at the same time. An industrial computer receives and processes the images one by one and gives control signals according to the detected results. To meet the different accuracy requirements of bevel gear defect detection, three types of industrial cameras are adopted in this system: 3 megapixels, 2 megapixels and 1.3 megapixels, respectively. We can calculate that one pixel represents 40–50 µm according to the gear's actual size and image size. That is to say, the accuracy of the bevel gear's height and diameter measurements is 40–50 µm.

The system hardware includes four modules: the camera module, the control and process module, the I/O module and the gear rotary module, as shown in Figure 5. The camera module consists of nine high-speed industrial cameras and a top ring light. The ring light illuminates the whole system.
The cameras capture every part of the bevel gear. The control and processing module is responsible for image processing and the output of instructions and inspection results. The I/O module includes the monitor, keyboard, mouse and pedal. The monitor displays the results. The software is operated by mouse and keyboard. The foot pedal starts the inspection of each surface; a software start button and an automatic trigger are also provided in this system. The gear rotation module is made up of a stepping motor and a reducer. It rotates the bevel gear precisely in the defect detection of special parts and in the template acquisition section.

Figure 5. Hardware structure of the bevel gear inspection system.

Figure 6 shows the prototype of this multi-camera bevel gear quality inspection system. In order to adapt to various inspection tasks and objects, a series of position-adjusting elements are designed. The height of the top camera and illumination can be adjusted. Each camera can be adjusted slightly in the horizontal direction. For the side-view cameras, the tilt angle and the distance to the gear are adjustable in a large range, so the system can detect and measure the vast majority of bevel gears and other gears or parts of similar structure.

Figure 6. Prototype of multi-camera bevel gear inspection system.

2.3. Software Design

Based on the above hardware, fast and multifunctional bevel gear inspection software is developed on the MFC and OpenCV platforms. The inspection includes two steps, for the top and bottom surfaces respectively. The software can automatically identify whether the current inspection is of the top or the bottom of a bevel gear, and whether the bevel gear is in the right place. Top inspection includes two sub-modules, top surface and lateral surface, each composed of size measurement and defect detection. The software flowchart is shown in Figure 7.

Figure 7. Software flowchart of bevel gear inspection system.
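The two-step flow just described, with size measurement and defect detection combined per surface, can be sketched as a small driver loop. The Python below is purely illustrative: the function names, dictionary keys and pass/fail aggregation are hypothetical stand-ins, not the system's actual MFC/OpenCV code.

```python
# Illustrative driver for the two-step inspection flow of Section 2.3.
# All names here (inspect_surface, "size_mm", ...) are hypothetical stand-ins.

def inspect_surface(image):
    """Stand-in sub-module: return (defect_list, size_measurements)."""
    return image.get("defects", []), image.get("size_mm", {})

def run_inspection(images):
    """Inspect the top and bottom surfaces and aggregate a pass/fail verdict."""
    report = {}
    for surface in ("top", "bottom"):
        defects, sizes = inspect_surface(images[surface])
        report[surface] = {"defects": defects, "sizes": sizes}
    report["result"] = ("FAIL" if any(report[s]["defects"]
                                      for s in ("top", "bottom")) else "PASS")
    return report

demo = {
    "top":    {"defects": [], "size_mm": {"height_mm": 25.02}},
    "bottom": {"defects": ["scratch"], "size_mm": {}},
}
print(run_inspection(demo)["result"])   # prints "FAIL": the scratch fails the part
```

In the real system each `inspect_surface` call would dispatch the camera images to the measurement and defect-detection threads; the dictionary records above merely stand in for those results.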
For a new inspection task, the software runs in the order of top → lateral → bottom surface inspection. If any fault is found, the program marks it at once, outputs the defect and size measurement results, and then terminates. The software of the bevel gear inspection system employs a multi-thread technique. It includes four modules: defect detection, equipment calibration, template acquisition and parameter setting. The equipment calibration module calibrates the positions, apertures and focuses of the nine cameras. The template acquisition module completes the sample collection and size calibration of a standard gear. The parameter setting module adjusts the related parameters for different kinds of inspection tasks. The defect detection module marks the faults in the gear image and shows all the size and statistical data. Figure 8 shows the appearance of the software.

Figure 8. Soft interface of bevel gear inspection system: (a) Defect detection; (b) Equipment calibration; (c) Template acquisition; (d) Parameter setting.

3. Key Algorithms in Defect Detection

Because the defects of bevel gears are complex and various, and the inspection time is short, general algorithms cannot meet the requirements. In this system, we use multi-threading technology and propose several efficient image processing algorithms: NAD, CAM and FRP. The following introduces these key algorithms in detail.

3.1. Neighborhood Average Difference Method

From the point of view of texture, all kinds of defects appear as waves, twists, undulations or roughness of the surface, thus defects can be detected by analyzing gear texture. Texture description operators based on gray-scale histograms include the average, standard deviation, third-order moment, smoothness, consistency and entropy, etc. A large number of experiments
A large number of experiments consistency and entropy, etc. A large number of experiments show that the combination of third order show thatand thesmoothness combination third order gear moment and smoothness do best in bevelare gear defect moment doofbest in bevel defect detection, but these operators sensitive detection, but these operators are sensitive to outside interference. The threshold setting is complicated and based on massive numbers of experiments and the results cannot show the Sensors 2016, 16, 1364 7 of 17 to outside interference. The threshold setting is complicated and based on massive numbers of experiments and the results cannot show the characteristics of defects directly and comprehensively (such as area). In our system, we present an algorithm called Neighborhood Average Difference (NAD), which can extract the defects entirely and define larger ones as faults, ignoring tiny ones. The steps are as follows: Sensors 2016, 16, 1364 Step 1: Step 2: Step 3: Step 4: Step 5: 7 of 17 Count the number of nonzero pixels in a neighborhood, denoted by N, given the characteristics of defects comprehensively (such as area). In our system, we present an neighborhood rangedirectly is L ×and L pixels, here L = 30. algorithm called Neighborhood Average Difference (NAD), which can extract the defects entirely Calculate the sum of gray values in the neighborhood: S = ∑iL=−01 ∑ Lj=−01 p (i, j), in which p(i,j) and define larger ones as faults, ignoring tiny ones. The steps are as follows: is the pixel value of point (i,j). Step 1: Count the number of nonzero pixels in a neighborhood, denoted by N, given the Calculate the average pixel value in a neighborhood: M = S/N. neighborhood range is L × L pixels, here L = 30. For pointthe p(i,j) inofthe neighborhood, if |p(i,j) − M| = > δ, then p(i,j) the point is ∑ δ is∑a threshold, Step 2: each Calculate sum gray values in the neighborhood: ( , ), in which marked by setting p(i,j) = 255. 
is the pixel value of point (i,j) . Step 3: the Calculate the average value in a neighborhood: M connected = S/N. Scan marked points pixel in the image. If they are in region and the size of the Step 4: For each point p(i,j) in the neighborhood, if p(i,j) – M > δ, δ is aSingle threshold, then the pointand tiny region is greater than a threshold, then the region is a defect. marked points is marked by setting p(i,j) = 255. connected regions are ignored. Step 5: Scan the marked points in the image. If they are in connected region and the size of the region is greater than a threshold, then the region is a defect. Single marked points and In addition, interference such as the impurities, dust and dirt on the surface can be removed tiny connected regions are ignored. partly by image dilation and erosion processing. In addition, interference such as the impurities, dust and dirt on the surface can be removed partly by image dilation and erosion processing. 3.2. Circle Approximation Method 3.2. Circle Approximation Method Circle detection is a classical problem in image processing [16–19]. To detect the actual and virtual circles on the bevel gear fast accurately, named[16–19]. CAM isTopresented. The main Circle detection is aand classical problemaninalgorithm image processing detect the actual and idea is: on the bevelcenter gear fast and accurately, algorithm CAMit is presented. Draw avirtual circle circles with initialized and radius, thenan expands ornamed contracts slowly and The moves the main idea is: Draw a circle with initialized center and radius, then expands or contracts it slowly center continuously until it is tangent to the circle to be detected. Then keep adjusting its radius and and moves the center continuously until it is tangent to the circle to be detected. Then keep center on condition that they remain tangent. The drawn circle will move closer and closer to the circle adjusting its radius and center on condition that they remain tangent. 
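Before turning to the circle-fitting details, the NAD procedure of Section 3.1 can be made concrete. The numpy sketch below is illustrative only: it implements Steps 1–4 for a single L × L neighborhood and omits Step 5's connected-region filtering and the morphological cleanup; the function name and the synthetic test patch are our own, not the system's production code.

```python
import numpy as np

def nad_mark(block, delta):
    """Mark defect candidates in one L x L neighborhood (NAD Steps 1-4).

    block: 2-D uint8 array of gray values (zeros count as background).
    delta: gray-level threshold for deviation from the neighborhood mean.
    Returns a copy with candidate pixels set to 255.
    """
    block = block.astype(np.int32)
    n = np.count_nonzero(block)              # Step 1: nonzero pixel count N
    if n == 0:
        return block.astype(np.uint8)
    s = block.sum()                          # Step 2: gray-value sum S
    m = s / n                                # Step 3: neighborhood average M = S/N
    marked = block.copy()
    marked[np.abs(block - m) > delta] = 255  # Step 4: |p(i,j) - M| > delta -> mark
    return marked.astype(np.uint8)

# A flat gray 30 x 30 patch (L = 30) with one bright 2 x 2 blemish:
patch = np.full((30, 30), 100, dtype=np.uint8)
patch[10:12, 10:12] = 180
out = nad_mark(patch, delta=40)
print(int((out == 255).sum()))   # prints 4: only the blemish pixels are marked
```

Step 5 (connected-region size filtering) and the dilation/erosion cleanup would follow on the full marked image, e.g. with standard connected-component labeling.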
The drawn circle will move to be detected until they match perfectly. Figure 9 shows this process. closer and closer to the circle to be detected until they match perfectly. Figure 9 shows this process. Figure 9. Circle Approximation Method: Each row shows an example of circle fitting process. The Figure 9. Circle Approximation Method: Eachcircles row shows an example ofthe circle solid circles are to be detected and the dash are regulated to match solidfitting ones. process. The solid circles are to be detected and the dash circles are regulated to match the solid ones. Circle center, radius and rotation angle step are three initial inputs of this method. The center and radius can be approximated only asking the drawn and the target circles have some common Circle center, radius and rotation angle step are three initial inputs of this method. The center and area. The more accurate the initial parameters are, the faster the method gets an idea result. It radius can be the approximated onlyinasking themeasurement, drawn and and the the target have some common area. utilizes positioning result dimension stepscircles are as follows: The more accurate the initial parameters are, the faster the method gets an idea result. It utilizes the Step 1: Extract the edge of the circle (actual or virtual) to be detected by threshold segmentation positioning result in dimension measurement, and the steps are as follows: Step 1: Step 2: Step 3: Step 4: of the original image. Store the edge in target pixel set C. Step 2: Set the initial center (x0, y0) and radius r0 of the drawn circle. Set the rotation angle step θ. Extract the edge of the circle (actual or virtual) to be detected by threshold segmentation of Step 3: Increase or decrease radius r0, r0 = r0 + δr, or r0 = r0 – δr, δr is the change value in each the original image. Store the edge in target pixel set C. adjustment. Step 4: theCalculate the pixel on the drawnr0circle with the current androtation radius. Find thestep θ. 
Set initial center (x0set , y0C’) and radius of the drawn circle.center Set the angle intersection of C and C’, record is empty, Increase or decrease radius r0 , as r0 X. = Ifr0X + δr , or return r0 = rto0 Step − δ3. , δ is the change value in r r Step 5: If X is non-empty and the number of its elements is Xnum, judge whether Xnum satisfies the each adjustment. matching condition. If satisfies, output the center and radius information, otherwise move Calculate the pixel C’neighborhoods on the drawn circle with the current the center in its set eight and find the position where Xcenter has theand leastradius. elements,Find the intersection of Creturn and C’, record record and to step 3. as X. If X is empty, return to Step 3. Sensors 2016, 16, 1364 8 of 17 Step 5: If X is non-empty and the number of its elements is Xnum , judge whether Xnum satisfies the matching condition. If satisfies, output the center and radius information, otherwise move Sensors 2016, 16, 1364 8 of 17 the center in its eight neighborhoods and find the position where X has the least elements, and return to step 3. The record parameter θ is used in calculation of set X. Figure 10 shows how it is defined. R is the radius. A and B are the starting and ending point of a rotation. Line segment L connects A and B. The parameter θ is used in calculation of set X. Figure 10 shows how it is defined. R is the For computational convenience, the center coordinate is set to (0, 0), B is set to (R, 0), A is set to (x1, radius. A and B are the starting and ending point of a rotation. Line segment L connects A and B. y1), then: For computational convenience, the center coordinate is set to (0, 0), B is set to (R, 0), A is set to (x1 , y1 ), then: x1 R cos , y1 R sin (1) x1 = Rcosθ, y1 = Rsinθ (1) L2 (R R cos )2 R2 sin2 (2) L2 = ( R − Rcosθ )2 + R2 sin2 θ (2) 2 L22) /2R22 ] θ = arccos [(2R2R− arccos[(2 L ) / 2R ] 2 Because the image is discrete, L ≥ 1, then across 2R2R−2 1 ≤ θ ≤ π. 
Because the image is discrete, L ≥ 1, then across ≤ ≤ . (3) (3) Figure 10. 10. The calculation of rotation rotation angle angle step step θθin inCAM. CAM. Figure The method method counts counts target target pixels pixels on on the the drawn drawn circle circle based based on on the the parameter parameter θ, θ, itit searches searches step step The by step like the second hand of a clock. The value of θ is closely related to the cost of computation. by step like the second hand of a clock. The value of θ is closely related to the cost of computation. Theoretically,the thegreater greaterthe theθ is, θ is, faster algorithm is, but the robustness ofalgorithm the algorithm Theoretically, thethe faster thethe algorithm is, but the robustness of the will will be decreased and the detection accuracy will be reduced at the same time, so the accuracy, detection be decreased and the detection accuracy will be reduced at the same time, so the detection accuracy, the speed execution and the robustness of the algorithm beinto all taken into the execution andspeed the robustness of the algorithm should be should all taken account toaccount set the to set step the angle θ. In addition, θ can be set dynamically. In the early of the algorithm is angle θ. In step addition, θ can be set dynamically. In early of algorithm a bigger aθ bigger is usedθto used to approach the detected circle quickly. θ becomes smaller as soon as the drawn circle touches approach the detected circle quickly. θ becomes smaller as soon as the drawn circle touches the detected the detected is then fitted in aThe small range. The algorithm efficientwith and circle which iscircle thenwhich searched and searched fitted in aand small range. algorithm is efficient andisaccurate accurate with a dynamic theta. a dynamic theta. Similar to to Hough Hough Transform, inin virtue of Similar Transform, our our method methodisisrobust robusttotoimage imagenoise noiseand anddisturbance disturbance virtue cumulative sum. 
It has a simple searching idea and only requires a preprocessing of threshold segmentation, while in the Hough Transform, the preprocessing of edge detection or skeleton extraction is very time consuming. In order to verify the detection results of our algorithm, we compare it with the Hough Transform by fitting the moon in Figure 11. The results of the two methods are almost identical and cannot be distinguished by the naked eye. Details of the comparison are shown in Table 1. The data show that CAM is about 50 times faster than the Hough Transform, with little difference in initial parameter setting. In addition, the classical Hough Transform cannot be used on high-resolution images because the parameter matrix space increases dramatically with the image size, resulting in a large amount of computation and storage space. However, our circle approximation algorithm is not restricted in this aspect.
It is simple and fast with less computation, and is very suitable for high-precision circle detection. CAM is suitable for fitting circles when a good initialization is available; the Hough Transform is suitable for fitting circles in the presence of noise and multiple models (multiple circle arcs), and a solution from the Hough Transform will generally require refinement using an iterative method like CAM. Our method is more suitable for the detection of circles which have already been positioned approximately. In practice, with the auxiliary system composed of sensors and cameras, the location of the bevel gear will not deviate greatly, and the gross location of the circle can be determined by software in advance. So our method can be successfully applied to most industrial circle detection occasions, and the detection and production efficiency can be improved significantly with its speed advantage.
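To make the storage argument concrete, the classical accumulator-based circle Hough Transform can be sketched as follows (an illustration, not the authors' implementation); the 3-D accumulator over (center, radius) is the parameter matrix whose size grows with image resolution.

```python
import numpy as np

def hough_circles(edge_mask, r_min, r_max, n_theta=90):
    """Classical accumulator-based circle Hough Transform (illustration only).
    The 3-D accumulator over (cy, cx, r) is the parameter matrix whose size
    grows with the image resolution."""
    h, w = edge_mask.shape
    radii = np.arange(r_min, r_max + 1)
    acc = np.zeros((h, w, len(radii)), dtype=np.int32)
    thetas = np.linspace(0.0, 2 * np.pi, n_theta, endpoint=False)
    ys, xs = np.nonzero(edge_mask)
    for x, y in zip(xs, ys):
        for k, r in enumerate(radii):
            # each edge point votes for every center at distance r from it
            cx = np.round(x - r * np.cos(thetas)).astype(int)
            cy = np.round(y - r * np.sin(thetas)).astype(int)
            ok = (cx >= 0) & (cx < w) & (cy >= 0) & (cy < h)
            np.add.at(acc, (cy[ok], cx[ok], k), 1)
    py, px, pk = np.unravel_index(np.argmax(acc), acc.shape)
    return int(px), int(py), int(radii[pk])
```

For a 952 × 672 image and 100 candidate radii, this accumulator alone holds about 64 million counters, which illustrates the storage cost discussed above.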
Figure 11. Moon picture (952 × 672 pixels) and circle fitting result. The red cross is the center.

Table 1. Comparison of the Circle Approximation Method with the Hough Transform in moon fitting.

Method | Min Radius/pixel | Max Radius/pixel | Center | Step Size of Angle/rad | Step Size of Radius/pixel | Time/s
Hough Transform | 150 | 250 | Null | 0.1 | 1 | 5.6623
CAM | 150 | 250 | Image center | 0.006 | 0.5 | 0.1189

Like other circle detection methods, our method can detect solid circles and arcs. But it can also detect virtual circles such as the gear spline circle, or the inscribed and circumscribed circles of a polygon. Figure 12 shows the fitting results of spline virtual circles on gear F6 by our method.
Figure 12. Fitting the spline circles by CAM: (a) Image of the spline; (b) Detection results of the inscribed and circumscribed circles of the spline (denoted by green circles).

3.3. Fast Rotation-Position

In order to carry out accurate inspection of the bevel gear, the position of the workpiece should be determined quickly after loading. This includes the horizontal, vertical and rotational position. The system uses a mechanical device to realize the horizontal and vertical positioning. Due to the complex gear structure and the short feeding time, it is difficult to achieve effective rotary positioning by hardware, so we solve this problem by an image processing method. Taking into account the center
symmetry of the bevel gear, the rotation position is divided into three steps: center rotary positioning, image acquisition of the tooth surface and rotary angle calculation:

(1) Center positioning. To cut out the image of the inner circle, first segment the top camera image with a certain threshold, then remove the interference of impurities and inherent defects by foreground and background area filtering. The circle center and diameter are extracted by area statistics and coordinate averaging. Figure 13a–c show this process.

(2) Teeth-end image extraction. Segment Figure 13a again with a smaller threshold value and cut out the entire teeth-end image.
Remove the interference of impurities and inherent defects by foreground and background area filtering. Then draw two circles with the center calculated above and proper radiuses to extract only the teeth-end surface, as shown in Figure 13d–f.

(3) Calculation of the rotation angle. In order to implement the template matching algorithm in defect detection, the original image of the bevel gear should be transformed to a standard position to extract the matched template. The following shows how to calculate the rotary angle α. For example, bevel gear A4 has 13 teeth, which are distributed evenly, so the angle between two teeth is 360°/13 = 27.69°, as shown in Figure 14.
O is the center of the gear and Pi is the geometric center of a tooth. The angle between the x-axis and the line OPi is defined as αi. For each tooth, αi is converted into the first quadrant and lies within 0°–27.69°. For gear A4 this gives α1, α2, ..., α13. We eliminate any αi that differs greatly from the others and calculate the mean of the rest, which is defined as the rotation angle α of the gear to be detected. Experiments show that the rotation-position method provides excellent accuracy: the error is about 0.005°, which meets the requirements of the gear position determination.
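The angle folding and outlier rejection in step (3) can be sketched as follows; the tooth centroids Pi are assumed to be available from the teeth-end image, and the tolerance value and the median test are our own choices.

```python
import numpy as np

def rotation_angle(tooth_centers, gear_center, n_teeth=13, tol=1.0):
    """FRP step (3): fold each tooth-centroid angle into [0, 360/n_teeth),
    drop values far from the median, and average the rest. The tolerance
    `tol` (in degrees) is an illustrative choice, not from the paper."""
    period = 360.0 / n_teeth                       # 27.69 deg for gear A4
    ox, oy = gear_center
    a = np.array([np.degrees(np.arctan2(py - oy, px - ox)) % period
                  for px, py in tooth_centers])
    keep = np.abs(a - np.median(a)) <= tol         # eliminate outlier angles
    return float(a[keep].mean())
```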
Figure 13. The image preprocessing of the FRP method: (a) Top surface image; (b) Segmentation for the inner circle (with a larger threshold); (c) Inner circle detected; (d) Segmentation for the teeth end (with a smaller threshold); (e) Result of area filtering; (f) Teeth-end image.

Figure 14. Calculation of the gear rotation angle.

3.4. Template Matching and Collection

In machine vision, we often compare an input image with a standard image to find the difference or a target, which is called template matching. It is fast and efficient. In this system we use template matching a lot to extract the area to be detected, eliminating the interference
of non-inspection areas and improving the inspection speed.

Figure 15a is the top surface image of bevel gear A4 and Figure 15b is the matching template. In Figure 15b, the values of the background and the edge of the gear are set to 150, which means they are a non-inspection area. Black, white and some other gray colors indicate different parts to be detected. The templates are collected from a standard, perfect gear. The acquisition process of the top surface template is as follows. First extract the gear's center and rotation angle, then divide the target pixels into different parts according to their distance to the center and mark them with different colors, removing background pixels completely. The templates are numbered and saved in order of rotation angle.
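The distance-based labelling described above can be sketched as follows; the function name, the ring boundaries and the gray labels are illustrative, with 150 kept as the non-inspection value used by the system.

```python
import numpy as np

def build_top_template(mask, center, rings, labels):
    """Divide target pixels into parts by their distance to the center and
    mark each part with a gray label; everything else keeps the value 150
    used by the system for non-inspection areas."""
    h, w = mask.shape
    yy, xx = np.mgrid[0:h, 0:w]
    d = np.hypot(xx - center[0], yy - center[1])
    tpl = np.full((h, w), 150, dtype=np.uint8)     # 150 = non-inspection
    for (r0, r1), v in zip(rings, labels):
        tpl[mask & (d >= r0) & (d < r1)] = v
    return tpl
```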
Figure 15. (a) Top surface of gear A4; (b) The matching template.

The lateral template is collected with the aid of a special gear painted with a black and white matted coating, shown in Figure 16a. This ensures that a perfect template can be extracted through simple image processing, shown in Figure 16b. Figure 16c is an original lateral image and Figure 16d is the teeth surface extracted by template matching.

Figure 16. The collection and use of the bevel gear lateral template: (a) Lateral image of the template gear;
(b) Extracted template; (c) Lateral image to be detected; (d) Extracted image by template matching.

In this system the template collection and processing are completed automatically without artificial participation. Templates are automatically numbered and saved in order of rotation angle and camera. After a template is collected, the gear is rotated to another angle by the step-motor and the next collection begins. The angle interval is about 0.01°–0.02°. The total template image size is as large as 25 GB; that is to say, the system achieves its ideal inspection speed by sacrificing storage space.

4. Inspection Results
Figure 17a,b are The inspection tasks and methods are different for bevel gears vary in structure. Figures 17a,b are bottom surfaces of gear F 6 and A 4 respectively. F 6 is flat and defects are obvious in the image. bottom surfaces of gear F6 and A4Arespectively. F6 is flat and defects are obvious in the image. It It is are bottom surfaces of gear F6 and 4 respectively. F6 is flat and defects are obvious in the image. It is easy to detect. A4 is spherical and the image is dark and weak in contrast. The defects are hard to is easy to detect. A4 is spherical and the image is dark and weak in contrast. The defects are hard to be found in the image, especially for dints. The inspection algorithm of A4 is much more be found in the image, especially for dints. The inspection algorithm of A4 is much more complicated. complicated. Sensors 2016, 16, 1364 12 of 17 easy to detect. A4 is spherical and the image is dark and weak in contrast. The defects are hard to be found2016, in the especially for dints. The inspection algorithm of A4 is much more complicated. Sensors 16, image, 1364 12 of 17 Sensors 2016, 16, 1364 12 of 17 (a) (a) (b) (b) Figure 17. 17. Bottom images images of FF6 and and A A4:: (a) 6; (b) Bottom of A4. Figure (a) Bottom Bottom of of F (b)Bottom Bottomof ofA A44. . Figure 17. Bottom Bottom images of of F66 Bottom of FF66;;(b) 4 (a) 4.1. Example of Defect Detection and Dimension Measurement Example of of Defect Defect Detection Detection and Dimension Measurement 4.1. Example In this section, we illustrate how the system works through a practical example, the inspection example, thethe inspection of In this section, we we illustrate illustratehow howthe thesystem systemworks worksthrough througha apractical practical example, inspection of bottom surface of bevel gear A4. bottom surface of of bevel gear A4A . 4. of bottom surface bevel gear 4.1.1. on Bottom Bottom Surface 4.1.1. Inspection Inspection of of Top-Circle Top-Circle on 4.1.1. 
The top of the back surface is a standard sharp ring in the image of a perfect A4 gear, but it is fragile and easily damaged in the production process. Defects appear as a broken ring, burrs or uneven color in the image. The system detects and marks the defects as shown in Figure 18. The following illustrates the detailed inspection process.

Figure 18. Inspection of the bottom surface of gear A4; red boxes mark defects and the green circle marks the circumcircle.

Step 1: Extract the ring and its neighborhood by threshold segmentation, and then remove small noise around the ring by an area filter; the result is shown in Figure 19a.

Step 2: Fit the ring boundaries. Generally the bevel gear is centrosymmetric, so figure out the
approximate center by coordinate averaging, then fit the inner and outer boundary circles of the ring by CAM. The red circles in Figure 19b are the fitting result. This is preparatory work for Step 3 and the following dimension measurement.

Step 3: Defect detection. The pixels of the top ring are highlighted for a qualified gear, so the dark pixels between the two red circles are marked as defects. For the inside and outside neighborhoods of the ring, defects are located by the combination of fixed and dynamic threshold segmentation. The merged result is shown as the white pixel groups in Figure 19c.
Then eliminate the additional arcs by an image open-operation and extract the bigger defects by area filtering, as shown in Figure 19d. The end result is marked by red boxes in Figure 18.

4.1.2. Inspection of the Spherical Surface

The defects on the spherical surface are not obvious in the top camera image and should be detected from the lateral camera images. Figure 20a shows such an image. The region directly facing the camera is highly lit and both sides become darker and darker. Experiments show that defects in the transition region are the most obvious, so the main inspection area is between the highlighted and the darkest regions.

Figure 19. Defect detection of the top ring on the bottom surface: (a) Top ring extraction; (b) Top ring fitting
(red circles); (c) Result of defect detection; (d) Defects after open-operation.

To extract the target region fast, the system creates a template for each lateral camera, as shown in Figure 20b. Gray indicates the inspection region and black/white is the non-inspection region. The white stripe is used for device calibration and for avoiding erroneous inspections. The whole sphere surface cannot be covered by the seven cameras at one time, so to ensure the defects can be detected at any angle, the gear is rotated 3 to 5 times by a stepper motor.
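The defects in these lateral images are then located with the NAD method. A minimal sketch of a Neighborhood Average Difference test as we read it (flagging pixels that deviate from their local neighborhood average by more than a threshold) is given below; the window size and threshold are illustrative values, not the paper's.

```python
import numpy as np

def nad_defects(img, k=5, thresh=30.0):
    """Sketch of a Neighborhood Average Difference test: flag pixels whose
    gray value differs from their k x k neighborhood average by more than
    `thresh`. Window size and threshold are illustrative; k must be odd."""
    img = img.astype(np.float64)
    pad = k // 2
    padded = np.pad(img, pad, mode='edge')
    # neighborhood average via a sliding-window view (NumPy >= 1.20)
    win = np.lib.stride_tricks.sliding_window_view(padded, (k, k))
    avg = win.mean(axis=(2, 3))
    return np.abs(img - avg) > thresh
```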
Figure 20. (a) The lateral image of the sphere surface; (b) The template of the sphere surface.

The motor rotation is time-consuming, so the motor begins to rotate just at the end of a single image acquisition. The rotation completes when the image processing finishes, then the next image
The rotation completes when the image processing finishes, and then the next image acquisition begins. Inspection time is saved greatly by this overlap. The defects are detected by the NAD method. Figure 21a shows an image with defects in it and Figure 21b shows the inspection result.

Figure 21. (a) A lateral image of the sphere surface; (b) Inspection result of Figure 21a.
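NAD (Neighborhood Average Difference) is one of the three algorithms proposed earlier in the paper; its full definition is given there. The following is only a rough sketch of the idea as the name suggests it, flagging pixels that deviate from their local neighborhood average, with an illustrative window size and threshold rather than the paper's tuned values:

```python
import numpy as np

def nad_defect_map(image, win=5, thresh=25):
    """Flag pixels whose gray value differs from the local neighborhood
    average by more than `thresh`. A sketch of the NAD idea only; the
    window includes the center pixel, and win/thresh are illustrative."""
    img = image.astype(np.float64)
    h, w = img.shape
    pad = win // 2
    padded = np.pad(img, pad, mode="edge")
    # Local mean via a sliding window (simple, unoptimized version).
    local_mean = np.empty_like(img)
    for y in range(h):
        for x in range(w):
            local_mean[y, x] = padded[y:y + win, x:x + win].mean()
    return np.abs(img - local_mean) > thresh

# A flat image with one dark spot: only the spot should be flagged.
img = np.full((20, 20), 200, dtype=np.uint8)
img[10, 10] = 60
defects = nad_defect_map(img)
```

In a deployed system the window size and threshold would be tuned against the minimum defect size (0.4 mm × 0.4 mm as reported in this paper), and the local means would be computed with an integral image or separable filter for speed.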
4.1.3. Measurement of End Rabbet Diameter

Size measurement includes the diameters of the large and small circumcircles and the end rabbets. The two circumcircles are relatively easy to fit by threshold segmentation and CAM. The fitting results are shown as the green circles in Figure 22a. The end rabbet is a shallow circular groove on the gear. Figure 22b shows its location and Figure 22c is an amplified image. The boundary between the two regions of different gray values in Figure 22c is the rabbet circle. It is dark in the image and hard to identify. Because the rabbet circle is not distinct in the image, fixed or dynamic thresholding such as the Otsu algorithm or the interactive method is not applicable [20–22]. Figure 23a shows the gray histogram of the rabbet circle and its nearby pixels. We find that the pixels on the rabbet circle are the darkest.
They are the left part of the histogram, about 3000–5000 pixels. We extract these pixels and then get the outline of the rabbet circle, as shown in Figure 23b. We use CAM to fit the rabbet circle in Figure 23b. The result is shown as the red circle in Figure 22a. We prove again that our method is much faster than the Hough Transform and the least-squares method while ensuring the accuracy.

Figure 22. Size measurement of circumcircles and end rabbet: (a) Green circles are circumcircles and red is the end rabbet circle; (b) Location of end rabbet; (c) Extracted rabbet and its amplified image.
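The darkest-pixel selection can be sketched with a cumulative gray histogram; the target count (about 3000–5000 pixels) is taken from the paper. The simple centroid/mean-radius circle estimate at the end is only a stand-in to close the loop, not the paper's CAM algorithm:

```python
import numpy as np

def darkest_pixels_mask(image, n_pixels=4000):
    """Select roughly the `n_pixels` darkest pixels of an 8-bit image by
    finding the lowest gray level whose cumulative histogram count
    reaches n_pixels (the paper extracts about 3000-5000 such pixels)."""
    hist = np.bincount(image.ravel(), minlength=256)
    cum = np.cumsum(hist)
    level = int(np.searchsorted(cum, n_pixels))  # smallest qualifying level
    return image <= level

def rough_circle(mask):
    """Centroid / mean-radius circle estimate from outline pixels.
    A crude stand-in for circle fitting, NOT the paper's CAM method."""
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()
    r = np.hypot(ys - cy, xs - cx).mean()
    return cx, cy, r

# Synthetic test image: bright background with a dark ring of radius 30.
img = np.full((100, 100), 200, dtype=np.uint8)
t = np.linspace(0, 2 * np.pi, 720, endpoint=False)
ys = np.round(50 + 30 * np.sin(t)).astype(int)
xs = np.round(50 + 30 * np.cos(t)).astype(int)
img[ys, xs] = 10
mask = darkest_pixels_mask(img, n_pixels=150)
cx, cy, r = rough_circle(mask)
```

Because the selection works on the histogram rather than a spatial threshold, it stays robust even when the groove contrast against its surroundings is low.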
Figure 23. (a) Histogram of end rabbet and nearby pixels; (b) Extracted outline of the rabbet circle.

4.2. Statistical Results of Bevel Gear Inspection

Using the inspection system developed in this paper, 84 bevel gears of type A4 and 84 of type F6 are tested. Tables 2–4 are the inspection results of A4 and Tables 5–7 are the inspection results of F6. The data show that the size measurement of our system is 100% accurate under the current requirements. Measurement accuracy is about 45 µm and the repeated error is no more than 90 µm. The minimum detectable defect is 0.4 mm × 0.4 mm. Inspection time of one surface is less than 1.3 s. Both the accuracy and the speed are suitable for real-time on-line inspection in automatic production lines.
Table 2. Top surface inspection of gear A4.

Region          Defect                 Number of Defects   Detected   Accuracy/%
Top Circle      Connected Fault        1                   1          100
Top Circle      Knock                  7                   7          100
Top Circle      Dent                   10                  10         100
Teeth Surface   Knock                  15                  12         80
Teeth Surface   Insufficient Filling   1                   1          100
Teeth Edge      Knock                  13                  11         84.6
Teeth Edge      Insufficient Filling   10                  10         100

Table 3. Lateral surface inspection of gear A4.

Defect        Number of Defects   Detected   Accuracy/%
Knock         30                  28         93.3
Scratch       23                  19         82.6
Dent          18                  16         88.9
Gibbosity     1                   1          100
Roughness     6                   6          100
Peeling off   1                   1          100

Table 4. Bottom surface inspection of gear A4.

Region              Defect                 Number of Defects   Detected   Accuracy/%
Bottom Ring         Knock                  41                  39         95.1
Bottom Ring         Scratch                12                  12         100
Spherical Surface   Knock                  15                  15         100
Spherical Surface   Scratch                23                  22         95.6
Spherical Surface   Dent                   19                  19         100
Spline              Duplicated Broaching   6                   6          100

Table 5. Top surface inspection of gear F6.

Region              Defect                 Number of Defects   Detected   Accuracy/%
Top Ring            Chamfer Lack           2                   2          100
Top Ring            Knock                  7                   7          100
Top Ring            Dent                   8                   8          100
Top Teeth Surface   Knock                  17                  16         94.1
Top Teeth Surface   Dent                   9                   9          100
Teeth Edge          Knock                  34                  30         88.2
Teeth Edge          Insufficient Filling   18                  18         100

Table 6. Lateral surface inspection of gear F6.

Region                        Defect                 Number of Defects   Detected   Accuracy/%
Lateral Teeth Surface         Knock                  42                  37         88.1
Lateral Teeth Surface         Dent                   8                   8          100
Lateral Teeth Surface         Scratch                11                  8          72.7
Lateral Teeth Surface         Insufficient Filling   3                   3          100
Lateral Teeth Small Surface   Knock                  13                  11         84.6
Lateral Teeth Small Surface   Dent                   5                   5          100
Lateral Teeth Small Surface   Scratch                4                   3          75

Table 7. Bottom surface inspection of gear F6.

Region             Defect                 Number of Defects   Detected   Accuracy/%
Bottom Ring        Knock                  3                   3          100
Bottom Ring        Rough                  3                   3          100
Mounting Surface   Knock                  2                   2          100
Mounting Surface   Scratch                5                   4          80
Spline             Size Error             4                   4          100
Spline             Duplicated Broaching   3                   3          100
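The Accuracy/% entries in Tables 2–7 are simply the ratio of detected defects to actual defects, rounded to one decimal place. For instance, for the Knock and Scratch columns of Table 3:

```python
# Accuracy as reported in Tables 2-7: detected defects / number of defects.
def accuracy(n_defects, n_detected):
    return round(100.0 * n_detected / n_defects, 1)

knock_acc = accuracy(30, 28)    # Knock column of Table 3
scratch_acc = accuracy(23, 19)  # Scratch column of Table 3
```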
5. Discussion

The presented multi-camera automobile bevel gear quality inspection system adopts advanced machine vision technology. It solves the problem of inspecting complicated gears at high speed and with high precision. The research shows that machine vision can substitute for most manual work in bevel gear inspection, improving production efficiency and the degree of automation. The prototype of this system has been realized and applied in bevel gear production. There are still some technical problems to be solved. The edge of the gear is a non-detectable area in our system and some defects in low-contrast images cannot be detected reliably. We are seeking better illumination schemes and new inspection ideas to solve these problems.

Acknowledgments: The work was supported by the Natural Science Foundation of Shaanxi, China (No. 2012JQ8042) and by grants from the National Natural Science Foundation of China (No. 61105021).

Author Contributions: All authors contributed extensively to the work presented in this paper and wrote the manuscript. Ruiling Liu reviewed the state-of-the-art and wrote the initial version of the paper. Dexing Zhong and Hongqiang Lyu participated in the analysis of the vision techniques. Jiuqiang Han instructed the whole development of the system.

Conflicts of Interest: The authors declare no conflict of interest.

Abbreviations

NAD   Neighborhood Average Difference
CAM   Circle Approximation Method
FRP   Fast Rotation-Position

References

1. Shao, Y.M.; Liang, J.; Gu, F.S.; Chen, Z.G.; Ball, A. Fault prognosis and diagnosis of an automotive rear axle gear using a RBF-BP neural network. In 9th International Conference on Damage Assessment of Structures; Ouyang, H., Ed.; IOP Publishing: Bristol, UK, 2011; Volume 305, pp. 613–623.
2. Parey, A.; Tandon, N. Impact velocity modelling and signal processing of spur gear vibration for the estimation of defect size. Mech. Syst.
Signal Process. 2007, 21, 234–243.
3. Saravanan, N.; Ramachandran, K.I. Fault diagnosis of spur bevel gear box using discrete wavelet features and decision tree classification. Expert Syst. Appl. 2009, 36, 9564–9573.
4. Mark, W.D.; Lee, H.; Patrick, R.; Coker, J.D. A simple frequency-domain algorithm for early detection of damaged gear teeth. Mech. Syst. Signal Process. 2010, 24, 2807–2823.
5. Hayes, M. Better Bevel Gear Production. Available online: http://www.gearsolutions.com/article/detail/6076/better-bevel-gear-production (accessed on 23 August 2016).
6. Golnabi, H.; Asadpour, A. Design and application of industrial machine vision systems. Robot. Comput. Integr. Manuf. 2007, 23, 630–637.
7. Sun, T.H.; Tseng, C.C.; Chen, M.S. Electric contacts inspection using machine vision. Image Vis. Comput. 2010, 28, 890–901.
8. Cubero, S.; Aleixos, N.; Molto, E.; Gomez-Sanchis, J.; Blasco, J. Advances in machine vision applications for automatic inspection and quality evaluation of fruits and vegetables. Food Bioprocess Technol. 2011, 4, 487–504.
9. Fernandes, A.O.; Moreira, L.F.E.; Mata, J.M. Machine vision applications and development aspects. In Proceedings of the 2011 9th IEEE International Conference on Control and Automation (ICCA), Santiago, Chile, 19–21 December 2011.
10. Carrio, A.; Sampedro, C.; Sanchez-Lopez, J.L.; Pimienta, M.; Campoy, P. Automated low-cost smartphone-based lateral flow saliva test reader for drugs-of-abuse detection. Sensors 2015, 15, 29569–29593.
11. Huang, K.Y.; Ye, Y.T. A novel machine vision system for the inspection of micro-spray nozzle. Sensors 2015, 15, 15326–15338.
12. Liu, Z.; Li, X.J.; Li, F.J.; Wei, X.G.; Zhang, G.J. Fast and flexible movable vision measurement for the surface of a large-sized object. Sensors 2015, 15, 4643–4657.
13. Fremont, V.; Bui, M.T.; Boukerroui, D.; Letort, P.
Vision-based people detection system for heavy machine applications. Sensors 2016, 16, 128.
14. Perez, L.; Rodriguez, I.; Rodriguez, N.; Usamentiaga, R.; Garcia, D.F. Robot guidance using machine vision techniques in industrial environments: A comparative review. Sensors 2016, 16, 335.
15. Liu, R.; Zhong, D.; Han, J. Bevel gear detection system with multi-camera vision. Hsi-An Chiao Tung Ta Hsueh/J. Xi'an Jiaotong Univ. 2014, 48, 1–7.
16. Qiao, N.; Ye, Y.; Huang, Y.; Liu, L.; Wang, Y. Method of circle detection in PCB optics image based on improved point Hough transform. In Proceedings of the 4th International Symposium on Advanced Optical Manufacturing and Testing Technologies: Optical Test and Measurement Technology and Equipment, Chengdu, China, 19 November 2008.
17. Wu, J.P.; Li, J.X.; Xiao, C.S.; Tan, F.Y.; Gu, C.D. Real-time robust algorithm for circle object detection. In Proceedings of the 9th International Conference for Young Computer Scientists (ICYCS 2008), Zhangjiajie, China, 18–21 November 2008.
18. Zhang, M.Z.; Cao, H.R. A new method of circle's center and radius detection in image processing. In Proceedings of the 2008 IEEE International Conference on Automation and Logistics, Qingdao, China, 1–3 September 2008.
19. Chiu, S.H.; Lin, K.H.; Wen, C.Y.; Lee, J.H.; Chen, H.M. A fast randomized method for efficient circle/arc detection. Int. J. Innov. Comput. Inf. Control 2012, 8, 151–166.
20. Blasco, J.; Aleixos, N.; Molto, E. Computer vision detection of peel defects in citrus by means of a region oriented segmentation algorithm. J. Food Eng. 2007, 81, 535–543.
21. Tseng, Y.H.; Tsai, D.M. Defect detection of uneven brightness in low-contrast images using basis image representation. Pattern Recognit. 2010, 43, 1129–1141.
22. Jiang, H.H.; Yin, G.F. Surface defect inspection and classification of segment magnet by using machine vision technique. Adv. Mater. Res.
2011, 339, 32–35.

© 2016 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/4.0/).