Specific implementation plan
The present invention is described in further detail below with reference to the drawings:
As shown in Figure 1, a device for detecting impurities after beverage filling includes a light source 1 and a camera 3, which are positioned to match the product to be inspected 2. The camera 3 is connected to the target segmentation unit 4 of the processing device 9. Within the processing device 9, the target segmentation unit 4, the information extraction unit 5, the information matching unit 6, the trajectory acquisition unit 7 and the product quality analysis unit 8 are connected in sequence; the product quality analysis unit 8 is also connected to the control device 10 outside the processing device 9.
The image collection consists of several frames of the bottle-body image captured by an industrial camera after the beverage bottle has been inverted by a mechanical device. The lighting scheme is backlighting, and each image is a gray-scale image or a color image that has been converted to gray scale.
Suppose the total number of image frames collected by the system for each product to be inspected 2 is N.
As shown in Figure 2, after the image processing module receives a frame of image, the first preprocessing step is contrast enhancement. Note that before any processing, the original image should be backed up to support the subsequent connected-domain correction.
 Schemes used to enhance contrast:
 a. Extract the pixel gray information of each line of image, including the maximum gray value and the minimum gray value;
b. Analyze the gray information of each line of the image, and leave unprocessed those lines where the difference between the maximum and minimum gray values is small;
c. Analyze the gray information of each line of the image; for a line whose minimum gray value differs greatly from the minimum gray value of the previous line, use the previous line's minimum gray value;
 d. The histogram stretching algorithm is used to enhance the contrast of the lines to be processed.
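A minimal pure-Python sketch of steps a-d. The spread threshold DIFF_MIN and the line-to-line minimum jump threshold are assumed values not given in the text, and step c is interpreted here as reusing the previous line's minimum gray value:

```python
DIFF_MIN = 30  # assumed: lines with a smaller max-min gray spread are left untouched

def enhance_contrast(image, prev_min_jump=40):
    """image: list of rows, each a list of gray values in 0..255."""
    out = []
    prev_min = None
    for row in image:
        lo, hi = min(row), max(row)                      # step a: per-line gray extrema
        if prev_min is not None and abs(lo - prev_min) > prev_min_jump:
            lo = prev_min                                # step c: reuse previous line's minimum
        prev_min = lo
        if hi - lo < DIFF_MIN:
            out.append(list(row))                        # step b: small spread, no processing
            continue
        span = hi - lo
        # step d: histogram stretching of the line to the full 0..255 range
        out.append([max(0, min(255, round((g - lo) * 255 / span))) for g in row])
    return out
```

Rows that are already uniform pass through unchanged, while rows containing a visible object are stretched to the full gray range.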
After contrast enhancement, the visible objects in the image detection area become more distinct, but considerable interference remains in the image. To eliminate this interference, the image must be filtered; the present invention adopts Gaussian filtering.
As shown in Figure 2, region segmentation is performed on the filtered image by filling areas after edge detection. Edge detection uses the Canny operator, and the edge-detected image is a binary image. The area-filling method includes the following steps:
 a. Take the edge area in the image after edge extraction as the foreground, and other areas as the background;
b. Scan the image detection area line by line; in each line, fill as foreground the area between edge pixels whose distance is less than the maximum filling distance s, leaving the rest as background;
c. Scan the image detection area column by column; in each column, fill as foreground the area between edge pixels whose distance is less than the maximum filling distance s, leaving the rest as background.
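The filling steps a-c can be sketched as follows on a binary edge image; the value chosen for the maximum filling distance s is illustrative:

```python
def fill_between_edges(edges, s):
    """edges: list of rows of 0/1 (1 = edge pixel). Returns the filled mask."""
    h, w = len(edges), len(edges[0])
    filled = [[edges[y][x] for x in range(w)] for y in range(h)]
    # step b: row-by-row, fill between edge pixels closer than s
    for y in range(h):
        cols = [x for x in range(w) if edges[y][x]]
        for a, b in zip(cols, cols[1:]):
            if b - a < s:
                for x in range(a, b + 1):
                    filled[y][x] = 1
    # step c: column-by-column, same rule
    for x in range(w):
        rows = [y for y in range(h) if edges[y][x]]
        for a, b in zip(rows, rows[1:]):
            if b - a < s:
                for y in range(a, b + 1):
                    filled[y][x] = 1
    return filled
```

Edge pixels farther apart than s are assumed to belong to different objects and the gap between them stays background.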
After area filling, the image contains many connected domains representing visible objects. The present invention uses the region growing method to extract each connected domain's area, perimeter and minimum circumscribed rectangle position, and stores this information. Because the Gaussian filtering used in preprocessing and the area filling used in segmentation tend to merge visible objects that are close together, the connected domains must be corrected. Figure 3 shows the flow of the connected-domain correction algorithm.
 The connected domain analysis steps are as follows:
a. Use the region growing method to analyze the connected domains of the image detection area, and extract the information of each existing connected domain, including area, perimeter, and the corner positions of the smallest bounding rectangle;
b. Analyze the information of each connected domain; any domain whose area is smaller than the set minimum analysis area a1 and whose perimeter-squared-to-area ratio is greater than the set coefficient p is deleted without being stored (bubbles in the detected liquid are characterized by small area and high roundness);
c. The information of the remaining connected domains (including area, perimeter-squared-to-area ratio, and the two-dimensional coordinates of the smallest bounding rectangle) is numbered and stored as the frame's connected-domain information, and the matching state of each connected domain is marked as unmatched.
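An illustrative sketch of steps a-c: region growing by flood fill, extraction of area, perimeter and bounding rectangle, and removal of bubble-like domains. The thresholds a1 and p are named in the text but their values here are assumed, and the 4-connected perimeter count is one possible convention:

```python
def analyze_domains(mask, a1=4, p=30.0):
    """mask: list of rows of 0/1. Returns stored connected-domain records."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    kept = []
    for sy in range(h):
        for sx in range(w):
            if not mask[sy][sx] or seen[sy][sx]:
                continue
            stack, pixels = [(sy, sx)], []
            seen[sy][sx] = True
            while stack:                                 # step a: region growing from a seed
                y, x = stack.pop()
                pixels.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                        seen[ny][nx] = True
                        stack.append((ny, nx))
            area = len(pixels)
            # perimeter: count pixel edges bordering background or the image border
            perim = sum(1 for y, x in pixels
                        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1))
                        if not (0 <= y + dy < h and 0 <= x + dx < w) or not mask[y + dy][x + dx])
            if area < a1 and perim * perim / area > p:
                continue                                 # step b: bubble-like, not stored
            ys = [y for y, _ in pixels]
            xs = [x for _, x in pixels]
            kept.append({"area": area, "density": perim * perim / area,
                         "rect": (min(xs), min(ys), max(xs), max(ys)),
                         "matched": False})              # step c: store, mark unmatched
    return kept
```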
 The steps to modify the connected domain are as follows:
a. Find an original connected domain that meets the correction condition (the minimum bounding rectangle width or height is greater than the set minimum value r1, and both width and height are less than the set maximum value r2). If none is found, go to step f;
 b. Extract the pre-processed image data of the smallest circumscribed rectangular area of the original connected domain;
c. Use the maximum between-class variance (Otsu) method to extract an image threshold and segment the image;
d. Use the seed and region growing methods to extract the new connected-domain information; if the number of new connected domains is 1, go to step a;
 e. Update the original connected domain information with the new connected domain information, and go to step a;
 f. End connected domain correction.
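Step c relies on the maximum between-class variance (Otsu) method. A minimal pure-Python version over a flat list of gray values, shown as one possible implementation:

```python
def otsu_threshold(pixels):
    """pixels: iterable of gray values in 0..255. Returns the threshold t that
    maximizes the between-class variance (pixels <= t form the background)."""
    hist = [0] * 256
    for g in pixels:
        hist[g] += 1
    total = sum(hist)
    sum_all = sum(i * hist[i] for i in range(256))
    best_t, best_var, w0, sum0 = 0, -1.0, 0, 0
    for t in range(256):
        w0 += hist[t]                              # background pixel count at threshold t
        if w0 == 0 or w0 == total:
            continue
        sum0 += t * hist[t]
        m0 = sum0 / w0                             # background mean
        m1 = (sum_all - sum0) / (total - w0)       # foreground mean
        var = w0 * (total - w0) * (m0 - m1) ** 2   # between-class variance (scaled)
        if var > best_var:
            best_var, best_t = var, t
    return best_t
```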
 After the connected domain is corrected, the connected domain information of all visible objects in the image will be obtained.
After connected-domain correction, check whether all images of the product to be inspected 2 have been collected. If frames remain to be collected, wait for the arrival of the next frame; otherwise, once N frames have been collected, match the connected-domain information extracted from the N frames. Figure 4 shows the matching flowchart.
Before matching, mark the matching state of every connected domain as unmatched and set its matching coefficient to zero. The matching record of each connected domain describes its relationship to the connected domains in the previous and following frames: whether it matches a connected domain of the previous frame, whether it matches a connected domain of the following frame, the matching coefficient and matched connected-domain number for the previous frame, and the matching coefficient and matched connected-domain number for the following frame. During matching, the current frame queries and updates its match with the following frame, the following frame queries and updates its match with the previous frame, and each connected domain keeps its own matched connected-domain number.
 The specific matching steps are as follows (n is a natural number greater than zero and less than N, and the initial value of n is 1):
a. Look for a connected domain in the nth frame whose following-frame match has not yet been found; if one is found, go to step b), otherwise go to step e);
b. For the unmatched connected domain a in the nth frame, calculate the matching coefficients with all connected domains in the (n+1)th frame, and store them in the matching coefficient array; each item in the array records a matching coefficient and a connected-domain number. The matching coefficient factor is calculated as:
factor = wfactor × hfactor × whfactor × denfactor × areafactor × distancefactor
where wfactor is the connected-domain width comparison coefficient, hfactor the height comparison coefficient, whfactor the width-to-height-ratio comparison coefficient, denfactor the perimeter-squared-to-area comparison coefficient, areafactor the area comparison coefficient, and distancefactor the connected-domain distance coefficient;
wfactor = min(wn, wn1)/max(wn, wn1); where wn is the width of connected domain a, wn1 is the width of connected domain x in the (n+1)th frame, min(wn, wn1) denotes the minimum of wn and wn1, and max(wn, wn1) their maximum;
 hfactor=min(hn,hn1)/max(hn,hn1); where hn is the height of the connected domain a, and hn1 is the height of the connected domain x in the n+1th frame;
whfactor = min(min(wn,hn)/max(wn,hn), min(wn1,hn1)/max(wn1,hn1)) / max(min(wn,hn)/max(wn,hn), min(wn1,hn1)/max(wn1,hn1));
areafactor = min(arean, arean1)/max(arean, arean1); denfactor = min(densityn, densityn1)/max(densityn, densityn1), where:
densityn = girthn × girthn / arean; where girthn is the perimeter of connected domain a, and arean is the area of connected domain a;
densityn1 = girthn1 × girthn1 / arean1; where girthn1 is the perimeter of connected domain x in the (n+1)th frame, and arean1 is the area of connected domain x in the (n+1)th frame;
distancefactor = (MAX_DISTANCE - distance)/MAX_DISTANCE; where MAX_DISTANCE is the maximum active distance of a connected domain, and distance is the distance between the centers of the smallest bounding rectangles of connected domain a and of connected domain x in the (n+1)th frame;
c. In the matching coefficient array, find the connected domain x1 that has the largest matching coefficient with connected domain a;
d. If the maximum matching coefficient found is less than the specified minimum reasonable matching coefficient f, mark the following-frame matching state of connected domain a as matched and return to step a). If connected domain x1 is being matched for the first time, update the matching information of connected domains a and x1: change the matching state of both to matched, record the matched connected-domain numbers and the matching coefficient, and return to step a). If connected domain x1 was previously matched by connected domain b in the nth frame, compare this matching coefficient with the previously stored one: if this coefficient is greater, update the matched domain and matching information of connected domain x1, mark connected domain a as matched with matched domain x1, record the matching coefficient, change the following-frame matching state of connected domain b to unmatched, and return to step a); if this coefficient is less than or equal to the stored one, remove the matching coefficient of connected domains a and x1 from the matching coefficient array and return to step c) to continue searching for a reasonable match;
e. Increase n by 1; if n > N-1, end the matching, otherwise return to step a.
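The per-pair matching coefficient of step b can be sketched as follows, under the assumption (not stated explicitly in the text) that the overall factor is the product of the six comparison coefficients, each lying in [0, 1]; the field names such as girth and cx are illustrative:

```python
def ratio(a, b):
    """Symmetric similarity of two positive quantities, in (0, 1]."""
    return min(a, b) / max(a, b)

def match_factor(d1, d2, max_distance=100.0):
    """d1, d2: dicts with keys w, h, girth, area, cx, cy (rectangle centre).
    max_distance plays the role of MAX_DISTANCE; its value is assumed."""
    wf = ratio(d1["w"], d2["w"])                          # wfactor
    hf = ratio(d1["h"], d2["h"])                          # hfactor
    whf = ratio(min(d1["w"], d1["h"]) / max(d1["w"], d1["h"]),
                min(d2["w"], d2["h"]) / max(d2["w"], d2["h"]))   # whfactor
    den1 = d1["girth"] ** 2 / d1["area"]                  # densityn
    den2 = d2["girth"] ** 2 / d2["area"]                  # densityn1
    denf = ratio(den1, den2)                              # denfactor
    af = ratio(d1["area"], d2["area"])                    # areafactor
    dist = ((d1["cx"] - d2["cx"]) ** 2 + (d1["cy"] - d2["cy"]) ** 2) ** 0.5
    df = max(0.0, (max_distance - dist) / max_distance)   # distancefactor
    return wf * hf * whf * denf * af * df
```

Two identical domains at the same position score 1.0; any difference in shape, size or position reduces the factor toward 0.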
After matching is completed, the present invention uses the matching result (which records, for each connected domain, the matched connected-domain number and matching coefficient in the preceding and following frames) to locate each connected domain in every frame, and represents the connected domain's motion trajectory by its two-dimensional position coordinates. The specific steps are as follows (assuming the total number of frames N is 3):
a. In the first frame, find a connected domain a1 whose trajectory has not yet been built, find its matched connected domain a2 in the second frame, and record the two-dimensional coordinates of a1 and a2 into the trajectory array of connected domain a1;
b. Find the matched connected domain a3 of connected domain a2 in the third frame, and record the two-dimensional coordinates of a3 into the trajectory array of connected domain a1;
c. If trajectories have been built for all connected domains in the first frame, end; otherwise return to step a.
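The trajectory steps above can be sketched for N = 3 (or any N) by following each first-frame domain's stored following-frame match; the record layout is an assumption for illustration:

```python
def build_trajectories(frames):
    """frames: list of frames; each frame is a list of domain records, each a
    dict with 'centre' (x, y) and 'next' (index of the matched domain in the
    following frame, or None if unmatched)."""
    tracks = []
    for dom in frames[0]:
        track, cur, k = [dom["centre"]], dom, 0
        # chain through the stored matches frame by frame
        while cur["next"] is not None and k + 1 < len(frames):
            cur = frames[k + 1][cur["next"]]
            track.append(cur["centre"])
            k += 1
        tracks.append(track)
    return tracks
```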
After obtaining the trajectory of the connected domain formed by each visible object in the image, calculate the perimeter-squared-to-area ratio of the connected domain: circular = s × s / d, where circular is the ratio of the squared perimeter of the connected domain to its area, s is the perimeter of the connected domain, and d is its area.
 Use the structural information of connected domains and motion trajectories to identify the nature of visible objects. The identification criteria are as follows:
(1) If the circularity of the connected domain is greater than the maximum bubble perimeter-squared-to-area ratio mcircular, the visible object represented by the connected domain is judged to be a foreign object;
(2) If the movement direction of the connected domain is erratic, the visible object represented by the connected domain is judged to be a foreign object;
(3) If the movement direction of the connected domain is opposite to that of a bubble, the visible object represented by the connected domain is judged to be a foreign object.
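The three criteria can be sketched as a simple classifier. The value of mcircular, the definition of "erratic" (sign changes in the vertical displacement), and the bubble direction convention (bubbles move with dy < 0 in image coordinates) are all assumptions made for illustration:

```python
def is_foreign(circular, dys, mcircular=25.0):
    """circular: perimeter^2/area of the domain; dys: per-frame vertical
    displacements along its trajectory."""
    if circular > mcircular:
        return True                   # criterion 1: not round enough for a bubble
    if any(dy > 0 for dy in dys) and any(dy < 0 for dy in dys):
        return True                   # criterion 2: erratic movement direction
    if all(dy > 0 for dy in dys):
        return True                   # criterion 3: moving opposite to bubbles (sinking)
    return False
```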
Finally, the qualification of the product to be inspected 2 is judged according to the nature of all its visible objects: if any visible object is a foreign object, the product to be inspected 2 is judged unqualified; otherwise it is judged qualified. The test result is finally submitted to the on-site control equipment 10, which completes the rejection of unqualified products.