Identification method of motion blur code points based on convolutional neural network

A convolutional-neural-network and motion-blur technology, applied to the identification of motion-blurred coded points. It addresses problems such as the restricted aperture range of lenses, image motion blur, and image quality degradation, and achieves fast and reliable preliminary data, faster recognition, and a wider field of application.

Active Publication Date: 2019-08-16
NANJING UNIV OF AERONAUTICS & ASTRONAUTICS
Cites: 5 · Cited by: 0

AI Technical Summary

Problems solved by technology

However, increasing the sensitivity reduces the signal-to-noise ratio, and severe noise degrades image quality, which hinders post-processing.
Increasing the aperture makes the depth of field shallower and the out-of-focus blur more severe. Moreover, the aperture range of a lens is constrained by its physical structure and manufacturing cost, and is generally very limited.
[0006] Therefore, neither reducing the exposure time nor increasing the aperture can solve the motion blur problem when the measured object moves at high speed.



Examples


Embodiment Construction

[0058] To facilitate understanding by those skilled in the art, the present invention is further described below in conjunction with specific embodiments and the accompanying drawings; the content of these embodiments is not intended to limit the invention.

[0059] Step 1. Calibrate the real-shot camera, construct virtual-camera code points, and obtain a large number of motion-blurred code point image / code point identity samples;

[0060] Step 1.1. Calibrate the camera used for real shooting, determine its internal parameter matrix K, and denote the image plane as π_1;

[0061] Step 1.2. Determine the spatial motion region Ω of the measured object in the camera coordinate system;

[0062] Step 1.3. Determine the side length l of the coded marker points; determine the serial-number set M of the coded marker points to be used, and prepare the graphic I_m of each corresponding coded marker point, where m∈M;

[0063] ...
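Steps 1.1–1.3 above amount to rendering known code-point graphics through a calibrated projection and accumulating them along a motion path to obtain blurred training samples. A minimal NumPy sketch of that idea is given below; it is only an illustration under assumed values — the function names (`project_point`, `motion_blur_image`), the toy intrinsic matrix K, and the Gaussian-spot stand-in for a real code-point graphic are all invented here, since the patent's actual sample generator is not reproduced on this page.

```python
import numpy as np

def project_point(K, X):
    """Project a 3-D point X (camera coordinates) to pixel coordinates
    using the pinhole model with intrinsic matrix K."""
    x = K @ X
    return x[:2] / x[2]

def motion_blur_image(K, centers, size=64, spot_radius=3.0):
    """Simulate motion blur by averaging sharp renderings of a bright
    spot (a stand-in for a code-point graphic) along a motion path."""
    img = np.zeros((size, size))
    ys, xs = np.mgrid[0:size, 0:size]  # row and column index grids
    for X in centers:
        u, v = project_point(K, X)
        # Render the spot as a Gaussian centered at the projected pixel.
        img += np.exp(-((xs - u) ** 2 + (ys - v) ** 2) / (2 * spot_radius ** 2))
    return img / len(centers)

# Toy intrinsics (focal length 100 px, principal point at image center).
K = np.array([[100.0, 0.0, 32.0],
              [0.0, 100.0, 32.0],
              [0.0, 0.0, 1.0]])
# A straight-line path through the motion region Ω at depth 10.
path = [np.array([x, 0.0, 10.0]) for x in np.linspace(-1.0, 1.0, 20)]
blurred = motion_blur_image(K, path)
```

Pairing each such blurred image with the identity m of the rendered code point yields the image/identity samples that step 1 calls for.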


PUM

No PUM

Abstract

The invention discloses an identification method for motion-blurred coded points based on a convolutional neural network. The method comprises the following steps: 1) the real-shot camera is calibrated, virtual-camera coded points are constructed, and a large number of motion-blurred coded point image / coded point identity samples are acquired; 2) a convolutional neural network, MBCNet, is built; 3) the sample set is used to train and test MBCNet; 4) the trained MBCNet segments and classifies motion-blurred images, yielding the corresponding coded marker identity IDs. The method can process actually photographed motion-blurred images and recover the corresponding coded marker identities, providing fast and reliable preliminary data for machine-vision measurement of high-speed moving objects and expanding the field of application of machine-vision measurement.
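The abstract's step 4 is a standard image-classification forward pass. The architecture of MBCNet itself is not disclosed on this page, so the sketch below is only a generic stand-in: a single conv → ReLU → max-pool → fully-connected → softmax pipeline in plain NumPy, with all layer sizes, weights, and the class count (the size of the serial-number set M) chosen arbitrarily for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(x, w):
    """Valid 2-D cross-correlation of a single-channel image with one kernel."""
    kh, kw = w.shape
    H, W = x.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * w)
    return out

def relu(x):
    return np.maximum(x, 0.0)

def max_pool(x, k=2):
    """Non-overlapping k x k max pooling (trailing rows/cols dropped)."""
    H, W = x.shape
    return x[:H - H % k, :W - W % k].reshape(H // k, k, W // k, k).max(axis=(1, 3))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def forward(img, params):
    """One conv -> ReLU -> pool -> fully-connected -> softmax pass."""
    h = max_pool(relu(conv2d(img, params["w_conv"])))
    logits = params["w_fc"] @ h.ravel() + params["b_fc"]
    return softmax(logits)

n_classes = 8                       # illustrative size of the code set M
img = rng.random((16, 16))          # stand-in for a blurred code-point patch
h_size = ((16 - 3 + 1) // 2) ** 2   # 14x14 conv output pooled to 7x7 = 49
params = {
    "w_conv": rng.standard_normal((3, 3)) * 0.1,
    "w_fc": rng.standard_normal((n_classes, h_size)) * 0.1,
    "b_fc": np.zeros(n_classes),
}
probs = forward(img, params)        # class probabilities over the IDs in M
```

The predicted identity ID would be `probs.argmax()`; training against the image/identity samples of step 1 is what step 3 describes.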

Description

Technical field

[0001] The invention belongs to the field of machine vision measurement, and in particular relates to an identification method for motion-blurred coded points based on a convolutional neural network.

Background technique

[0002] Coded markers are widely used in machine-vision-based industrial metrology and reverse engineering, and their identification is the basis for subsequent processing. Existing coded-marker identification methods are all based on clear images captured while stationary. Such a recognition method generally includes the following steps: 1) image noise reduction; 2) edge detection; 3) selecting closed, near-elliptical curves from the edges as candidates for coded-point centers; 4) applying a geometric transformation to the local image around each ellipse so that it becomes a circle or a rectangle; 5) judging each position of the transformed image as 0 or 1 according to its pixel gray level; 6) decoding according to the binary string...
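Steps 5–6 of the classical pipeline described above (binarize around the marker, then decode the bit string) can be sketched compactly. The snippet below assumes a circular coded marker whose bits lie on a ring and uses the common minimum-cyclic-rotation convention to make the ID rotation-invariant; the function names and that convention are illustrative assumptions, not taken from this patent.

```python
import math

def sample_ring_bits(img, cx, cy, r, n_bits, thresh=128):
    """Step 5: sample n_bits gray values on a circle of radius r around
    (cx, cy) and binarize each against a fixed threshold."""
    bits = []
    for k in range(n_bits):
        a = 2.0 * math.pi * k / n_bits
        u = int(round(cx + r * math.cos(a)))
        v = int(round(cy + r * math.sin(a)))
        bits.append(1 if img[v][u] >= thresh else 0)
    return bits

def decode_ring(bits):
    """Step 6: rotation-invariant ID for a circular coded marker,
    taken as the smallest integer over all cyclic rotations."""
    n = len(bits)
    rotations = (bits[i:] + bits[:i] for i in range(n))
    return min(int("".join(map(str, r)), 2) for r in rotations)

# Tiny synthetic patch: bright right half, dark left half.
img = [[255 if u >= 4 else 0 for u in range(9)] for v in range(9)]
bits = sample_ring_bits(img, 4, 4, 3, 4)   # -> [1, 1, 0, 1]
marker_id = decode_ring(bits)
```

It is exactly this reliance on sharp edges and clean gray levels (steps 2–5) that breaks down under motion blur, which motivates the CNN-based approach of the invention.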

Claims


Application Information

Patent Timeline
no application
Patent Type & Authority: Patent (China)
IPC(8): G06T7/215, G06T7/80
CPC: G06T2207/20081, G06T2207/20084, G06T2207/30196
Inventor: 周含策, 张丽艳, 陈明军
Owner: NANJING UNIV OF AERONAUTICS & ASTRONAUTICS