
Stereo visual calibration method integrating neural network and virtual target

A virtual-target and neural-network technology applied in the field of machine vision. It addresses the difficulty of producing and processing large-scale targets and the complexity of nonlinear camera-distortion calibration models, while offering powerful nonlinear mapping capability and a target that is easy to produce and process.

Publication Date: 2016-11-16 (Inactive)
HUNAN UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to avoid the deficiencies of the prior art by providing a stereo vision calibration method that combines a neural network with a virtual target, thereby solving the problems of difficult production and processing of large targets, nonlinear camera distortion, and complex calibration models.



Examples


Embodiment Construction

[0041] In order to enable those skilled in the art to better understand the technical solution of the present invention, the present invention is described in further detail below in conjunction with the accompanying drawings and specific embodiments. The features of the embodiments may be combined with each other.

[0042] The equipment used in the present invention mainly includes: one Global series coordinate measuring machine from Brown & Sharpe (USA), with a measuring range of 700 mm × 1000 mm × 660 mm; two cameras with a frame rate of 130 fps; one single-corner target; two tripods; one PC; and two GigE interface data cables. To compare and analyze the performance of the method of the present invention, the widely used direct linear transformation (DLT) calibration method is used as a comparative experiment.
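
For reference, the DLT baseline used in the comparative experiment estimates a 3×4 projection matrix per camera from known 3D-to-2D correspondences. The sketch below is a minimal, generic implementation of that standard textbook method, not the patent's own code; the function name and array shapes are illustrative.

```python
# Minimal sketch of the standard DLT calibration used as the comparison baseline
# (generic textbook method, not the inventors' code).
import numpy as np

def dlt_projection_matrix(world_pts, pixel_pts):
    """Estimate a 3x4 projection matrix P from >= 6 world<->pixel correspondences.

    world_pts: (N, 3) corner coordinates in mm (e.g. from the CMM)
    pixel_pts: (N, 2) corresponding pixel coordinates in one camera
    """
    rows = []
    for (X, Y, Z), (u, v) in zip(world_pts, pixel_pts):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    A = np.asarray(rows, dtype=float)
    # P is the right singular vector associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(3, 4)
```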

[0043] The concrete implementation steps of the present invention are as follows:

[0044] S1. Construct a three-dimensional virtual target with a scale of 560mm×400...
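
Step S1 is only partially preserved in this extract, but from the abstract the construction amounts to moving the single-corner target through the measuring volume, reading the corner's world coordinates from the coordinate measuring machine at each position, and capturing a left/right image pair. The sketch below shows one way such samples might be recorded and stacked for training; the class and field names are assumptions, not taken from the patent.

```python
# Hypothetical data layout for the virtual target (steps S1-S2): each recorded
# position pairs the CMM world coordinates with the detected corner pixel
# coordinates in the left and right images. Names are illustrative only.
from dataclasses import dataclass
import numpy as np

@dataclass
class VirtualTargetPoint:
    world_xyz: tuple   # (X, Y, Z) in mm, read from the coordinate measuring machine
    left_uv: tuple     # (u, v) corner pixel coordinates in the left image
    right_uv: tuple    # (u, v) corner pixel coordinates in the right image

def build_samples(points):
    """Stack recorded points into training arrays: inputs are the four pixel
    coordinates, targets are the world coordinates."""
    X = np.array([[*p.left_uv, *p.right_uv] for p in points])  # shape (N, 4)
    Y = np.array([p.world_xyz for p in points])                # shape (N, 3)
    return X, Y
```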



Abstract

The invention discloses a stereo vision calibration method combining a neural network and a virtual target, comprising the following steps: S1, using a single-corner target to construct a stereo virtual target, acquiring corner images and recording the world three-dimensional coordinates of the corner point during construction; S2, extracting the pixel coordinates of the corner points in the images; S3, training a neural network on the corner pixel coordinates and the world three-dimensional coordinates; S4, inputting test samples into the trained neural network for three-dimensional reconstruction and calculating the reconstruction error; S5, changing the number of hidden-layer nodes of the neural network to minimize the error. On the one hand, the method of the present invention uses a single-corner checkerboard to construct a three-dimensional virtual target of controllable range, which solves the problem of difficult production and processing of large targets; on the other hand, it uses a neural network to calibrate the camera without establishing a complex nonlinear distortion model, and the calibration accuracy is significantly higher than that of the linear calibration method. The invention is practical, simple to operate and highly accurate.
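
Steps S3 to S5 can be read as fitting a network that maps the four pixel coordinates of a corner (left and right image) directly to its world coordinates, then sweeping the hidden-layer size to minimize the reconstruction error on held-out test samples. The sketch below illustrates that reading with a generic scikit-learn MLP; the 4-to-3 mapping, the activation function, and the node range are assumptions, not the patent's stated configuration.

```python
# Illustrative sketch of steps S3-S5, assuming samples built as in build_samples()
# above; the network shape and hyperparameters are assumptions, not the patent's.
import numpy as np
from sklearn.neural_network import MLPRegressor

def train_and_select(train_X, train_Y, test_X, test_Y, node_range=range(5, 31, 5)):
    """Train one network per candidate hidden-layer size (S3), reconstruct the
    test samples (S4), and keep the size with the smallest mean 3-D error (S5)."""
    best = None
    for n in node_range:
        net = MLPRegressor(hidden_layer_sizes=(n,), activation='tanh',
                           max_iter=5000, random_state=0)
        net.fit(train_X, train_Y)                  # S3: learn pixel -> world mapping
        pred = net.predict(test_X)                 # S4: three-dimensional reconstruction
        err = float(np.mean(np.linalg.norm(pred - test_Y, axis=1)))
        if best is None or err < best[1]:
            best = (n, err, net)
    return best  # (hidden nodes, mean reconstruction error in mm, trained network)
```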

Description

Technical field

[0001] The invention belongs to the field of machine vision, and in particular relates to a binocular positioning method for large-field-of-view photogrammetry.

Background technique

[0002] Photogrammetry is widely favored for its non-contact operation, high precision, and unlimited number of measurement points. In recent years in particular, advances in camera imaging materials and processing technology have made the wide application of this technology to large-scale industrial structures possible.

[0003] The main problem facing photogrammetry is measurement accuracy. The principal difficulties of large-field-of-view binocular calibration are: 1. because of the camera's nonlinear distortion, traditional calibration usually requires a complex camera model, which increases the complexity and difficulty of calibration to a certain extent; 2. large-scale high-precision calibration requires corr...


Application Information

Patent Type & Authority: Applications (China)
IPC (8): G06T7/00
CPC: G06T2207/20081; G06T2207/20084; G06T2207/30208
Inventor: 王文韫
Owner: HUNAN UNIV OF SCI & TECH