
Motion estimation method based on image features and three-dimensional information of three-dimensional visual system

A technology combining 3D vision and motion estimation, applied in the field of motion estimation, which can solve problems such as the sensitivity of the initial 3D estimate to errors, the inability to obtain a 3D relative estimate, and outright algorithm failure.

Status: Inactive · Publication Date: 2014-02-05
余洪山
Cites: 1 · Cited by: 28

AI Technical Summary

Problems solved by technology

However, the defect of this type of method is that it relies heavily on the selection of feature points. First, mismatched pairs in the feature point set seriously distort the initial value of the 3D estimate; second, the algorithm is effective only for scenes with many image feature points. If the scene's feature point set is so sparse that a correct and reliable initial estimate cannot be obtained, the algorithm fails outright and no 3D relative estimate can be produced.
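The sensitivity to mismatches described above can be illustrated with a minimal sketch (not the patent's algorithm): a standard SVD-based least-squares rigid alignment (Kabsch) between matched 3D point sets recovers the motion exactly when all matches are correct, but a single gross mismatch already corrupts the estimate. All data here are synthetic.

```python
import numpy as np

def rigid_transform_svd(P, Q):
    """Least-squares rigid transform (R, t) with Q ~= R @ P + t, via SVD (Kabsch)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    # reflection guard keeps det(R) = +1
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = cq - R @ cp
    return R, t

rng = np.random.default_rng(0)
P = rng.uniform(-1.0, 1.0, (30, 3))                     # synthetic feature points
R_true = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])  # 90 deg about z
t_true = np.array([0.5, 0.2, -0.1])
Q = P @ R_true.T + t_true                               # perfectly matched set

R_est, t_est = rigid_transform_svd(P, Q)                # clean matches: exact recovery

Q_bad = Q.copy()
Q_bad[0] += 5.0                                         # one gross mismatch
R_bad, t_bad = rigid_transform_svd(P, Q_bad)
print(np.linalg.norm(t_bad - t_true))                   # translation error is no longer negligible
```

This is why robust outlier rejection (e.g. RANSAC-style filtering of the match set) is normally required before the least-squares step, and why a sparse feature set, which leaves RANSAC too few inliers, causes the failure mode described above.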




Embodiment Construction

[0061] The present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0062] The three-dimensional vision system used in the present invention may be the 2D/3D compound camera of invention patent 201310220879.6, a Microsoft Kinect 3D camera, or a similar device. The system simultaneously acquires two-dimensional color image information I and spatial three-dimensional information D of the scene, where I and D are registered one-to-one according to the pixel coordinates of the two-dimensional color image: the pixel I(u,v) in row u and column v corresponds to the three-dimensional point D(u,v) = (x, y, z) of the point cloud. As an application example, the present invention demonstrates its effect using the Kinect three-dimensional camera.
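The per-pixel registration between I and D can be sketched as follows. This is an illustrative example, not the patent's implementation: the image resolution and the pinhole intrinsics (fx, fy, cx, cy) are placeholder values, not calibrated Kinect parameters.

```python
import numpy as np

H, W = 480, 640                            # Kinect-style VGA resolution (assumption)
I = np.zeros((H, W, 3), dtype=np.uint8)    # 2-D color image I
D = np.zeros((H, W, 3), dtype=np.float32)  # per-pixel 3-D points, D[u, v] = (x, y, z)

# Illustrative pinhole intrinsics; placeholder values, not calibrated parameters.
fx, fy, cx, cy = 525.0, 525.0, 319.5, 239.5

def backproject(u, v, z):
    """Map pixel (row u, col v) with depth z to a camera-frame 3-D point."""
    x = (v - cx) * z / fx
    y = (u - cy) * z / fy
    return np.array([x, y, z], dtype=np.float32)

# Fill D so that D[u, v] is the 3-D point registered to color pixel I(u, v).
u, v, z = 100, 200, 1.5
D[u, v] = backproject(u, v, z)
print(D[u, v])   # 3-D point corresponding to color pixel (100, 200)
```

Because I and D share the same (u, v) indexing, a 2-D feature detected in the color image immediately yields its 3-D coordinates by a single array lookup, which is what allows the method to combine image features with point-cloud information.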

[0063] Such as ...



Abstract

The invention discloses a motion estimation method based on image features and three-dimensional information from a three-dimensional vision system. The method comprises: (1) acquiring in real time, from the three-dimensional vision system, two-dimensional image information and the spatial three-dimensional information corresponding to each two-dimensional image pixel; (2) controlling the collection interval between adjacent frames with an adaptive adjustment method, which automatically adjusts the sampling interval at time t+1 relative to time t according to the number of valid matched image-feature point pairs between the frames at times t-1 and t; and (3) combining two-dimensional scene image features with three-dimensional point-cloud information, and selecting the three-dimensional motion estimation according to the number of matched point pairs of the current two-dimensional image features of adjacent frames, thereby obtaining high-precision three-dimensional motion estimation. While satisfying the accuracy and reliability requirements of three-dimensional estimation, the method effectively reduces the computational load, and has the advantages of low computational cost and few restrictions on the application scene.
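Step (2), the adaptive adjustment of the sampling interval, can be sketched as a simple feedback rule: many valid matches suggest the scene is tracked easily and frames can be sampled more sparsely; few matches call for denser sampling. The thresholds and step sizes below are illustrative placeholders, not values taken from the patent.

```python
def adapt_interval(interval, n_matches, n_high=200, n_low=50,
                   min_interval=1, max_interval=10):
    """Adjust the frame-sampling interval for time t+1 from the number of
    valid feature matches between the frames at times t-1 and t.

    Thresholds and bounds are hypothetical, for illustration only.
    """
    if n_matches >= n_high:        # scene richly textured: sample more sparsely
        interval = min(interval + 1, max_interval)
    elif n_matches <= n_low:       # matches scarce: sample more densely
        interval = max(interval - 1, min_interval)
    return interval                # otherwise keep the current interval

print(adapt_interval(3, 250))  # plenty of matches -> interval grows to 4
print(adapt_interval(3, 20))   # few matches -> interval shrinks to 2
print(adapt_interval(3, 120))  # in-band -> interval unchanged at 3
```

Clamping to [min_interval, max_interval] keeps the controller stable: the interval can neither collapse below one frame nor grow without bound in feature-rich scenes, which is what reduces the computational load without sacrificing reliability.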

Description

Technical field

[0001] The invention relates to a motion estimation method based on image features and three-dimensional information of a three-dimensional vision system.

Background technique

[0002] High-precision, high-reliability 3D motion estimation is one of the difficult and active research topics in robotics. Its goal is to compute in real time the change in the robot's 3D spatial position between two adjacent moments, and it is a core element of research areas such as tracking. Traditional inertial navigation systems are simple in principle and widely used for motion estimation, but they suffer from drift and error accumulation, so the resulting motion estimates have low accuracy and poor reliability. Compared with inertial navigation systems, vision-based motion estimation has no physical drift and offers higher stability. At present, motion estimation based on monocular vision cannot obtain the scale information of three-dimensional...

Claims


Application Information

IPC(8): G06T7/00
Inventor: 余洪山, 罗堪, 蔺薛菲, 王耀南, 赵科, 孙欢, 万琴, 朱江, 段峰, 代扬
Owner: 余洪山