Feature matching method based on depth re-projection and space consistency

A feature matching and reprojection technique, applied in character and pattern recognition, image data processing, and instrumentation, that addresses problems such as unreliable RANSAC results

Active Publication Date: 2019-10-11
SOUTHEAST UNIV

AI Technical Summary

Problems solved by technology

However, when the noise in the data increases or the offset be ...



Examples


Detailed Description of the Embodiments

[0057] The present invention is described in further detail below in conjunction with the accompanying drawings and specific embodiments:

[0058] The present invention provides a feature matching method based on depth reprojection and spatial consistency, which uses the spatial consistency of feature points to extract inter-frame correspondences more reliably, thereby improving the accuracy and robustness of feature matching.
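The spatial-consistency idea can be illustrated with a small sketch: under rigid camera motion, the 3D distance between any two correctly matched feature points is preserved across frames, so matches whose pairwise distances disagree with the rest can be voted out. The function below is a minimal hypothetical implementation of such a pairwise-consistency filter, not the patent's actual algorithm; the name `pairwise_consistency_filter` and the tolerance and vote thresholds are illustrative assumptions.

```python
import math

def pairwise_consistency_filter(matches, tol=0.05, min_votes=2):
    """Keep matches whose pairwise 3D distances are preserved across frames.

    matches: list of (p, q) tuples, where p and q are 3D points (x, y, z)
    of a putative correspondence in frame A and frame B respectively.
    A rigid camera motion preserves distances, so ||p_i - p_j|| should
    equal ||q_i - q_j|| for every pair of correct matches; each agreeing
    pair casts a vote for both of its matches.
    """
    n = len(matches)
    votes = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            pi, qi = matches[i]
            pj, qj = matches[j]
            # Compare the inter-point distance in frame A vs. frame B.
            if abs(math.dist(pi, pj) - math.dist(qi, qj)) < tol:
                votes[i] += 1
                votes[j] += 1
    return [m for m, v in zip(matches, votes) if v >= min_votes]
```

For example, with three matches related by a pure translation plus one mismatched outlier, the three consistent matches vote for one another while the outlier gathers no votes and is discarded.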

[0059] In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention will be further described in detail in conjunction with the accompanying drawings.

[0060] Figure 1 is a flow chart of the method of the present invention. As shown in Figure 1, the feature matching method based on depth reprojection and spatial consistency proposed by the present invention includes the following steps:

[0061] Step 1: Use an RGBD camera to obtain the color map and depth map of the scene, and use RANSAC ...
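Although the remainder of this step is truncated, plane extraction from a depth-derived point cloud with RANSAC typically proceeds by repeatedly fitting a plane through three sampled points and keeping the candidate supported by the most inliers. The sketch below is a minimal pure-Python illustration of that general scheme, not the patent's implementation; the function name `ransac_plane` and all parameter values are hypothetical.

```python
import random

def ransac_plane(points, iters=200, thresh=0.02, seed=0):
    """Fit a dominant plane to a 3D point cloud with RANSAC.

    Repeatedly samples 3 points, forms the plane through them, and keeps
    the plane supported by the most inliers (points within `thresh` of it).
    Returns ((a, b, c, d), inliers) with a*x + b*y + c*z + d = 0.
    """
    rng = random.Random(seed)
    best_plane, best_inliers = None, []
    for _ in range(iters):
        p1, p2, p3 = rng.sample(points, 3)
        # Plane normal = (p2 - p1) x (p3 - p1).
        u = tuple(p2[k] - p1[k] for k in range(3))
        v = tuple(p3[k] - p1[k] for k in range(3))
        n = (u[1] * v[2] - u[2] * v[1],
             u[2] * v[0] - u[0] * v[2],
             u[0] * v[1] - u[1] * v[0])
        norm = (n[0] ** 2 + n[1] ** 2 + n[2] ** 2) ** 0.5
        if norm < 1e-9:          # degenerate (collinear) sample, skip
            continue
        a, b, c = n[0] / norm, n[1] / norm, n[2] / norm
        d = -(a * p1[0] + b * p1[1] + c * p1[2])
        # Point-to-plane distance is |a*x + b*y + c*z + d| for a unit normal.
        inliers = [p for p in points
                   if abs(a * p[0] + b * p[1] + c * p[2] + d) < thresh]
        if len(inliers) > len(best_inliers):
            best_plane, best_inliers = (a, b, c, d), inliers
    return best_plane, best_inliers
```

On a synthetic cloud consisting of a flat grid plus a few off-plane outliers, the recovered plane collects exactly the grid points as inliers and its unit normal points along the z axis.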



Abstract

The invention discloses a feature matching method based on depth re-projection and space consistency. The method comprises the following steps: obtaining an RGBD image and a depth image through an RGBD camera, generating a three-dimensional point cloud, and extracting a plane through RANSAC; performing feature point extraction on the generated point cloud image by using ORB, and establishing a descriptor; establishing a rough correspondence between the two frames through a KNN algorithm; obtaining a more reliable correspondence by utilizing the spatial consistency of the feature points, performing feature matching, assigning 3D coordinates, and obtaining reliable matched features through graph optimization. By utilizing the spatial consistency of the feature points, the method extracts inter-frame correspondences more reliably, thereby improving the accuracy and robustness of feature matching.
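The rough KNN matching stage mentioned above can be sketched as brute-force 2-nearest-neighbour search over binary (ORB-style) descriptors under Hamming distance, with a ratio test to reject ambiguous matches. This is a common baseline, not necessarily the patent's exact procedure; the names `knn_match` and `hamming` and the ratio value are illustrative assumptions.

```python
def hamming(a, b):
    """Hamming distance between two equal-length binary descriptors (ints)."""
    return bin(a ^ b).count("1")

def knn_match(desc_a, desc_b, ratio=0.8):
    """Brute-force 2-NN matching of binary descriptors with a ratio test.

    desc_a, desc_b: lists of integer bit-strings (ORB-style binary
    descriptors). For each descriptor in desc_a, find its two nearest
    neighbours in desc_b and keep the match only when the best distance
    is clearly smaller than the second best (Lowe's ratio test).
    Returns a list of (index_a, index_b) pairs.
    """
    matches = []
    for i, da in enumerate(desc_a):
        # Sort all candidates in desc_b by Hamming distance to da.
        dists = sorted((hamming(da, db), j) for j, db in enumerate(desc_b))
        if len(dists) >= 2:
            (d1, j1), (d2, _) = dists[0], dists[1]
            if d1 < ratio * d2:
                matches.append((i, j1))
    return matches
```

In practice the matches produced here are only a rough correspondence; the spatial-consistency and graph-optimization stages described in the abstract are what prune them into a reliable set.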

Description

Technical Field

[0001] The invention relates to the field of autonomous navigation for intelligent robots and unmanned aerial vehicles, and in particular to a feature matching method based on depth reprojection and spatial consistency.

Background

[0002] With the continuous development of computer vision technology and the continuous progress of autonomous navigation technology, visual SLAM is becoming a research hotspot in the fields of drones and intelligent robots. The main research goal of SLAM is to localize a device and construct a three-dimensional map of its surroundings at the same time: a device equipped with special sensors estimates its own motion state throughout the entire motion process and builds a model of the environment without any prior information. Visual odometry estimates the pose change of the camera relative to its surrounding environment. [0003] In visual SLAM systems, the commonly used camera types are ...

Claims


Application Information

IPC(8): G06T7/73; G06K9/62
CPC: G06T7/73; G06F18/22
Inventors: 张涛, 张硕骁, 魏宏宇, 颜亚雄, 陈浩
Owner: SOUTHEAST UNIV