
Weight adaptive pose estimation method based on point-line fusion

A pose estimation method with weight self-adaptation, applied in computing, image analysis, image enhancement, and related fields. It addresses problems such as uneven feature distribution degrading the pose estimation result, with effects including improved description accuracy and descriptive power, and improved adaptability and robustness.

Pending Publication Date: 2019-12-13
HEBEI UNIV OF TECH

Problems solved by technology

In common weight-distribution algorithms, the numbers of point features and line features in each frame are counted, and weights are assigned to points and lines in proportion to those counts. This achieves weight distribution, but it cannot resolve the impact of an uneven spatial distribution of features on pose estimation.

The document with application number 201810213575.X discloses a fast and robust RGB-D indoor 3D scene reconstruction method that uses an RGB-D camera as the sensor and performs feature extraction, description, and matching of the environment through a point-line fusion algorithm. It improves the stability of the algorithm in complex environments to a certain extent, but the problem of unevenly distributed features remains and seriously degrades pose estimation.
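The count-based weighting scheme criticized above can be sketched as follows. This is an illustrative reconstruction, not code from the patent; the function name and the equal-weight fallback are assumptions. Note that the weights depend only on totals, so they cannot react to features clustering in one part of the image, which is exactly the shortcoming described.

```python
def assign_weights(num_points, num_lines):
    """Weight point and line features by their relative counts in a frame.

    This is the simple count-proportional scheme: the spatial distribution
    of the features is ignored, only their totals matter.
    """
    total = num_points + num_lines
    if total == 0:
        return 0.5, 0.5  # no features detected: fall back to equal weights
    return num_points / total, num_lines / total

# A frame with 120 point features and 30 line features:
w_point, w_line = assign_weights(120, 30)
print(w_point, w_line)  # 0.8 0.2
```

All 120 points could lie in one corner of the image and the weights would be unchanged, which is why the patent instead adapts weights per grid region.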


Detailed Description of Embodiments

[0027] The present invention will be further described below in conjunction with the embodiments and accompanying drawings. The specific embodiments are only used to further describe the present invention in detail, and do not limit the protection scope of the claims of the present application.

[0028] The present invention provides a weight-adaptive pose estimation method based on point-line fusion (hereinafter, the method), characterized in that the method comprises the following steps:

[0029] Step 1: Use a binocular camera to collect images and obtain a continuous image sequence: the binocular camera captures scene information in the environment in real time, and the captured frames form a continuous image sequence;

[0030] Step 2: Perform feature extraction and processing on each image to obtain the total number n_f of point features and line-feature endpoints in each frame, together with their pixel coordinate set C_j:

[0031] (1) Use the ORB alg...
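The bookkeeping of Step 2 can be sketched as pooling point features and line-segment endpoints into one per-frame coordinate set. This is a minimal illustration under assumed interfaces: the function name is hypothetical, and real feature detection would use detectors such as ORB for points and a line-segment detector for lines, which are stubbed out here as plain input lists.

```python
def frame_coordinates(points, lines):
    """Collect the per-frame coordinate set described in Step 2.

    points: list of (x, y) point-feature pixel coordinates.
    lines:  list of ((x1, y1), (x2, y2)) detected line segments.
    Returns (n_f, C): the total count of point features plus line-feature
    endpoints, and the pooled pixel-coordinate list.
    """
    coords = list(points)
    for p1, p2 in lines:
        coords.extend([p1, p2])  # each line contributes its two endpoints
    return len(coords), coords

# Two point features and one line segment -> 2 + 2 endpoints = 4 entries.
n_f, C = frame_coordinates([(10, 20), (30, 40)], [((0, 0), (5, 5))])
print(n_f)  # 4
```

Treating line endpoints as ordinary coordinates is what later allows the grid-based region division to weight points and line endpoints uniformly.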


Abstract

The invention discloses a weight-adaptive pose estimation method based on point-line fusion. The method uses a point-line fusion algorithm to extract and match features of the environment and thereby describe it. The point-line fusion algorithm improves adaptability and robustness to the environment and also improves descriptive power, while environment features can be stably extracted in a variety of complex scenes. A region-division and region-growing scheme adapts to the distribution and density of point features and line-feature endpoints, and the weights of the point features and line-feature endpoints within each grown grid cell are then adjusted self-adaptively, so that the influence of uneven feature distribution on pose estimation is reduced to the greatest extent. The method computes the reprojection error of a line feature from the distances between the two endpoints of the projected line segment and the corresponding detected straight line; the calculation is thus split into two parts, the two endpoints do not interfere with each other when they fall in different grid cells, and the description accuracy of the line-feature reprojection error is improved.
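The line-feature reprojection error described in the abstract can be sketched as two independent point-to-line distances, one per projected endpoint. This is an illustrative reconstruction, not the patent's exact formulation: the function names are assumptions, and the detected line is assumed to be given in homogeneous form l = (a, b, c) with ax + by + c = 0.

```python
import math

def point_line_distance(pt, line):
    """Perpendicular distance from pixel (x, y) to line ax + by + c = 0."""
    a, b, c = line
    x, y = pt
    return abs(a * x + b * y + c) / math.hypot(a, b)

def line_reprojection_error(p1, p2, detected_line):
    """Two independent error terms, one for each projected endpoint.

    Because the terms are separate, each endpoint can carry the weight of
    whatever grid cell it falls in, without interfering with the other.
    """
    return (point_line_distance(p1, detected_line),
            point_line_distance(p2, detected_line))

# Projected endpoints (1, 0) and (3, 0) against the detected line y = 2:
e1, e2 = line_reprojection_error((1.0, 0.0), (3.0, 0.0), (0.0, 1.0, -2.0))
print(e1, e2)  # 2.0 2.0
```

Splitting the error this way is what lets the per-grid weights of the two endpoints differ when they land in differently weighted regions.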

Description

Technical Field

[0001] The invention belongs to the field of image processing and visual positioning, and in particular relates to a weight-adaptive pose estimation method based on point-line fusion.

Background Art

[0002] With the development of robot technology, vision-based simultaneous localization and mapping (SLAM) has gradually become a research hotspot among key robot technologies. Visual SLAM uses information extracted by visual sensors for simultaneous localization and map building, recovering the pose trajectory of the moving robot together with environmental information, and plays an increasingly important role in robot navigation systems. At present, the visual-odometry computation can be completed in real time using different types of cameras, including monocular, binocular, and RGB-D cameras.

[0003] In order to adapt to feature extraction and matching in low-texture structured scenes, point-line fusion is ...

Claims


Application Information

IPC (IPC8): G06T7/73; G06T7/11; G06T7/187
CPC: G06T7/74; G06T7/11; G06T7/187; G06T2207/10021
Inventors: Zhang Jianhua, Zhou Youjie, Li Hui, Xue Yuan, Zhao Yan, He Wei, Zhang Lin
Owner HEBEI UNIV OF TECH