
Real-time self-generating pose calculation method based on variance component estimation theory

A technology combining variance component estimation with a pose calculation method, applied in the fields of computing, navigation calculation tools, and computer components, which addresses the problem that a precise solution of the pose information is otherwise difficult to guarantee.

Active Publication Date: 2021-09-28
UNIV OF ELECTRONICS SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

However, when the visual odometry in the above process is computed and the reprojection error is used to construct the nonlinear equations for pose optimization, the influence of the random errors of the visual measurement device is not accurately accounted for. The calculation typically relies only on empirical rules to reject feature points, and the nonlinear optimization adjusts the variance-covariance propagation purely by experience, so a precise solution of the pose information is difficult to guarantee.
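For reference, the reprojection-error cost that such a back end typically minimizes can be written as below. This is the standard textbook formulation rather than a formula quoted from the patent; the per-observation weight matrix Sigma_j is exactly the quantity the invention proposes to estimate from variance components instead of fixing by experience.

```latex
\min_{T,\;\{P_j\}} \; \sum_{j} \big\| u_j - \pi(T\,P_j) \big\|^{2}_{\Sigma_j^{-1}}
```

Here u_j is the observed pixel coordinate of feature j, P_j its three-dimensional point, T the camera pose, pi(.) the camera projection function, and the squared norm is the Mahalanobis norm under the measurement covariance Sigma_j.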




Embodiment Construction

[0073] The following is a detailed description of the visual navigation algorithm based on variance component estimation in the present invention:

[0074] Step 1: Use the visual sensor to acquire image measurement information in real time and complete system initialization. For the binocular stereo camera, initialization is realized through feature point extraction, feature point matching between the left and right camera images, and establishment of the unit origin position, as sketched below.
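A minimal sketch of this initialization step, assuming OpenCV is available. The ORB detector, the Hamming brute-force matcher, and the ratio-test threshold are illustrative choices; the patent does not name a specific detector or matcher.

```python
import cv2

def initialize_stereo(left_img, right_img):
    """Extract and match feature points between the left and right images
    of a rectified stereo pair; a minimal initialization sketch."""
    orb = cv2.ORB_create(nfeatures=2000)          # detector choice is an assumption
    kp_l, des_l = orb.detectAndCompute(left_img, None)
    kp_r, des_r = orb.detectAndCompute(right_img, None)

    # Brute-force Hamming matching with Lowe's ratio test to reject ambiguous pairs
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    raw = matcher.knnMatch(des_l, des_r, k=2)
    good = [m[0] for m in raw
            if len(m) == 2 and m[0].distance < 0.75 * m[1].distance]

    # Keep matches consistent with a rectified pair: small vertical offset,
    # positive disparity (u_left > u_right for points in front of the camera)
    pairs = []
    for m in good:
        ul, vl = kp_l[m.queryIdx].pt
        ur, vr = kp_r[m.trainIdx].pt
        if abs(vl - vr) < 2.0 and ul > ur:
            pairs.append(((ul, vl), (ur, vr)))
    return pairs
```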

[0075] Step 2: After the initialization is completed, use the measurement information to calculate the coordinates of the three-dimensional space points.

[0076] Step 2.1: When the visual sensor is used for pose calculation, the spatial coordinates of the binocular measurements are computed directly from the camera baseline information, as follows:

[0077]

[0078] Here, f is the focal length of the camera and b is the baseline size o...
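The equation of paragraph [0077] does not survive in this extract. For a rectified binocular pair with focal length f, baseline b, principal point (c_x, c_y), left-image pixel (u_L, v_L) and disparity d = u_L - u_R, the standard triangulation used in this situation is as follows (stated here as an assumption consistent with the definitions in [0078], not as the patent's own formula):

```latex
z = \frac{f\,b}{d}, \qquad
x = \frac{(u_L - c_x)\,z}{f}, \qquad
y = \frac{(v_L - c_y)\,z}{f}
```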



Abstract

The invention discloses a real-time self-generating pose calculation method based on variance component estimation theory, belonging to the technical field of simultaneous localization and mapping. The method first completes system initialization using visual measurement information, then calculates the three-dimensional space coordinates of the feature points in the image according to the imaging characteristics of the measurement device, performs clustering, computes the weight distribution of the measurement information by variance component estimation, and finally substitutes the results into the nonlinear optimization equations of the whole system. In use, the method is not only suitable for visual navigation algorithms but also improves visual/inertial fusion navigation algorithms. By using the method to optimize the random errors in the system, the accuracy and reliability of the system are improved.
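As an illustration of the weighting step described above, the following sketch implements a simplified Helmert-type variance component estimation that re-scales the weights of several measurement groups (for example, one group per feature-point cluster). The grouping, the stopping rule, and the function name are assumptions for illustration; the patent's exact formulation is not reproduced here.

```python
import numpy as np

def vce_reweight(groups, max_iter=20, tol=1e-4):
    """Simplified Helmert-type variance component estimation.

    groups: list of (A_i, l_i, P_i) with design matrix A_i, observation
    vector l_i and initial weight matrix P_i for each measurement group.
    Returns the estimated parameters x and the re-weighted P_i.
    A sketch under the stated assumptions, not the patent's exact algorithm.
    """
    groups = [(A, l, P.copy()) for A, l, P in groups]
    for _ in range(max_iter):
        # Combined normal equations of the weighted least-squares problem
        N = sum(A.T @ P @ A for A, l, P in groups)
        u = sum(A.T @ P @ l for A, l, P in groups)
        N_inv = np.linalg.inv(N)
        x = N_inv @ u

        sigma2 = []
        for A, l, P in groups:
            v = A @ x - l                                   # group residuals
            r = len(l) - np.trace(N_inv @ (A.T @ P @ A))    # redundancy share
            sigma2.append(float(v.T @ P @ v) / r)           # variance component

        # Re-scale each group's weights by its estimated variance component
        groups = [(A, l, P / s2) for (A, l, P), s2 in zip(groups, sigma2)]
        if max(sigma2) / min(sigma2) - 1.0 < tol:           # components balanced
            break
    return x, [P for _, _, P in groups]
```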

Description

Technical field

[0001] The invention belongs to the technical field of Simultaneous Localization and Mapping (SLAM), and specifically relates to a method that optimizes the stochastic model by variance-covariance component estimation so as to improve the accuracy of pose calculation in a visual navigation system.

Background technique

[0002] In recent years, with the vigorous development of industries such as sweeping robots, autonomous driving, unmanned aerial vehicles, and VR, intelligent navigation and positioning equipment based on simultaneous localization and mapping (SLAM) has come to serve daily life widely. Existing visual navigation methods mainly comprise the following steps: 1. read the visual sensor to obtain the camera's real-time image information; 2. the front-end visual odometry uses the obtained image information to calculate the camera motion between adjacent images; 3. the back-end performs nonlinear optimization, using visual o...
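The front-end visual odometry step mentioned above is commonly realized by solving a PnP problem between adjacent frames. The sketch below uses OpenCV's RANSAC PnP solver as one possible realization; the patent does not prescribe this particular solver, and the function name and parameter choices are assumptions.

```python
import cv2
import numpy as np

def frontend_motion(pts3d_prev, pts2d_curr, K):
    """Estimate camera motion between adjacent frames from 3-D points
    triangulated in the previous frame and their 2-D matches in the
    current frame (a common visual-odometry front end)."""
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        np.asarray(pts3d_prev, dtype=np.float64),
        np.asarray(pts2d_curr, dtype=np.float64),
        K, None,
        reprojectionError=2.0, flags=cv2.SOLVEPNP_ITERATIVE)
    R, _ = cv2.Rodrigues(rvec)      # rotation matrix of the relative pose
    return ok, R, tvec, inliers
```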


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T7/73, G06K9/62, G06F17/16, G06F17/11, G01C21/20
CPC: G06T7/73, G06F17/16, G06F17/11, G01C21/20, G06F18/23213
Inventor: 周泽波, 田学海
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA