
Monocular vision/inertia autonomous navigation method for indoor environment

A technology combining monocular vision and autonomous navigation, applied in the fields of visual navigation, inertial navigation, and navigation generally. It addresses the problems of inaccurate matching results and insufficient robustness, and achieves the effects of a simplified algorithm, improved reliability, and safe, effective autonomous navigation.

Active Publication Date: 2012-05-02
NANJING UNIV OF AERONAUTICS & ASTRONAUTICS
4 Cites, 128 Cited by

AI Technical Summary

Problems solved by technology

A common feature point extraction algorithm is the Harris method, but its features are not robust enough for matching points under the image transformations described above. The Scale Invariant Feature Transform (SIFT), a scale-invariant feature extraction algorithm with wider applicability in general image matching, offers strong robustness and good real-time performance among similar operators; however, some mismatches still occur during its matching process, so the matching results are not accurate enough.
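Lowe's ratio test is a standard way to filter out the kind of SIFT mismatches described here (a common technique, not something claimed by this patent): a candidate match is kept only when its nearest-neighbour descriptor distance is markedly smaller than the second-nearest, so ambiguous matches are rejected. A minimal numpy sketch with toy two-dimensional descriptors:

```python
import numpy as np

def ratio_test_match(desc1, desc2, ratio=0.8):
    """Match each descriptor in desc1 against desc2, keeping only matches
    whose nearest-neighbour distance is clearly smaller than the
    second-nearest (Lowe's ratio test)."""
    matches = []
    for i, d in enumerate(desc1):
        dists = np.linalg.norm(desc2 - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        if dists[best] < ratio * dists[second]:
            matches.append((i, best))
    return matches

# Toy descriptors: rows 0 and 1 of desc1 have clear counterparts in desc2;
# row 2 is ambiguous (two near-identical candidates) and gets rejected.
desc1 = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
desc2 = np.array([[1.0, 0.05], [0.05, 1.0], [0.5, 0.45], [0.5, 0.55]])
print(ratio_test_match(desc1, desc2))   # -> [(0, 0), (1, 1)]
```

Real SIFT descriptors are 128-dimensional, but the filtering logic is identical.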




Embodiment Construction

[0023] The method of the present invention proceeds as shown in Figure 1 and mainly includes the following steps:

[0024] Step 1: Calibrate the internal parameters of the camera on the carrier to obtain the projection relationship of spatial feature points from the world coordinate system to the image coordinate system, and perform nonlinear optimization of the camera's distortion;
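The projection relationship of Step 1 can be sketched as the standard pinhole model x ~ K[R | t]X, with a polynomial radial-distortion term of the kind the nonlinear optimization would refine. The intrinsics and distortion coefficients below are illustrative made-up values, not the patent's calibration results:

```python
import numpy as np

def project(K, R, t, Xw):
    """Project a 3-D world point Xw to pixel coordinates through a
    pinhole camera with intrinsics K and pose (R, t): x ~ K [R | t] Xw."""
    Xc = R @ Xw + t                 # world frame -> camera frame
    x = Xc / Xc[2]                  # perspective division
    u = K @ x                       # apply intrinsics
    return u[:2]

def distort_radial(xn, k1, k2):
    """Two-term radial distortion on normalized image coordinates
    xn = (x, y): x_d = x * (1 + k1*r^2 + k2*r^4)."""
    r2 = xn[0]**2 + xn[1]**2
    return xn * (1 + k1 * r2 + k2 * r2**2)

# Illustrative intrinsics: focal length 800 px, principal point (320, 240).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                       # camera aligned with the world frame
t = np.array([0.0, 0.0, 0.0])
Xw = np.array([0.1, 0.2, 2.0])      # a point 2 m in front of the camera

print(project(K, R, t, Xw))         # -> [360. 320.]
```

Calibration estimates K and the distortion coefficients k1, k2 from views of a known target; the projection above is then used in every subsequent geometry step.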

[0025] Step 2: Use the camera to collect an image sequence, and extract spatial feature point information from consecutive frame pairs of the sequence using the scale-invariant feature extraction algorithm;

[0026] Step 3: Perform initial image matching based on the spatial feature point information obtained in Step 2 to obtain an initial matching result; adaptively adjust the impact factor of the scale-invariant feature extraction algorithm of Step 2 according to the amount of matched spatial feature point information, obtaining at least 7 pa...



Abstract

The invention discloses a monocular vision/inertia autonomous navigation method for an indoor environment, belonging to the fields of visual navigation and inertial navigation. The method comprises the following steps: acquiring feature point information based on local invariant features of images; solving a fundamental matrix using the epipolar geometry formed by the parallax generated by camera movement; solving an essential matrix using the calibrated camera internal parameters; acquiring camera position information from the essential matrix; and finally combining the visual navigation information with the inertial navigation information to obtain accurate and reliable navigation information, while carrying out 3D reconstruction of the spatial feature points to obtain an environment information map and complete the autonomous navigation of the carrier. The invention realizes autonomous navigation of a carrier in an unfamiliar indoor environment without relying on a cooperative target, and the method has the advantages of high reliability and low implementation cost.
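The abstract's step from fundamental matrix to camera position follows the standard relation E = K^T F K, after which the essential matrix is decomposed by SVD into two candidate rotations and a translation direction (the true pair is selected by a cheirality test). A sketch of that textbook decomposition, using a made-up motion rather than data from the patent:

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix such that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def decompose_essential(E):
    """Decompose an essential matrix into two candidate rotations and the
    translation direction (R1, R2, t); the true (R, t) is one of the four
    combinations (R1, +-t), (R2, +-t), picked by a cheirality test."""
    U, _, Vt = np.linalg.svd(E)
    if np.linalg.det(U) < 0:        # force proper rotations
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
    R1 = U @ W @ Vt
    R2 = U @ W.T @ Vt
    t = U[:, 2]                     # null direction of E^T (up to sign)
    return R1, R2, t

# Build an essential matrix from a known motion: E = [t]_x R.
# (Given F and intrinsics K instead, one would first form E = K.T @ F @ K.)
R_true = np.eye(3)
t_true = np.array([1.0, 0.0, 0.0])
E = skew(t_true) @ R_true

R1, R2, t = decompose_essential(E)
print(np.allclose(np.abs(t), np.abs(t_true)))   # -> True (recovered up to sign)
```

Note that the translation is only recovered up to scale; in this method the inertial sensors and the combined navigation filter supply the missing metric scale.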

Description

Technical field

[0001] The invention relates to a navigation method, in particular to a monocular vision/inertial fully autonomous navigation method for indoor environments, and belongs to the fields of visual navigation and inertial navigation.

Background technique

[0002] With the development of technologies such as micro unmanned aerial vehicles and small autonomous robots, the use of advanced navigation methods to achieve autonomous driving/flying in indoor environments is a prerequisite for their application. On the basis of this technology, tasks such as dangerous terrain detection, anti-terrorist investigation, and indoor target search are of great significance.

[0003] The main elements to consider in realizing autonomous navigation in an unfamiliar indoor environment are: 1. overcoming the inability to effectively acquire wireless signals such as GPS in indoor environments; 2. planning, etc.; 3. the load problem of the micro UAV itsel...


Application Information

IPC(8): G01C21/00, G01C21/16
Inventor: 曾庆化, 庄曈, 刘建业, 熊智, 李荣冰, 孙永荣, 赵伟, 董良, 倪磊
Owner NANJING UNIV OF AERONAUTICS & ASTRONAUTICS