
A fast monocular visual odometry navigation and positioning method combining the feature point method and the direct method

A monocular vision and feature fusion technology applied in the field of navigation and positioning. It addresses problems such as the absence of prior solutions that fuse the feature point method and the direct method for navigation and positioning, poor robustness to large-baseline motion, and high sensitivity to camera intrinsic parameters, achieving a reduced average tracking time, improved robustness and stability, and a higher running frame rate.

Active Publication Date: 2019-03-29
GUANGZHOU UNIVERSITY

AI Technical Summary

Problems solved by technology

Compared with the feature point method, the direct method executes faster because it does not need to extract image features, and it is more robust to photometric error in the image. However, it places high demands on the camera's intrinsic parameters, and its performance degrades rapidly in the presence of geometric noise. The direct method can still localize the camera when the image is motion-blurred, but its robustness to large-baseline motion is poor.
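To illustrate why the direct method is so sensitive to camera intrinsics and geometric noise, the following is a minimal sketch of the photometric error that direct tracking minimizes. The pose variables, the `project` helper, and the nearest-neighbour sampling are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def project(K, R, t, pts_3d):
    """Project 3D points (N, 3) into pixel coordinates (N, 2) of the current camera."""
    cam = pts_3d @ R.T + t            # transform into the current camera frame
    uv = cam @ K.T                    # apply the intrinsic matrix K
    return uv[:, :2] / uv[:, 2:3]     # perspective division

def photometric_error(I_ref, I_cur, px_ref, pts_3d, K, R, t):
    """Sum of squared intensity differences between reference pixels and their
    reprojections in the current image (nearest-neighbour sampling, for clarity)."""
    px_cur = project(K, R, t, pts_3d)
    h, w = I_cur.shape
    err = 0.0
    for (u0, v0), (u1, v1) in zip(px_ref, px_cur):
        if 0 <= u1 < w and 0 <= v1 < h:
            err += (float(I_ref[int(v0), int(u0)]) -
                    float(I_cur[int(v1), int(u1)])) ** 2
    return err
```

Because every residual depends on the projection through K, an error in the intrinsics (geometric noise) shifts every sampled pixel and degrades the whole cost function, whereas the feature point method only needs intrinsics at the pose-estimation stage.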
[0004] In the prior art, there is no technical solution that fuses the feature point method and the direct method for navigation and positioning. How to overcome the respective difficulties of the two methods and, based on their characteristics, fuse the feature point method with the direct method is therefore one of the research directions of technicians in this field.



Examples


Embodiment

[0070] As shown in Figure 1, the present embodiment is a fast monocular visual odometry navigation and positioning method that combines the feature point method and the direct method, including the following steps:

[0071] S1. Start the visual odometry and acquire the first image frame I1, convert it to a grayscale image, extract ORB feature points, and construct an initialization keyframe, as sketched below.
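A minimal sketch of step S1 using OpenCV's ORB detector; the camera source, parameter values, and the keyframe data structure are assumptions for illustration, not the patent's exact settings:

```python
import cv2

cap = cv2.VideoCapture(0)                     # hypothetical camera source
ok, frame = cap.read()                        # first frame I1
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

orb = cv2.ORB_create(nfeatures=1000)          # ORB feature extractor
keypoints, descriptors = orb.detectAndCompute(gray, None)

# A simple stand-in for the "initialization keyframe": keep the image,
# its keypoints, and their descriptors together for later matching.
init_keyframe = {"image": gray, "kp": keypoints, "desc": descriptors}
```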

[0072] S2. Determine whether it has been initialized; if it has been initialized, go to step S6, otherwise go to step S3.

[0073] S3. Define a reference frame and a current frame, extract ORB features, and perform feature matching.
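A minimal sketch of the matching in step S3, assuming the reference and current frames are stored like the `init_keyframe` dictionary above; the brute-force Hamming matcher and the Lowe ratio threshold are common choices for ORB and are assumptions here:

```python
import cv2

bf = cv2.BFMatcher(cv2.NORM_HAMMING)                  # Hamming distance for binary ORB descriptors
matches = bf.knnMatch(ref["desc"], cur["desc"], k=2)  # two nearest neighbours per descriptor

good = []
for pair in matches:
    # Lowe ratio test: keep a match only if it is clearly better than the runner-up.
    if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
        good.append(pair[0])

pts_ref = [ref["kp"][m.queryIdx].pt for m in good]    # matched pixel coordinates, reference frame
pts_cur = [cur["kp"][m.trainIdx].pt for m in good]    # matched pixel coordinates, current frame
```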



Abstract

The invention discloses a fast monocular visual odometry navigation and positioning method fusing a feature point method and a direct method, which comprises the following steps: S1, starting the visual odometry, obtaining a first image frame I1, converting it into a grayscale image, extracting ORB feature points, and constructing an initialization keyframe; S2, judging whether initialization has been carried out; if it has, going to step S6, otherwise going to step S3; S3, defining a reference frame and a current frame, extracting ORB features, and matching features; S4, simultaneously calculating a homography matrix H and a fundamental matrix F in parallel threads, computing a model selection score RH; if RH is greater than a threshold value, selecting the homography matrix H, otherwise selecting the fundamental matrix F, and estimating the camera motion according to the selected model; S5, obtaining the camera pose and the initial 3D points; S6, judging whether feature points have been extracted; if they have not, tracking with the direct method, otherwise tracking with the feature point method; S7, completing the initial camera pose estimation. The invention can perform navigation and positioning more precisely.
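The model selection in step S4 can be sketched as follows. The abstract does not define the score RH, so the sketch below uses RANSAC inlier counts and an ORB-SLAM-style threshold as stand-ins; the function name, threshold, and scoring rule are assumptions, not the patent's definition:

```python
import numpy as np
import cv2

def select_model(pts_ref, pts_cur, rh_threshold=0.45):
    """Estimate both a homography H and a fundamental matrix F from matched
    points, score the two models, and choose one for motion estimation."""
    p0 = np.float32(pts_ref)
    p1 = np.float32(pts_cur)

    H, h_mask = cv2.findHomography(p0, p1, cv2.RANSAC, 3.0)
    F, f_mask = cv2.findFundamentalMat(p0, p1, cv2.FM_RANSAC, 3.0, 0.99)
    if H is None or F is None:
        raise ValueError("model estimation failed")  # too few matches or degenerate geometry

    s_h = int(h_mask.sum())          # crude score: RANSAC inlier count for H
    s_f = int(f_mask.sum())          # crude score: RANSAC inlier count for F
    r_h = s_h / (s_h + s_f)          # model-selection ratio (stand-in for RH)

    # Planar or low-parallax scenes favour H; general scenes favour F.
    return ("H", H) if r_h > rh_threshold else ("F", F)
```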

Description

Technical Field

[0001] The invention belongs to the technical field of navigation and positioning, and relates to a fast monocular visual odometry navigation and positioning method that combines the feature point method and the direct method.

Background Technique

[0002] SLAM (simultaneous localization and mapping) refers to a robot starting from an unknown position in an unknown environment, localizing itself during motion according to pose estimates and the map, and incrementally building the map based on its own localization, so as to realize autonomous positioning and navigation of the robot. As an important part of a visual SLAM system, visual odometry largely determines the accuracy and speed of the visual SLAM method.

[0003] Visual odometry mainly adopts two calculation methods: the feature point method and the direct method. The feature point method first extracts image feature points and descriptors, and then calcula...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/80; G06T7/33; G01C22/00
CPC: G06T7/33; G06T7/80; G01C22/00; G06T2207/20016
Inventor: 朱静汪程辉吕鹏浩苏启彬花明吴羽姚佳岷
Owner: GUANGZHOU UNIVERSITY