Real-time three-dimensional reconstruction key frame determination method based on position and orientation changes

A real-time three-dimensional reconstruction and pose-change technology, applied in the field of visual autonomous navigation for robots and UAVs. It addresses the problems of existing methods, which do not consider the image quality and positioning quality of key frames, do not consider changes in the camera's field of view, and waste system running time; the method achieves low computational cost, high stability, and a low error rate.

Active Publication Date: 2015-04-22
NORTHWESTERN POLYTECHNICAL UNIV

Problems solved by technology

[0008] 1. Without considering the image quality and positioning quality of the current frame, it is easy to insert low-quality or even wrong frames, with serious consequences for the system;
[0009] 2. In some cases, such as when the camera shakes back and forth in the same place, a large number of redundant key frames are generated, wasting system running time.
[0012] 1. Since the key frame selection method does not consider the field-of-view changes caused by camera rotation, no key frame is added when the camera rotates a lot but translates little;
[0013] 2. Rigidly requiring very high positioning quality before a frame can be selected as a key frame is impractical in real situations. If no key frame is added at a critical moment, subsequent frames may fail to be positioned, causing the whole system to fail.

Method used



Detailed Description of Embodiments

[0040] The present invention is described below in conjunction with specific embodiments:

[0041] For each image captured by the camera in real time, four steps are required to determine whether it is selected as a key frame. The first step is already part of the navigation application; the first two steps provide the judgment data for the last two steps, which finally decide whether the current frame is a key frame.
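The four-step decision can be sketched as a small pipeline. This is only an illustrative reading of the text, not the patent's actual algorithm: the threshold values (`QUALITY_MIN`, `VIEW_DIFF_MIN`), the inlier-ratio quality score, and the linear view-change score are all hypothetical choices made here for the sake of a runnable example.

```python
import math

# Hypothetical thresholds; the patent does not disclose concrete values.
QUALITY_MIN = 0.6      # minimum positioning-quality score to trust a frame
VIEW_DIFF_MIN = 0.25   # minimum view change required to add a key frame

def positioning_quality(num_inliers: int, num_matches: int) -> float:
    """Step 2: score localisation quality as the feature-match inlier ratio."""
    return 0.0 if num_matches == 0 else num_inliers / num_matches

def view_change_score(translation_m: float, rotation_rad: float,
                      trans_scale: float = 1.0,
                      rot_scale: float = math.pi / 4) -> float:
    """Step 3: combine translation AND rotation into one view-change score,
    so that a large rotation alone can still trigger a key frame."""
    return translation_m / trans_scale + rotation_rad / rot_scale

def should_insert_keyframe(num_inliers: int, num_matches: int,
                           translation_m: float, rotation_rad: float) -> bool:
    """Step 4: insert a key frame only if the frame is well localised
    and it sees enough new content relative to the last key frame."""
    if positioning_quality(num_inliers, num_matches) < QUALITY_MIN:
        return False  # reject poorly localised frames (avoids wrong insertions)
    return view_change_score(translation_m, rotation_rad) >= VIEW_DIFF_MIN
```

Note how a pure rotation (translation 0, rotation 30°) still passes the view-change test, addressing the rotation-blindness criticised in paragraph [0012], while a well-localised but stationary frame is rejected as redundant.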

[0042] Step 1: current frame positioning. For robot autonomous navigation, only a small number of key frames are needed for map creation, while positioning runs at a much higher frequency. For the image frame currently acquired by the camera, the camera's pose in the map can be obtained by matching feature points against the existing map. The obtained pose information is used for navigation on the one hand, and on the other hand serves as the judgment condition for key-frame selection. In this process...
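As a toy illustration of pose estimation from matched feature points, the sketch below solves the planar (2-DoF rotation + translation) analogue of the problem in closed form. The real system would estimate a full 6-DoF pose (e.g. via PnP against 3D map points); this simplified 2D version exists only to make the idea concrete, and all point data in the example are invented.

```python
import math

def estimate_pose_2d(frame_pts, map_pts):
    """Least-squares rigid alignment (rotation theta, translation tx, ty)
    of matched 2D feature points -- a planar stand-in for full 6-DoF PnP."""
    n = len(frame_pts)
    # Centroids of both point sets.
    fcx = sum(x for x, _ in frame_pts) / n
    fcy = sum(y for _, y in frame_pts) / n
    mcx = sum(x for x, _ in map_pts) / n
    mcy = sum(y for _, y in map_pts) / n
    # Accumulate dot/cross terms of the centred correspondences.
    s_dot = s_cross = 0.0
    for (fx, fy), (mx, my) in zip(frame_pts, map_pts):
        ax, ay = fx - fcx, fy - fcy
        bx, by = mx - mcx, my - mcy
        s_dot += ax * bx + ay * by
        s_cross += ax * by - ay * bx
    theta = math.atan2(s_cross, s_dot)
    # Translation maps the rotated frame centroid onto the map centroid.
    tx = mcx - (fcx * math.cos(theta) - fcy * math.sin(theta))
    ty = mcy - (fcx * math.sin(theta) + fcy * math.cos(theta))
    return theta, tx, ty

# Example: points rotated 90 degrees and shifted by (2, 3) are recovered exactly.
theta, tx, ty = estimate_pose_2d([(0, 0), (1, 0), (0, 1)],
                                 [(2, 3), (2, 4), (1, 3)])
```

The number and spread of such correspondences is exactly what the next step (positioning-quality evaluation) can inspect before trusting the estimated pose.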


Abstract

The invention provides a real-time three-dimensional reconstruction key frame determination method based on position and orientation changes. The method comprises four steps: current frame positioning, positioning-quality evaluation, view-difference calculation, and judgment of whether to insert a key frame. The selected key frames are of high quality, and the method is fast and stable; by judging the positioning quality of image frames, insertion of wrong frames is effectively prevented. Meanwhile, the concept of view difference is proposed: a view-difference threshold ensures that each inserted key frame contains a certain amount of new image information, effectively reducing redundancy between key frames and lightening the burden of map building.
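The abstract's "view difference" can be pictured as the fraction of the camera's field of view that no longer overlaps the last key frame's view, shrinking with both rotation and translation. The sketch below is one plausible single-axis formulation, not the patent's formula: the 60° field of view, the 5 m reference scene depth, and the way translation is converted into a subtended angle are all assumptions for illustration.

```python
import math

def view_difference(rel_yaw_rad: float, baseline_m: float,
                    depth_m: float = 5.0,
                    fov_rad: float = math.radians(60)) -> float:
    """Fraction of the field of view that is new relative to the last
    key frame: 0.0 = identical view, 1.0 = completely new view.
    Translation is converted to the angle it subtends at the scene depth,
    so both rotation and motion contribute to the score."""
    angular_change = abs(rel_yaw_rad) + math.atan2(abs(baseline_m), depth_m)
    return min(1.0, angular_change / fov_rad)
```

With this formulation a pure rotation equal to the full field of view yields a view difference of 1.0 (an entirely new view), so rotation alone can exceed the insertion threshold, while a camera shaking in place scores near 0 and produces no redundant key frames.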

Description

Technical field

[0001] The invention relates to the field of visual autonomous navigation of robots and unmanned aerial vehicles, in particular to a method for determining key frames for real-time three-dimensional reconstruction based on pose changes.

Background technique

[0002] Simultaneous Localization and Mapping (SLAM) in the field of robotics, also known as Concurrent Mapping and Localization (CML), was first proposed by Smith, Self and Cheeseman in 1988. Due to its important theoretical and application value, it is considered by many scholars to be one of the key technologies for realizing a truly fully autonomous mobile robot. After years of development, it has been applied in a number of robots. It is mainly divided into two branches: methods based on probabilistic models, such as EKF-SLAM and Fast-SLAM, and methods based on multi-view geometry, such as Structure from Motion (SfM) in the field of machine vision. The latter can cope with larger scenes, faster movement, positioni...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T17/00
Inventor: 布树辉 (BU Shuhui), 赵勇 (ZHAO Yong), 刘贞报 (LIU Zhenbao)
Owner: NORTHWESTERN POLYTECHNICAL UNIV