
Visual ranging-based simultaneous localization and map construction method

A map-construction and visual-odometry technology, applied in image enhancement, image analysis, image data processing, etc., that addresses the problems of high computational complexity and the resulting constraints on the calculation speed of positioning and navigation algorithms, with the effect of reducing computational complexity, eliminating measurement error, and allowing the map to scale up.

Active Publication Date: 2016-04-06
Assignee: 北京超星未来科技有限公司

AI Technical Summary

Problems solved by technology

The simultaneous localization and map construction (SLAM) algorithm based on the extended Kalman filter is currently regarded as the SLAM algorithm with the best convergence properties, but its obvious drawback is very high computational complexity, which grows with the square of the number of feature points contained in the world map.
On the one hand, this quadratic complexity limits the scale of the world map, and therefore the number of features and the area of the environment that can be covered; on the other hand, it restricts the calculation speed of the positioning and navigation algorithm.
Given the low payload and low power consumption of indoor mobile robots, implementing a traditional SLAM system at real-time processing speed on such a mobile platform is therefore difficult.
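
As a rough illustration of where this quadratic cost comes from, assuming a standard planar EKF-SLAM formulation (the derivation below is illustrative only and is not taken from the patent):

```latex
% Sketch (assumed planar EKF-SLAM state, not from the patent):
% robot pose (x, y, \theta) plus n point landmarks in the world map.
\mathbf{x} = \bigl[\, x \;\; y \;\; \theta \;\; x_1 \;\; y_1 \;\cdots\; x_n \;\; y_n \,\bigr]^{\top},
\qquad \dim(\mathbf{x}) = 3 + 2n

% The filter maintains a dense covariance over the full state:
P \in \mathbb{R}^{(3+2n)\times(3+2n)}, \qquad
P \leftarrow (I - K H)\, P
\;\Longrightarrow\;
\text{each measurement update touches } (3+2n)^2 = O(n^2) \text{ covariance entries.}
```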


Examples


Detailed Description of the Embodiments

[0036] Embodiments of the present invention are described in detail below, examples of which are shown in the drawings, wherein the same or similar reference numerals designate the same or similar elements or elements having the same or similar functions throughout. The embodiments described below by referring to the figures are exemplary only for explaining the present invention and should not be construed as limiting the present invention.

[0037] A simultaneous positioning and map construction method based on visual odometry according to an embodiment of the present invention will be described below with reference to the accompanying drawings.

[0038] Figure 1 is a flowchart of a simultaneous localization and map construction method based on visual odometry according to an embodiment of the present invention. As shown in Figure 1, the method includes the following steps:

[0039] Step S1: Collect binocular images through a binocular image capture system, and correct...
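
As a concrete, simplified illustration of this first step, the sketch below shows one way to capture and rectify a stereo pair with OpenCV; the calibration file name, camera indices, and image size are assumptions for illustration and are not taken from the patent.

```python
import cv2
import numpy as np

# Assumed pre-calibrated stereo parameters (hypothetical file, not from the patent):
# K1, D1, K2, D2 are the left/right intrinsics and distortion coefficients,
# R, T the rotation and translation from the left to the right camera.
calib = np.load("stereo_calibration.npz")
K1, D1, K2, D2 = calib["K1"], calib["D1"], calib["K2"], calib["D2"]
R, T = calib["R"], calib["T"]
image_size = (640, 480)  # assumed resolution

# Compute rectification transforms so epipolar lines become horizontal,
# which is what lets a later step use purely horizontal disparity.
R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, D1, K2, D2, image_size, R, T)
map1x, map1y = cv2.initUndistortRectifyMap(K1, D1, R1, P1, image_size, cv2.CV_32FC1)
map2x, map2y = cv2.initUndistortRectifyMap(K2, D2, R2, P2, image_size, cv2.CV_32FC1)

# Grab one frame from each camera (device indices are assumptions).
cap_left, cap_right = cv2.VideoCapture(0), cv2.VideoCapture(1)
ok_l, frame_l = cap_left.read()
ok_r, frame_r = cap_right.read()

if ok_l and ok_r:
    # Remap both images into the distortion-free, row-aligned rectified frame.
    rect_l = cv2.remap(frame_l, map1x, map1y, cv2.INTER_LINEAR)
    rect_r = cv2.remap(frame_r, map2x, map2y, cv2.INTER_LINEAR)
```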



Abstract

The invention provides a visual ranging-based simultaneous localization and map construction method. The method includes the following steps: a binocular image is acquired and corrected to obtain a distortion-free binocular image; feature extraction is performed on the distortion-free binocular image to generate feature point descriptors; feature point matching relations between the two views of the binocular image are established; the horizontal parallax of the matched feature points is obtained from these matching relations, and the real spatial depth is calculated based on the parameters of the binocular image capture system; the feature points of the current frame are matched against the feature points in the world map; wrongly matched feature points are removed to obtain the successfully matched feature points; a transform matrix between the coordinates of the successfully matched feature points in the world coordinate system and their three-dimensional coordinates in the current reference coordinate system is calculated, and a pose change estimate of the binocular image capture system relative to its initial position is obtained from this transform matrix; and the world map is established and updated. The visual ranging-based simultaneous localization and map construction method of the invention has low computational complexity, centimeter-level positioning accuracy, and unbiased position estimation.
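
To make the depth and pose steps of this pipeline concrete, the following sketch (an independent illustration under a standard pinhole stereo model, not the patent's own code) shows how horizontal disparity yields real spatial depth for a rectified pair, and how a rigid transform between matched 3D points in the world frame and the current frame can be estimated with an SVD-based least-squares fit:

```python
import numpy as np

def depth_from_disparity(disparity, focal_px, baseline_m):
    """Depth Z = f * B / d for a rectified stereo pair (assumed pinhole model)."""
    d = np.asarray(disparity, dtype=np.float64)
    with np.errstate(divide="ignore"):
        return np.where(d > 0, focal_px * baseline_m / d, np.inf)

def estimate_rigid_transform(world_pts, current_pts):
    """Least-squares rigid transform (R, t) with current ≈ R @ world + t,
    computed by the SVD-based (Kabsch) method on successfully matched 3D points."""
    P = np.asarray(world_pts, dtype=np.float64)    # N x 3, world coordinates
    Q = np.asarray(current_pts, dtype=np.float64)  # N x 3, current-frame coordinates
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                      # 3 x 3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    Rmat = Vt.T @ U.T
    if np.linalg.det(Rmat) < 0:                    # guard against a reflection
        Vt[-1, :] *= -1
        Rmat = Vt.T @ U.T
    t = cQ - Rmat @ cP
    return Rmat, t

# Example: a feature with 12.5 px disparity, f = 700 px, baseline = 0.12 m
z = depth_from_disparity(12.5, focal_px=700.0, baseline_m=0.12)   # ≈ 6.72 m
```

In this sketch the returned rotation and translation play the role of the transform matrix from which the pose-change estimate relative to the initial position would be derived; the depth values supply the 3D coordinates of the matched feature points.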

Description

Technical Field

[0001] The invention relates to the technical fields of computer and electronic information, and in particular to a simultaneous localization and map construction method based on visual distance measurement.

Background Art

[0002] In recent years, mobile work platforms such as drones and mobile robots have become one of the hotspots of research. These devices are highly flexible and are widely used in disaster relief, geological survey, and other scenarios. The main technologies of a robot autonomous navigation system include building a three-dimensional map of the space environment, self-positioning, route planning, and obstacle avoidance. Among them, constructing the three-dimensional map of the environment and localizing the platform within it are the core of the problem, while route planning and obstacle avoidance rely on the platform's cognition of the environment and on its positioning results.

[0003] At present, outdoor drones already have matu...


Application Information

IPC(8): G06T7/00
CPC: G06T2207/10004
Inventors: 谷梦媛, 郭开元, 汪玉, 王文强, 杨华中
Owner: 北京超星未来科技有限公司