A binocular vision indoor positioning and mapping method and device

A binocular vision indoor positioning technology, applied in the field of positioning and navigation, that addresses the problems of poor navigation performance, poor relocalization ability, and high scene-texture requirements in existing indoor navigation methods, achieving good accuracy and robustness.

Active Publication Date: 2022-05-20
SOUTHEAST UNIV

AI Technical Summary

Problems solved by technology

[0006] The technical problem to be solved by the present invention is to overcome the high scene-texture requirements, poor relocalization ability, and poor navigation performance of existing indoor navigation methods. To this end, the invention provides a binocular vision indoor positioning and mapping method and device: an indoor positioning method based on the fusion of a visual sensor and an inertial navigation unit, able to perform positioning and map building in complex scenes.




Embodiment Construction

[0037] To make the purposes, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions of the present invention are described below with reference to the accompanying drawings. The described embodiments are some, not all, of the embodiments of the present invention.

[0038] As shown in Figure 1, the present invention provides a binocular vision indoor positioning and mapping method that performs multi-sensor fusion positioning and mapping, realizing positioning, mapping, and autonomous navigation in complex scenes. The method includes the following steps:

[0039] Step 1. Collect the left and right images in real time through the binocular vision sensor, and calculate the initial pose of the camera according to the left and right images.

[0040] First, for the left and right images obtained by the binocular vision sensor, if the image brightness is too hi...
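The initial-pose step above relies on stereo geometry: once a feature is matched between the rectified left and right images, its disparity yields depth, and a set of such 3D points lets the camera pose be estimated (e.g. via PnP against the previous frame). Below is a minimal sketch of the triangulation part only; the focal length, baseline, and principal point are hypothetical placeholder values, which in practice come from stereo calibration.

```python
import numpy as np

# Hypothetical calibration values; real ones come from stereo calibration.
FOCAL_PX = 700.0       # focal length in pixels
BASELINE_M = 0.12      # left-to-right camera baseline, metres
CX, CY = 320.0, 240.0  # principal point

def triangulate(u_left, v_left, u_right):
    """Recover a 3D point (left-camera frame) from a feature matched
    in rectified left/right images.
    Disparity d = u_left - u_right; depth Z = f * B / d."""
    d = u_left - u_right
    if d <= 0:
        raise ValueError("non-positive disparity: bad match")
    z = FOCAL_PX * BASELINE_M / d
    x = (u_left - CX) * z / FOCAL_PX
    y = (v_left - CY) * z / FOCAL_PX
    return np.array([x, y, z])

# With these parameters, 10 px of disparity corresponds to 8.4 m depth.
p = triangulate(350.0, 240.0, 340.0)
```

With enough triangulated points matched across frames, the relative camera pose can then be solved by minimizing reprojection error, which is the visual error term used later in the sliding-window optimization.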


Abstract

The invention discloses a binocular vision indoor positioning and mapping method and device, comprising: collecting left and right images in real time and calculating the initial pose of the camera; collecting angular velocity and acceleration information in real time and pre-integrating it to obtain the state quantities of the inertial measurement unit; constructing a sliding window containing several image frames, constraining the visual error terms between image frames and the error terms of the inertial measurement unit measurements, and performing nonlinear optimization on the initial camera pose to obtain the optimized camera pose and inertial measurement unit measurements; building a bag-of-words model for loop-closure detection and correcting the optimized camera pose; and extracting left and right image features, converting them into words, and matching them against the bag of words of an offline map — if matching succeeds, the optimized pose is solved; otherwise, the left and right images are re-collected and bag-of-words matching is repeated. The invention realizes positioning and mapping in an unknown environment, as well as positioning in a scene with a previously built map, with good accuracy and robustness.
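The pre-integration step described above accumulates raw gyroscope and accelerometer samples between two image frames into relative rotation, velocity, and position increments; these increments then form the inertial error terms in the sliding-window optimization. The sketch below is a simplified illustration of that accumulation (sensor biases, gravity compensation, and noise covariance are omitted, though a real visual-inertial system must model them):

```python
import numpy as np

def skew(w):
    """3x3 skew-symmetric matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def so3_exp(phi):
    """Rotation-vector exponential map (Rodrigues' formula)."""
    theta = np.linalg.norm(phi)
    if theta < 1e-10:
        return np.eye(3) + skew(phi)
    K = skew(phi / theta)
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def preintegrate(gyro, accel, dt):
    """Accumulate relative rotation dR, velocity dv, and position dp
    between two image frames from raw IMU samples (biases ignored)."""
    dR, dv, dp = np.eye(3), np.zeros(3), np.zeros(3)
    for w, a in zip(gyro, accel):
        dp = dp + dv * dt + 0.5 * (dR @ a) * dt**2
        dv = dv + (dR @ a) * dt
        dR = dR @ so3_exp(w * dt)
    return dR, dv, dp

# 100 samples at 100 Hz, no rotation, constant 1 m/s^2 forward acceleration:
gyro = [np.zeros(3)] * 100
accel = [np.array([1.0, 0.0, 0.0])] * 100
dR, dv, dp = preintegrate(gyro, accel, 0.01)
# After 1 s: dv -> 1 m/s forward, dp -> 0.5 m forward, dR -> identity.
```

Pre-integrating in this way means the IMU term between two keyframes does not have to be re-propagated from raw samples each time the optimizer adjusts the poses, which keeps the sliding-window nonlinear optimization tractable.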

Description

Technical field

[0001] The invention relates to a binocular vision indoor positioning and mapping method and device, belonging to the technical field of positioning and navigation.

Background technique

[0002] At present, indoor robots are increasingly used in large shopping malls, warehouses, and households, for example shopping-guide robots in malls, intelligent warehousing robots, and home sweeping robots. In these applications the robot needs to complete autonomous navigation, and to do so it first needs indoor positioning; that is, the robot must know its current location in the scene and the location of its destination.

[0003] At present there is no mature indoor high-precision positioning solution. The GPS (Global Positioning System) solution has a large indoor positioning error; the solution that relies on pasting QR-code labels in the scene to ...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G01C21/20, G01C21/16
CPC: G01C21/206, G01C21/165, Y02T10/40
Inventor: 李冰, 卢泽, 张林, 王亚洲, 高猛, 刘勇, 董乾, 王刚, 赵霞
Owner: SOUTHEAST UNIV