
Indoor positioning and navigation method based on UWB fused visual SLAM

An indoor positioning and vision technology, applied to surveying and mapping, navigation, and navigation calculation tools, which avoids rapid accumulation of errors, reduces time consumption, and increases recall and precision.

Active Publication Date: 2020-02-28
HANGZHOU DIANZI UNIV

AI Technical Summary

Problems solved by technology

Even with today's advances in computer vision, no feature extraction and matching algorithm satisfies the requirements of visual SLAM in both performance and speed.




Embodiment Construction

[0033] The present invention is further described below in conjunction with the drawings and embodiments.

[0034] As shown in Figure 1, a method of indoor positioning and navigation based on visual SLAM fused with UWB proceeds as follows:

[0035] Step 1. Take the robot's initial position as the origin and its initial orientation as the X axis to establish a world coordinate system, and select three locations in the room at which to set up base stations. The robot carries an RGB-D camera and a signal transceiver and moves along a set route, capturing color and depth images of the environment frame by frame. While the robot moves and captures images, the UWB trilateral positioning method is used to obtain and record the camera's time-varying coordinates in the world coordinate system according to the positional relationship between th...
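The UWB trilateral positioning mentioned in Step 1 amounts to recovering the camera's planar coordinates from its measured ranges to the three base stations. A minimal sketch of that computation under the assumption of three non-collinear 2D anchors and noise-free ranges (the function name and closed-form linearization are illustrative, not taken from the patent):

```python
def trilaterate(anchors, dists):
    """2D trilateration: recover (x, y) from three anchor positions and
    measured ranges by subtracting the first circle equation from the
    other two, which yields a 2x2 linear system."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    # Linear system A @ [x, y] = b after cancelling the quadratic terms.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    if abs(det) < 1e-12:
        raise ValueError("anchors are collinear; position is ambiguous")
    # Cramer's rule for the 2x2 system.
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y

# Example: anchors at (0,0), (10,0), (0,10); ranges measured from (3,4).
print(trilaterate([(0, 0), (10, 0), (0, 10)], [5, 65 ** 0.5, 45 ** 0.5]))
```

With noisy real UWB ranges, more than three anchors and a least-squares solve would normally replace this exact closed form.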



Abstract

The invention discloses an indoor positioning and navigation method based on UWB fused with visual SLAM. A point cloud map is constructed from environment images obtained by a sensor, and ORB feature points in adjacent frames are matched to obtain the relative pose between images, while UWB positioning records the sensor's position. As the sensor moves along a set path, its pose is obtained, the map is constructed, and coordinates are recorded each time the position changes. Loop-closure detection by image similarity is aided by using a TOF (time-of-flight) algorithm to determine whether the current position coincides with a previously visited one; the offset is then calculated and corrected according to the TOF positional relationship, the robot pose is adjusted, and the point cloud map is corrected. The method achieves effective navigation and positioning in both ordinary and complex indoor environments, and a 3D environment model can be established with high accuracy.
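The frame-to-frame ORB matching described in the abstract reduces to comparing binary descriptors by Hamming distance and keeping only unambiguous correspondences. A toy sketch of that matching stage, assuming short bit-tuple descriptors for readability (real ORB descriptors are 256-bit and matched with a library matcher; the function names and ratio threshold are illustrative):

```python
def hamming(a, b):
    """Hamming distance between two equal-length binary descriptors."""
    return sum(x != y for x, y in zip(a, b))

def match_descriptors(desc_a, desc_b, ratio=0.8):
    """Brute-force matching with a Lowe-style ratio test: keep a match
    only when its best distance is clearly smaller than the second-best,
    which raises the precision of frame-to-frame correspondences."""
    matches = []
    for i, da in enumerate(desc_a):
        dists = sorted((hamming(da, db), j) for j, db in enumerate(desc_b))
        if len(dists) >= 2 and dists[0][0] < ratio * dists[1][0]:
            matches.append((i, dists[0][1]))  # (index in A, index in B)
    return matches

# Two descriptors from frame A matched against three from frame B.
frame_a = [(0, 0, 0, 0), (1, 1, 1, 1)]
frame_b = [(0, 0, 0, 1), (1, 1, 1, 1), (1, 0, 1, 0)]
print(match_descriptors(frame_a, frame_b))  # each A descriptor finds its closest B descriptor
```

The matched point pairs, combined with the depth image, are what allow the relative pose between adjacent frames to be estimated.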

Description

Technical field [0001] The invention relates to carrier-free (ultra-wideband) communication technology and visual SLAM technology, and in particular to a method for indoor positioning and navigation based on visual SLAM fused with UWB. Background technique [0002] In the field of indoor scene modeling, increasingly mature technologies such as computer vision, data fusion, visual navigation and 3D modeling provide a theoretical basis and technical support for 3D modeling of indoor scenes. Vision-based 3D modeling technology has attracted the attention of many researchers in recent years: from a large amount of 2D data, one can analyze the multi-view geometry [3, 4, 5, 6] and 3D structure of a scene and carry out modeling. At present, real-time systems for 3D scene reconstruction have been successfully developed, such as SLAM (Simultaneous Localization And Mapping) systems using binocular cameras, and the PTAM (Parallel Tracking And Mapping) system for tracking based on SLAM. With the emergence of various sensors, scene...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G01C21/20; G06T7/73
CPC: G01C21/206; G06T7/74; G06T2207/10016; G06T2207/10024
Inventors: 颜成钢, 张旗, 桂仲林, 宋家驹, 孙垚棋, 张继勇, 张勇东
Owner HANGZHOU DIANZI UNIV