
A semantic information fused visual SLAM loopback detection method and device

A technology that fuses semantic information into visual SLAM loop-closure detection, classified under geographic information databases, structured data retrieval, and instruments. It addresses the shortfall of existing feature-point- and pixel-based methods and achieves accurate positioning and map construction.

Pending Publication Date: 2019-05-03
FOSHAN UNIVERSITY

AI Technical Summary

Problems solved by technology

[0004] The goal is for vision-based SLAM to work the way human vision does: a person can judge their own position simply by looking around and recognizing objects. Current algorithms based on feature points and raw pixels, however, fall far short of this capability.


Embodiment Construction

[0052] The technical solutions of the present invention will be described clearly and completely below in conjunction with the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art from these embodiments without creative effort fall within the protection scope of the present invention.

[0053] As shown in figure 1, an embodiment of the present invention provides a visual SLAM loopback detection method that fuses semantic information. A Turtlebot 2 mobile robot runs the Robot Operating System (ROS); an NVIDIA TX2 mounted on the Turtlebot 2 serves as the upper computer, and a Kinect V2 camera on the robot transmits video to the SLAM loopback detection system through ROS.
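
For concreteness, a minimal ROS node sketch is shown below. It assumes the conventional iai_kinect2 color topic name and a hypothetical loop-detector hook; neither detail comes from the patent text.

```python
# Minimal sketch (assumed setup, not the patented implementation): subscribe to the
# Kinect V2 color stream over ROS and hand each frame to the loop-detection front end.
import rospy
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

bridge = CvBridge()

def image_callback(msg):
    # Convert the ROS image message to an OpenCV BGR frame.
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
    # Hypothetical hook: pass the frame to the SLAM loopback-detection pipeline.
    # loop_detector.process(frame)

if __name__ == '__main__':
    rospy.init_node('slam_loopback_detection')
    # '/kinect2/qhd/image_color' is the conventional iai_kinect2 topic; adjust to your driver.
    rospy.Subscriber('/kinect2/qhd/image_color', Image, image_callback, queue_size=1)
    rospy.spin()
```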

[0054] The SLAM loop detection method comprises the...



Abstract

The invention relates to the technical field of robot simultaneous localization and mapping, and in particular to a visual SLAM loopback detection method and device fused with semantic information. The method comprises the steps of obtaining a video stream image, extracting key frames from the video stream offline, detecting object-level images in the key frames, extracting features from the object-level images, matching those features, and performing loopback detection on the key frames. The method is invariant to illumination, so accurate positioning and map construction of a robot are realized.
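
The pipeline summarized above can be illustrated with a short, hedged sketch. The key-frame selection rule and the object-level detector below are hypothetical placeholders (the patent's actual semantic detector and selection criteria are not given in this excerpt), and ORB features with brute-force matching stand in for the feature extraction and matching steps.

```python
# Illustrative sketch only: offline key-frame selection, a placeholder
# object-level detector, ORB features, and a match-count loop test.
import cv2
import numpy as np

orb = cv2.ORB_create(nfeatures=1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def select_keyframes(frames, step=10):
    """Naive offline key-frame selection: keep every `step`-th frame (assumed rule)."""
    return frames[::step]

def detect_object_regions(frame):
    """Hypothetical object-level detector; the patent fuses semantic detections
    here. As a stand-in, treat the whole frame as a single region."""
    return [frame]

def extract_features(region):
    """ORB keypoints and descriptors (a substitute for the patent's features)."""
    gray = cv2.cvtColor(region, cv2.COLOR_BGR2GRAY)
    return orb.detectAndCompute(gray, None)

def is_loop(desc_a, desc_b, min_good=30, max_dist=50):
    """Declare a loop closure when enough matches survive a distance filter."""
    if desc_a is None or desc_b is None:
        return False
    matches = matcher.match(desc_a, desc_b)
    good = [m for m in matches if m.distance < max_dist]
    return len(good) >= min_good

if __name__ == "__main__":
    # Synthetic frames stand in for a recorded video stream.
    frames = [np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8) for _ in range(50)]
    keyframes = select_keyframes(frames)
    descs = [extract_features(detect_object_regions(kf)[0])[1] for kf in keyframes]
    print("loop between keyframe 0 and 1:", is_loop(descs[0], descs[1]))
```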

Description

Technical field

[0001] The invention relates to the technical field of robot simultaneous localization and mapping, and in particular to a visual SLAM loop detection method and device that fuses semantic information.

Background technique

[0002] Since the emergence of bionics and intelligent robot technology, researchers have looked forward to the day when robots can observe and understand the surrounding world through eyes like humans, and can walk autonomously and dexterously in natural environments, achieving harmonious coexistence of humans and machines.

[0003] An important and fundamental problem in this regard is how to recover the three-dimensional structure of a scene from two-dimensional image information and determine the position of the camera within it. Solving this problem is inseparable from research into a basic technology: Simultaneous Localization and Mapping (SLAM), especially vision-based SLAM.

[0004] In order to achieve the same effect as t...


Application Information

IPC(8): G06K9/00, G06K9/62, G06F16/29
Inventor: 吴俊君, 陈世浪, 周林, 邝辉宇
Owner: FOSHAN UNIVERSITY