
Visual SLAM method based on semantic optical flow and inverse depth filtering

A method combining inverse depth filtering and semantic segmentation technology, applied in the fields of character and pattern recognition, 2D image generation, and feature extraction from basic image elements. It addresses the susceptibility of visual positioning systems to interference in dynamic scenes, and achieves good performance, high precision, and improved calculation accuracy.

Active Publication Date: 2020-06-19
BEIHANG UNIV
Cites: 7 · Cited by: 5

AI Technical Summary

Problems solved by technology

[0005] The problem solved by the technology of the present invention is to overcome the deficiencies of the prior art. Aiming at the problem that a visual positioning system is susceptible to interference under dynamic scene conditions, a visual SLAM method based on semantic optical flow and inverse depth filtering is provided. The method improves the SLAM system's ability to cope with dynamic scenes, improves the system's understanding of the scene, and improves the system's positioning accuracy in dynamic scenes.


Drawings: Visual SLAM method based on semantic optical flow and inverse depth filtering (three figures share this title).
Embodiment Construction

[0034] The technical solutions in the embodiments of the present invention will be clearly and completely described below in conjunction with the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only part of the embodiments of the present invention, not all of them. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without making creative efforts belong to the protection scope of the present invention.

[0035] As shown in Figure 1, the concrete implementation steps of the present invention are as follows:

[0036] Step 1: Obtain the image data collected by the sensor, extract image feature points, and semantically segment the RGB image of the current frame using the SegNet semantic segmentation network. The feature points are then classified by their semantic information into static, latent dynamic, and dynamic categories. Among them, ...
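A minimal sketch of this classification step, assuming the segmenter outputs a per-pixel class-id mask and assuming a hypothetical class-to-motion-prior mapping (the patent's exact SegNet label set and mapping are not given in this excerpt):

```python
import numpy as np

# Hypothetical semantic classes; the exact label set and the
# static / latent-dynamic / dynamic mapping are assumptions here.
PRIOR = {
    0: "static",          # e.g. building, road
    1: "latent_dynamic",  # e.g. parked car: could start moving
    2: "dynamic",         # e.g. pedestrian
}

def classify_features(points, seg_mask, prior=PRIOR):
    """Assign each feature point the motion prior of its semantic class.

    points   : (N, 2) array of integer (row, col) pixel coordinates
    seg_mask : (H, W) array of per-pixel class ids from the segmenter
    """
    labels = []
    for r, c in points:
        cls = int(seg_mask[r, c])
        labels.append(prior.get(cls, "static"))  # unknown classes default to static
    return labels

# Toy 4x4 segmentation: left half static (0), right half pedestrian (2)
mask = np.zeros((4, 4), dtype=int)
mask[:, 2:] = 2
pts = np.array([[0, 0], [3, 3]])
print(classify_features(pts, mask))  # ['static', 'dynamic']
```

In practice the per-point lookup would run on the feature coordinates returned by the front-end extractor, before the points enter tracking.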



Abstract

The invention relates to a visual SLAM method based on semantic optical flow and inverse depth filtering. The method comprises the following steps: (1) a visual sensor collects an image, and feature extraction and semantic segmentation are performed on the collected image to obtain extracted feature points and a semantic segmentation result; (2) according to the feature points and the segmentation result, map initialization is performed using the semantic optical flow method, dynamic feature points are removed, and a reliable initialization map is created; (3) an inverse depth filter evaluates whether the 3D map points in the initialized map are dynamic points, and the map is expanded according to the evaluation result of the inverse depth filter; (4) tracking, local mapping, and loop closure detection are performed in sequence on the map expanded by the depth filter, finally realizing dynamic-scene-oriented visual SLAM based on semantic optical flow and inverse depth filtering.
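Step (3) evaluates 3D map points with an inverse depth filter, but this excerpt does not reproduce the filter equations. The following is therefore a generic Gaussian inverse-depth filter sketch, in the spirit of the depth filters used in visual SLAM literature: each map point keeps an inverse-depth mean and variance, fuses observations that pass a consistency gate, and treats repeated gated-out observations as evidence the point is dynamic. All names and thresholds are illustrative, not the patent's.

```python
import math

class InverseDepthFilter:
    """Minimal Gaussian inverse-depth filter (a sketch; the patent's exact
    formulation is not given in this excerpt)."""

    def __init__(self, rho0, var0):
        self.rho = rho0        # inverse-depth mean (1/m)
        self.var = var0        # inverse-depth variance
        self.outliers = 0      # consecutive inconsistent observations

    def update(self, rho_obs, var_obs, gate=3.0):
        """Fuse one inverse-depth observation. An observation failing the
        3-sigma consistency gate is counted as an outlier, hinting that
        the point may be dynamic."""
        if abs(rho_obs - self.rho) > gate * math.sqrt(self.var + var_obs):
            self.outliers += 1
            return False
        # Standard product-of-Gaussians fusion of prior and observation
        k = self.var / (self.var + var_obs)
        self.rho += k * (rho_obs - self.rho)
        self.var *= (1.0 - k)
        self.outliers = 0
        return True

    def is_dynamic(self, max_outliers=3):
        return self.outliers >= max_outliers

f = InverseDepthFilter(rho0=0.5, var0=0.04)
f.update(0.52, 0.04)          # consistent: fused, variance shrinks
f.update(5.0, 0.04)           # far outside the gate: counted as an outlier
print(f.rho, f.var, f.outliers)
```

A point whose filter reports `is_dynamic()` would be excluded from map expansion; a converged static point (small variance) would be promoted to a reliable 3D map point.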

Description

Technical field
[0001] The present invention relates to a visual SLAM method based on semantic optical flow and inverse depth filtering. It is a new visual SLAM method that combines semantic optical flow with inverse depth filtering, and is suitable for addressing the failure of traditional visual SLAM systems in highly dynamic scenes and their lack of scene understanding.
Background
[0002] Simultaneous Localization and Mapping (SLAM) refers to estimating the pose of a robot from acquired sensor data, without prior information about the environment, while simultaneously constructing a globally consistent map of that environment. A SLAM system based on visual sensors is called visual SLAM. Because of its low hardware cost, high positioning accuracy, and capability for fully autonomous positioning and navigation, the technology has attracted wide attention in the fields of artificial intelligence and virtual reality.

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T11/20, G06K9/34
CPC: G06T11/206, G06V10/26
Inventors: 崔林艳, 马朝伟, 郭政航
Owner: BEIHANG UNIV