Visual SLAM method based on optical flow and semantic segmentation

A semantic segmentation and computer vision technology, applied to character and pattern recognition, instruments, biological neural network models, etc.; it addresses problems such as the poor tracking and positioning performance of SLAM in dynamic environments.

Inactive Publication Date: 2020-10-20
WUHAN UNIV

AI Technical Summary

Problems solved by technology

[0006] The embodiment of the present application provides a visual SLAM method based on optical flow and semantic segmentation, which solves the problem of poor tracking and positioning performance of SLAM in dynamic environments.




Embodiment Construction

[0064] This embodiment provides a visual SLAM method based on optical flow and semantic segmentation, which mainly includes the following steps:

[0065] Step 1. Use the semantic segmentation network to segment the input image information to obtain static regions and predict dynamic regions.
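
A minimal sketch of this split, assuming the segmentation network outputs a per-pixel class-ID map; the specific class IDs and the set of "potentially dynamic" classes below are illustrative assumptions, not values from the patent:

```python
import numpy as np

# Illustrative class IDs only -- a real segmentation network defines
# its own label set for movable objects such as people and vehicles.
DYNAMIC_CLASSES = {7, 15}  # hypothetical IDs (e.g. car, person)

def split_regions(seg_labels):
    """Split a per-pixel label map into a static mask and a
    predicted-dynamic mask (True where the region applies)."""
    dynamic_mask = np.isin(seg_labels, list(DYNAMIC_CLASSES))
    static_mask = ~dynamic_mask
    return static_mask, dynamic_mask

# Toy 2x3 label map: one "person" pixel (15), one "car" pixel (7)
labels = np.array([[0, 0, 15],
                   [0, 7, 0]])
static, dynamic = split_regions(labels)
```

Features falling inside the dynamic mask are only *predicted* to be dynamic at this stage; step 3 decides point by point whether they actually move.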

[0066] Step 2. Perform feature tracking on the static region and the predicted dynamic region by using the sparse optical flow method.
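
Sparse optical flow here typically means pyramidal Lucas-Kanade (OpenCV's `calcOpticalFlowPyrLK`). As a self-contained illustration of the underlying least-squares step, the sketch below estimates a single flow vector over a synthetic window; it is a teaching aid, not the patent's tracker:

```python
import numpy as np

def lucas_kanade_flow(I1, I2):
    """Estimate one (u, v) flow vector for the whole window by least
    squares on the brightness-constancy equation Ix*u + Iy*v + It = 0
    (the basic Lucas-Kanade step)."""
    Iy, Ix = np.gradient(I1.astype(float))  # np.gradient: (d/drow, d/dcol)
    It = I2.astype(float) - I1.astype(float)
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v

# Synthetic image I(x, y) = x * y, shifted right by exactly 1 pixel
ys, xs = np.mgrid[0:8, 0:8]
I1 = (xs * ys).astype(float)
I2 = ((xs - 1) * ys).astype(float)  # same scene moved +1 in x
u, v = lucas_kanade_flow(I1, I2)
```

A real system solves this per feature over small windows and across image pyramids to handle larger motions.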

[0067] Step 3. Determine the type of the feature points in the input image information, and remove the dynamic feature points.
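
This excerpt does not spell out the classification criterion; one common, simplified test is to compare each point's flow vector against the dominant (median) flow, taken as the camera-induced motion, and discard large deviations. The threshold and the median-flow assumption below are illustrative stand-ins for the patent's test:

```python
import numpy as np

def remove_dynamic_points(points, flows, thresh=2.0):
    """Drop feature points whose optical-flow vector deviates from the
    median flow by more than `thresh` pixels. A simplified stand-in
    for the patent's dynamic-point classification."""
    flows = np.asarray(flows, dtype=float)
    median_flow = np.median(flows, axis=0)
    residual = np.linalg.norm(flows - median_flow, axis=1)
    keep = residual <= thresh
    return np.asarray(points)[keep], keep

pts = np.array([[10, 10], [20, 20], [30, 30], [40, 40]])
flo = np.array([[1.0, 0.0], [1.1, 0.1], [0.9, -0.1],
                [8.0, 5.0]])  # last point moves independently
static_pts, keep = remove_dynamic_points(pts, flo)
```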

[0068] Step 4. Take the feature-point set remaining after the dynamic feature points are removed as tracking data, input it into ORB-SLAM for processing, and output the pose result.
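
ORB-SLAM then estimates camera pose from the surviving static features (in 3-D, via PnP and bundle adjustment). As a toy 2-D stand-in for that pose computation, the sketch below recovers a rigid rotation and translation from matched points with the Kabsch/Procrustes method; it is not ORB-SLAM's actual solver:

```python
import numpy as np

def estimate_rigid_pose_2d(src, dst):
    """Recover R and t with dst ~= R @ src + t from matched static
    points (Kabsch/Procrustes, 2-D toy version of pose estimation)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T)) # guard against reflections
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

theta = np.deg2rad(10)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
dst = src @ R_true.T + np.array([0.5, -0.2])
R_est, t_est = estimate_rigid_pose_2d(src, dst)
```

With dynamic points removed beforehand, such a fit is no longer corrupted by independently moving objects, which is the point of steps 1 through 3.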

[0069] For a better understanding of the above technical solution, it is described in detail below in conjunction with the accompanying drawings and specific implementations.

[0070] This embodiment provides a visual S...



Abstract

The invention belongs to the technical field of visual spatial positioning, and discloses a visual SLAM method based on optical flow and semantic segmentation. The method comprises the steps of: segmenting the input image information with a semantic segmentation network to obtain a static region and a predicted dynamic region; performing feature tracking on the static region and the predicted dynamic region with a sparse optical flow method; determining the types of the feature points in the input image information and removing the dynamic feature points; and taking the point set with the motion feature points removed as tracking data, inputting it into ORB-SLAM for processing, and outputting a pose result. The method solves the problem of poor SLAM tracking and positioning performance in dynamic environments, and can obtain trajectory information with high pose precision in such environments.

Description

technical field
[0001] The invention relates to the technical field of visual spatial positioning, and in particular to a visual SLAM method based on optical flow and semantic segmentation.
Background technique
[0002] SLAM (Simultaneous Localization and Mapping) is a key technology in the field of intelligent mobile robots. Visual SLAM uses a camera as its main sensor; compared with other types of sensors, a camera provides richer information, so it has been widely studied in recent years. However, achieving accurate tracking and localization in dynamic scenes has always been a major challenge for SLAM systems.
[0003] In real scenes, dynamic objects introduce erroneous data into the camera-motion computation, resulting in tracking failure or incorrect tracking. Several methods have been proposed to solve this problem. One is the traditional robust estimation method RANSAC, which treats the dynamic information as outliers and removes it, retaining the static information to ensure the success of t...
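
The RANSAC scheme mentioned above can be illustrated with a robust line fit: repeatedly sample a minimal model from the data, count inliers, and keep the best hypothesis. This is a generic illustration of RANSAC's outlier rejection, not the patent's method:

```python
import numpy as np

def ransac_line(points, iters=200, tol=0.5, seed=0):
    """Robustly fit y = a*x + b: sample two points, count points within
    `tol` of the resulting line, keep the best-supported model."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, float)
    best_inliers, best_model = None, None
    for _ in range(iters):
        i, j = rng.choice(len(pts), size=2, replace=False)
        (x1, y1), (x2, y2) = pts[i], pts[j]
        if x1 == x2:
            continue  # vertical sample; skip for this simple model
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = np.abs(pts[:, 1] - (a * pts[:, 0] + b)) < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (a, b)
    return best_model, best_inliers

xs = np.arange(10, dtype=float)
ys = 2.0 * xs + 1.0
ys[3] += 20.0  # one gross outlier, like a feature on a moving object
(a, b), inliers = ransac_line(np.column_stack([xs, ys]))
```

As the background notes, such purely geometric rejection works only while dynamic points are a minority; the patent's combination of semantic segmentation and optical flow is meant to handle harder cases.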

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/34; G06K9/62; G06N3/04; G01C11/02
CPC: G06V20/42; G06V10/267; G06V10/757; G06N3/045; G06F18/214
Inventors: 姚剑, 卓胜德, 程军豪, 龚烨, 涂静敏 (Yao Jian, Zhuo Shengde, Cheng Junhao, Gong Ye, Tu Jingmin)
Owner: WUHAN UNIV