
Semantic SLAM robustness improvement method based on instance segmentation

A robust semantic SLAM technology, applied in the field of dynamic-scene semantic SLAM improvement, which addresses the problems of low accuracy and a large amount of calculation

Active Publication Date: 2020-08-25
SOUTH CHINA UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0003] The feature-based method can only create a sparse point cloud map without environmental semantic information. Although the direct method can create a semi-dense or dense map, directly using optical flow over the entire image, especially when the image is large, suffers from disadvantages such as low precision and a large amount of calculation.

Method used



Examples


Embodiment

[0052] A method for improving the robustness of semantic SLAM based on instance segmentation, provided by an embodiment of the present invention, includes the following steps:

[0053] (1) Use an RGB-D camera to obtain RGB color images and depth images; the whole method uses three branches for parallel processing. Branch three processes the depth images, while branch one and branch two process the RGB color images: branch one divides the collected RGB color image samples into a training set, a validation set and a test set, and branch two uses all RGB color image data for ORB feature extraction and to assist optical flow calculation;
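The following is a minimal sketch of the branch-two processing described above, assuming an OpenCV-based implementation (the patent does not specify the tooling): ORB keypoints are extracted in the previous RGB frame and tracked into the current frame with pyramidal Lucas-Kanade optical flow.

```python
# Minimal sketch (assumed OpenCV tooling, not the patent's implementation) of
# branch two in step (1): ORB feature extraction plus optical-flow tracking.
import cv2
import numpy as np

def track_orb_features(prev_bgr, curr_bgr, n_features=1000):
    """Return matched (prev, curr) point pairs, shape (N, 2) each."""
    prev_gray = cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_bgr, cv2.COLOR_BGR2GRAY)

    orb = cv2.ORB_create(nfeatures=n_features)
    keypoints = orb.detect(prev_gray, None)
    if not keypoints:
        return np.empty((0, 2), np.float32), np.empty((0, 2), np.float32)

    prev_pts = np.float32([kp.pt for kp in keypoints]).reshape(-1, 1, 2)

    # Pyramidal Lucas-Kanade optical flow assists the motion analysis
    # of the extracted ORB features.
    curr_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, curr_gray, prev_pts, None, winSize=(21, 21), maxLevel=3)

    good = status.ravel() == 1
    return prev_pts[good].reshape(-1, 2), curr_pts[good].reshape(-1, 2)
```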

[0054] (2) In branch one, for the processing of the RGB color images, the training set obtained from the division of the RGB color images is sent to the instance segmentation convolutional neural network for training, and the final network model parameters are obtained after verifying the effect on the validation set. The instance segmentatio...
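As a hedged illustration of step (2), the sketch below loads a Mask R-CNN instance segmentation model (the framework named in the abstract) with torchvision and runs it on a single RGB frame. The use of torchvision (version 0.13 or later), COCO-pretrained weights and the score threshold are assumptions; fine-tuning on the training set and checking accuracy on the validation set would follow a standard detection training loop not shown here.

```python
# Illustrative sketch (assumed tooling, not the patent's training pipeline):
# Mask R-CNN instance segmentation on one RGB frame via torchvision.
import torch
from torchvision.models.detection import maskrcnn_resnet50_fpn

def build_instance_segmenter():
    # COCO-pretrained weights; the patent's own classes/weights are not given.
    return maskrcnn_resnet50_fpn(weights="DEFAULT").eval()

@torch.no_grad()
def segment_frame(model, rgb_tensor, score_thresh=0.5):
    """rgb_tensor: float tensor (3, H, W) scaled to [0, 1].
    Returns boolean instance masks (M, H, W), class labels and scores."""
    out = model([rgb_tensor])[0]
    keep = out["scores"] > score_thresh
    masks = out["masks"][keep, 0] > 0.5      # binarize the soft masks
    return masks, out["labels"][keep], out["scores"][keep]
```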



Abstract

The invention relates to a semantic SLAM robustness improvement method based on instance segmentation, and the method comprises the steps: firstly carrying out the instance segmentation of a key frame through an instance segmentation network, and building prior semantic information; calculating a feature point optical flow field to further distinguish the object, identifying a real moving object in the scene, and removing the feature points belonging to the dynamic object; and finally, performing semantic association, and establishing a semantic map without dynamic object interference. Compared with the prior art, the semantic map is established by adopting a method of combining deep learning and optical flow, and the depth map is added on the basis of the color map, so that the system is endowed with the capability of establishing the dense three-dimensional point cloud semantic map. In addition, a Mask-RCNN framework is adopted for real-time semantic segmentation, and object dynamic information can be calculated through mutual combination of dynamic feature points estimated by optical flow information and pixel-level semantic information. According to the method, deep learning and optical flow are mutually combined, so that the robustness of the whole system is remarkably improved, and the method can be applied to real-time semantic map construction in a dynamic scene.
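The abstract combines optical-flow evidence with pixel-level instance masks to decide which segmented objects are really moving. The sketch below illustrates one plausible decision rule under stated assumptions (the homography motion model, residual threshold and moving-point ratio are not given in the text): an instance whose tracked points disagree with the dominant camera motion beyond a threshold is treated as truly moving, and all feature points on it are discarded before mapping. Its inputs correspond to the outputs of the two sketches above, converted to NumPy arrays.

```python
# Illustrative sketch (an assumed combination rule, not the patent's exact one):
# fuse optical-flow residuals with instance masks to remove dynamic points.
import cv2
import numpy as np

def filter_dynamic_points(prev_pts, curr_pts, instance_masks,
                          residual_thresh=3.0, moving_ratio=0.5):
    """prev_pts, curr_pts: (N, 2) matched points from ORB + LK tracking.
    instance_masks: iterable of (H, W) boolean masks for potentially dynamic
    classes (person, vehicle, ...)."""
    # Robustly fit the dominant (camera-induced) image motion.
    H, _inliers = cv2.findHomography(prev_pts, curr_pts, cv2.RANSAC, 3.0)
    if H is None:
        return prev_pts, curr_pts   # too few points to judge motion

    # Reprojection residual of each point under the dominant motion model.
    ones = np.ones((len(prev_pts), 1))
    proj = np.hstack([prev_pts, ones]) @ H.T
    proj = proj[:, :2] / proj[:, 2:3]
    flow_dynamic = np.linalg.norm(proj - curr_pts, axis=1) > residual_thresh

    remove = np.zeros(len(curr_pts), dtype=bool)
    for mask in instance_masks:
        xs = np.clip(curr_pts[:, 0].astype(int), 0, mask.shape[1] - 1)
        ys = np.clip(curr_pts[:, 1].astype(int), 0, mask.shape[0] - 1)
        on_instance = mask[ys, xs]
        # An instance is judged "really moving" when enough of its points
        # disagree with the camera motion; all its points are then removed.
        if on_instance.any() and flow_dynamic[on_instance].mean() > moving_ratio:
            remove |= on_instance

    keep = ~remove
    return prev_pts[keep], curr_pts[keep]
```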

Description

Technical field

[0001] The invention relates to the field of instance segmentation and semantic SLAM technology, and in particular to an improved method for dynamic-scene semantic SLAM based on deep learning.

Background technique

[0002] Simultaneous localization and mapping (SLAM) refers to the process by which a robot estimates its own pose and constructs an environmental map using only its own sensors in an unfamiliar environment. It is a prerequisite for many robot application scenarios, such as path planning, collision-free navigation and environmental perception, and visual SLAM refers to the localization and mapping method that uses the camera as the main data acquisition sensor. Images carry rich texture information, which, combined with deep learning, can generate semantic information about the environment.

[0003] The feature-based method can only create a sparse point cloud map without environmental semantic information. Although the direct method can create a ...

Claims


Application Information

IPC (8): G06F16/29; G06T7/11; G06T7/13; G06T7/215; G06T7/33; G06T7/73; G06N3/04; G06N3/08
CPC: G06F16/29; G06T7/11; G06T7/13; G06T7/215; G06T7/33; G06T7/73; G06N3/08; G06T2207/10024; G06N3/045; Y02P90/30
Inventors: 陈安向石方刘海明吴忻生陈纯玉王博
Owner: SOUTH CHINA UNIV OF TECH