Semantic SLAM method based on GMS feature matching in dynamic scene

A feature-matching technology for dynamic scenes, applied in image data processing, instrumentation, and computing. It addresses the low detection accuracy and poor robustness of existing SLAM methods in dynamic scenes, improving both accuracy and robustness.

Pending Publication Date: 2021-02-19
GUANGDONG POWER GRID CORP ZHAOQING POWER SUPPLY BUREAU

Problems solved by technology

[0006] In order to overcome the low detection accuracy and poor robustness of prior-art SLAM methods in dynamic scenes, the present invention provides a semantic SLAM method based on GMS feature matching in dynamic scenes. By removing dynamic content through a combination of motion consistency and semantic information, it improves the accuracy and robustness of visual SLAM systems operating in dynamic environments.



Examples


Embodiment

[0062] As shown in Figure 1, an embodiment of the semantic SLAM method based on GMS feature matching in a dynamic scene includes the following steps:

[0063] Step 1: Acquire and input the environment image, then calibrate the camera to remove image distortion. The specific steps are:

[0064] S1.1: First obtain the intrinsic parameters of the camera, namely f_x, f_y, c_x, c_y, and normalize the three-dimensional coordinates (X, Y, Z) to the homogeneous image-plane coordinates (x, y);

[0065] S1.2: Remove the influence of distortion on the image, where [k1, k2, k3, p1, p2] are the distortion coefficients of the lens and r is the distance from the point to the origin of the coordinate system:

[0066] x' = x(1 + k1·r² + k2·r⁴ + k3·r⁶) + 2·p1·x·y + p2·(r² + 2x²), y' = y(1 + k1·r² + k2·r⁴ + k3·r⁶) + p1·(r² + 2y²) + 2·p2·x·y

[0067] S1.3: Transfer the coordinates in the camera coordinate system to the pixel coordinate system:

[0068] u = f_x·x + c_x, v = f_y·y + c_y
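Steps S1.1 to S1.3 can be sketched in plain Python. The distortion equations in S1.2 are not reproduced legibly in this copy of the text, so the sketch assumes the standard Brown–Conrady model that the coefficient names k1, k2, k3, p1, p2 conventionally denote; the function name and argument layout are illustrative.

```python
def project_point(X, Y, Z, fx, fy, cx, cy, dist):
    """Project a 3D camera-frame point to pixel coordinates.

    A sketch of steps S1.1-S1.3, assuming the standard pinhole model with
    radial (k1, k2, k3) and tangential (p1, p2) lens distortion.
    """
    k1, k2, k3, p1, p2 = dist
    # S1.1: normalize (X, Y, Z) to homogeneous image-plane coordinates (x, y)
    x, y = X / Z, Y / Z
    # S1.2: apply the distortion model; r is the distance to the origin
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    # S1.3: transfer camera coordinates to the pixel coordinate system
    u = fx * x_d + cx
    v = fy * y_d + cy
    return u, v
```

With all distortion coefficients set to zero, the function reduces to the plain pinhole projection u = f_x·X/Z + c_x, v = f_y·Y/Z + c_y.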

[0069] Step 2: Segment the input image through the semantic segmentation network, obtain the masks of all objects, and remove the mask portions of dynamic objects to obtain a preliminary image with the dynamic objects removed.
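The mask-based removal in Step 2 can be illustrated with a minimal sketch. The label names, the per-pixel label-map layout, and the function name are illustrative assumptions, not details from the patent text.

```python
def remove_dynamic_keypoints(keypoints, label_map, dynamic_labels):
    """Keep only keypoints that do not fall on a dynamic-object mask.

    `label_map` is a 2D grid of per-pixel class labels produced by the
    semantic segmentation network (indexed as label_map[v][u]);
    `dynamic_labels` is the set of classes treated as dynamic
    (e.g. "person", "car" -- illustrative names).
    """
    kept = []
    for (u, v) in keypoints:
        if label_map[v][u] not in dynamic_labels:
            kept.append((u, v))
    return kept
```

In the full pipeline the same test would be applied to the ORB feature points extracted in the next step, so that no features on dynamic objects enter pose estimation.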



Abstract

The invention relates to a semantic SLAM method based on GMS feature matching in a dynamic scene. The method comprises the steps of: segmenting an input image through a semantic segmentation network, obtaining the masks of all objects, removing the mask portions of dynamic objects, and obtaining a preliminary image with the dynamic objects removed; extracting ORB feature points from the input image and then calculating descriptors; and detecting and removing dynamic feature points by a method combining motion consistency and semantic information. By removing dynamic content through the combination of motion consistency and semantic information, the method improves the operating precision and robustness of the visual SLAM system in a highly dynamic environment.
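The GMS (Grid-based Motion Statistics) matching named in the title rests on the observation that true feature matches are supported by neighboring matches that move consistently, while false matches are scattered. A simplified single-scale sketch of that idea: bin matches into grid cells and keep only those whose cell-to-cell correspondence has enough support. The grid size and support threshold are illustrative, and this is not the patent's exact implementation of GMS.

```python
from collections import Counter


def gms_filter(matches, img_w, img_h, grid=20, min_support=4):
    """Simplified Grid-based Motion Statistics match filtering.

    `matches` is a list of ((x1, y1), (x2, y2)) point pairs between two
    images of size img_w x img_h. A match is kept only when at least
    `min_support` matches map the same source grid cell to the same
    target grid cell (parameters are illustrative).
    """
    def cell(pt):
        x, y = pt
        return (min(int(x * grid / img_w), grid - 1),
                min(int(y * grid / img_h), grid - 1))

    # Count how many matches support each (source cell -> target cell) pair
    support = Counter((cell(p), cell(q)) for p, q in matches)
    return [(p, q) for p, q in matches
            if support[(cell(p), cell(q))] >= min_support]
```

A cluster of consistent matches survives the threshold while an isolated outlier match, landing alone in its cell pair, is discarded.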

Description

Technical field

[0001] The invention relates to the field of vision-based positioning and navigation in the autonomous inspection of UAVs, and more specifically to a semantic SLAM method based on GMS feature matching in dynamic scenes.

Background technique

[0002] During intelligent UAV inspection, the UAV must determine its next operation autonomously according to real-time information about the current environment. Real-time positioning and mapping of the working environment are therefore essential links in the intelligent inspection process. Especially in the collaborative work of multiple UAVs arranged in a grid, the environment detected by each UAV is a dynamic scene (containing moving objects that sometimes disappear), so special algorithms need to be developed for dynamic scenarios.

[0003] Simultaneous Localization and Mapping (SLAM) is a method that can estimate the current position and attitude by the corresponding moti...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC (8): G06T7/11; G06T7/246; G06T7/73; G06T7/80
CPC: G06T7/11; G06T7/246; G06T7/73; G06T7/80; G06T2207/10028; G06T2207/20016
Inventor 陈政游林辉胡峰孙仝张谨立宋海龙黄达文王伟光梁铭聪黄志就何彧陈景尚谭子毅潘嘉琪李志鹏罗鲜林
Owner GUANGDONG POWER GRID CORP ZHAOQING POWER SUPPLY BUREAU