Semantic SLAM system and method based on joint constraints

A semantic SLAM and semantic-mapping technology in the field of computer vision, addressing inaccurate camera pose calculation, cases where the camera pose cannot be computed at all, and objects that cannot be added to the map, with the effect of improving accuracy.

Active Publication Date: 2019-12-03
XIDIAN UNIV


Problems solved by technology

[0005] The purpose of the present invention is to overcome the deficiencies of the above-mentioned prior art by proposing a semantic SLAM system and method based on joint constraints. The system and method address inaccurate camera pose calculation when pixel depth values are unstable, and the inability to calculate the camera pose when a dynamic target occupies most of the camera's field of view, thereby improving the accuracy of the camera pose and the completeness of the camera trajectory. They also solve the problem that objects with motion properties cannot be added to the point cloud map while they are stationary, yielding a more accurate point cloud map.




Embodiment Construction

[0045] The present invention will be further described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0046] Referring to Figure 1, the semantic SLAM system based on joint constraints of the present invention includes a data acquisition module, a neural network module, a joint constraint module, a data fusion module, a visual front-end module, a back-end optimization module, and a semantic map module, wherein:
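To make the division of labor among the seven modules concrete, the per-frame dataflow can be sketched as below. Every function here is a hypothetical placeholder standing in for a module, not an API defined by the patent; the point is only the order in which data moves between modules.

```python
import numpy as np

# Illustrative per-frame dataflow among the seven modules. The "modules"
# dict maps module names to callables; all names and signatures here are
# assumptions for illustration, not from the patent text.

def run_slam_frame(color, depth, modules):
    det, seg = modules["neural_net"](color)                      # detection + instance masks
    categories = modules["joint_constraint"](color, depth, seg)  # static/dynamic point sets
    static_seg = modules["data_fusion"](seg, categories)         # static-target segmentation
    pose, landmarks = modules["front_end"](color, depth, static_seg)
    pose, landmarks = modules["back_end"](pose, landmarks)       # global optimization
    cloud = modules["semantic_map"](pose, depth, static_seg)     # semantic point cloud
    return pose, cloud
```

Each real module would carry internal state (network weights, keyframe database, factor graph); the sketch only fixes the interfaces between them.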

[0047] The data acquisition module adopts a depth camera to collect multiple frames of depth images and color images of the indoor environment, obtaining a depth image sequence and a color image sequence;

[0048] The neural network module performs frame-by-frame forward propagation on the color image sequence through a trained BlitzNet network model, obtaining a detection image with potential dynamic target frames and an instance segmentation image with the potential dynamic target insta...
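The two per-frame outputs named above, a detection (class label plus bounding box) and an instance segmentation mask, can be represented as sketched below. The real system uses a trained BlitzNet model; this stub only fixes an assumed output format (field names, mask encoding, and the class list are illustrative assumptions).

```python
import numpy as np

# Schematic stand-in for the network's per-frame output. A real
# implementation would run BlitzNet inference here; the stub returns one
# hard-coded detection so the output format is concrete.

def segment_frame(color_image, potential_dynamic_classes=("person", "car")):
    """Return (detections, instance_mask) for one color frame.

    detections: list of {"label": str, "box": (x, y, w, h)}
    instance_mask: uint8 image, 0 = background, i = i-th dynamic instance
    """
    h, w = color_image.shape[:2]
    detections = [{"label": "person", "box": (10, 10, 60, 120)}]  # placeholder
    mask = np.zeros((h, w), dtype=np.uint8)
    for i, det in enumerate(detections, start=1):
        x, y, bw, bh = det["box"]
        if det["label"] in potential_dynamic_classes:
            mask[y:y + bh, x:x + bw] = i  # instance id fills the region
    return detections, mask
```

Downstream modules only need the mask to know which pixels belong to potentially dynamic targets, so any network producing this pair could be substituted.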



Abstract

The invention provides a semantic SLAM system and method based on joint constraints. The objective of the invention is to solve the problems of inaccurate camera pose calculation under unstable pixel depth values and the incapability of calculating the camera pose when a dynamic target occupies most of the camera's field of view, to improve the accuracy of camera pose estimation through a depth constraint method, and to improve the completeness of the camera trajectory through an epipolar constraint method. The implementation comprises the following steps: a data acquisition module acquires an image sequence; a neural network module acquires a detection image and an instance segmentation image; a joint constraint module acquires different feature point category sets; a data fusion module acquires a static target instance segmentation image and a dynamic target instance segmentation image; a visual front-end module acquires the pose of the depth camera and a landmark point set in three-dimensional space; a back-end optimization module acquires a globally optimal depth camera pose and landmark points; and a semantic map module acquires a semantic point cloud map.
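The epipolar constraint mentioned above has a standard formulation, sketched here since the page does not reproduce the patent's exact formula: for a feature match x1 ↔ x2 between two frames and fundamental matrix F, a static point should lie close to the epipolar line l2 = F·x1 in the second image, so a large point-to-line distance flags the match as potentially dynamic. The pixel threshold below is an assumed value, not from the patent.

```python
import numpy as np

# Epipolar-constraint check for classifying feature matches as
# static/dynamic. Standard geometry, not the patent's exact formula.

def epipolar_distance(x1, x2, F):
    """Distance (pixels) of x2 from the epipolar line of x1 under F."""
    x1 = np.append(np.asarray(x1, dtype=float), 1.0)  # homogeneous coords
    x2 = np.append(np.asarray(x2, dtype=float), 1.0)
    line = F @ x1                       # epipolar line in image 2: (a, b, c)
    return abs(x2 @ line) / np.hypot(line[0], line[1])

def is_dynamic(x1, x2, F, thresh=1.0):
    """Flag a match as dynamic if it violates the epipolar constraint.

    thresh is an assumed pixel tolerance; a real system would tune it
    and combine this test with the depth constraint.
    """
    return epipolar_distance(x1, x2, F) > thresh
```

For example, under a pure sideways camera translation the epipolar lines are horizontal, so a tracked point whose row coordinate drifts between frames violates the constraint and is classified as dynamic.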

Description

Technical field [0001] The invention belongs to the technical field of computer vision, and further relates to a semantic SLAM system and method based on joint constraints, which can be used for camera pose estimation and semantic map construction in complex, highly dynamic environments. Background technique [0002] Simultaneous localization and mapping (SLAM) plays an important role in the autonomous navigation and obstacle avoidance of unmanned systems. Over the past three decades, SLAM systems have developed rapidly; their main goal is autonomous exploration of unknown environments by unmanned systems, building a map of the environment while accurately localizing within it. However, the maps built by traditional SLAM systems contain only low-level geometric features of the environment, such as points, lines, and surfaces. For future unmanned systems, maps containing only simple spatial information are diffic...

Claims


Application Information

IPC(8): G06T7/73; G06K9/00; G06K9/34
CPC: G06T7/73; G06T2207/10028; G06T2207/10024; G06T2207/20081; G06V20/40; G06V20/10; G06V10/267; Y02T10/40
Inventors: 韩红, 王毅飞, 张齐驰, 唐裕亮, 迟勇欣, 范迎春
Owner XIDIAN UNIV