Point cloud semantic map construction method based on deep learning and laser radar

A lidar and semantic map technology, applied to 3D modeling, neural architectures, image data processing, etc.; it addresses problems such as cumbersome workflows, complicated calibration procedures, and limited applicability, and achieves strong robustness and a simple process.

Active Publication Date: 2018-08-17
GUANGZHOU WEIMOU MEDICAL INSTR CO LTD

AI Technical Summary

Problems solved by technology

Constructing static road networks and dynamic obstacle maps with multiple sensors requires a series of calibration operations to fuse the sensors in advance; this calibration process is relatively complicated and introduces errors.
At the same time, GPS sensors cannot be used indoors and high-definition satellite images are difficult to obtain, so building static road networks and obstacle maps from multiple sensors simultaneously is cumbersome and poorly suited to indoor scenarios or wider adoption.
In the prior art, the semantic annotation of a map depends on manual work, and each map must be labeled separately, which creates a large workload when constructing large-scale semantic maps.




Detailed Description of the Embodiments

[0034] The accompanying drawings are for illustrative purposes only and should not be construed as limiting this patent. To better illustrate this embodiment, certain components in the drawings may be omitted, enlarged, or reduced, and do not represent the dimensions of the actual product. Those skilled in the art will understand that some well-known structures and their descriptions may be omitted from the drawings. The positional relationships depicted in the drawings are likewise for illustrative purposes only and should not be construed as limiting this patent.

[0035] The purpose of the present invention is to provide a point cloud semantic map construction method based on deep learning and laser radar. The present invention uses a lightweight deep convolutional network, specially designed for the semantic segmentation of laser radar point clouds, to carry out real-time semantic annotation on the preprocessed laser radar point cloud to obtain t...
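For illustration only (the patent's actual network design is not disclosed in this excerpt), the sketch below shows a minimal lightweight fully convolutional network in PyTorch that predicts a semantic class for every cell of a range-image projection of a lidar scan; the input channels, layer sizes, and class count are assumed values, not details from the patent.

```python
# Minimal sketch, NOT the patent's network: a small fully convolutional
# encoder-decoder that predicts a semantic class for every pixel of a
# range-image projection of a lidar scan (assumed input layout).
import torch
import torch.nn as nn

class LightweightSegNet(nn.Module):
    def __init__(self, in_channels: int = 5, num_classes: int = 8):
        super().__init__()
        # Downsample twice, then upsample back to the input resolution.
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(32, num_classes, 4, stride=2, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))  # per-pixel class logits

# Example: a hypothetical 64 x 512 range image with 5 channels (x, y, z, range, intensity)
logits = LightweightSegNet()(torch.randn(1, 5, 64, 512))
print(logits.shape)  # torch.Size([1, 8, 64, 512])
```

Per-cell predictions of this kind can be mapped back to the original 3-D points through the projection indices, which is one common way to obtain per-point semantic labels; whether the patent uses such a projection is an assumption here.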



Abstract

The invention relates to the technical field of intelligent semantic maps, and more particularly to a point cloud semantic map construction method based on deep learning and laser radar. The method comprises: step one, constructing a deep convolutional neural network; step two, preprocessing the laser radar point cloud; step three, training the deep convolutional neural network model; step four, inputting the laser radar point cloud into the deep convolutional neural network model to obtain a semantic annotation tag for each point in the point cloud; step five, carrying out real-time point cloud map construction using the semantically annotated laser radar point cloud to obtain a point cloud semantic map; and step six, after the point cloud semantic map is constructed, carrying out global optimization and correction of the semantic information in the point cloud map by voting based on a sliding window.
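As a rough illustration of step six, the sketch below shows one way a sliding-window majority vote over per-point labels could correct isolated misclassifications after the map is built; the window size, the assumption that points are processed in acquisition order, and the function name are illustrative choices, not details from the patent.

```python
# Hypothetical sketch of step six: sliding-window majority voting over
# per-point semantic labels. Window size and point ordering are assumptions.
import numpy as np

def sliding_window_vote(labels: np.ndarray, window: int = 50) -> np.ndarray:
    """Smooth a 1-D array of integer class labels with a majority vote
    taken over a sliding window centred on each point."""
    n = labels.shape[0]
    smoothed = labels.copy()
    half = window // 2
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        counts = np.bincount(labels[lo:hi])
        smoothed[i] = np.argmax(counts)  # most frequent label in the window wins
    return smoothed

# Example: isolated mislabels inside a homogeneous region are corrected.
raw = np.array([1, 1, 1, 2, 1, 1, 1, 1, 3, 1, 1, 1])
print(sliding_window_vote(raw, window=5))  # -> all ones
```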

Description

Technical field

[0001] The present invention relates to the technical field of intelligent semantic maps, and more specifically, to a point cloud semantic map construction method based on deep learning and laser radar.

Background technique

[0002] Lidar point cloud maps play an important role in the perception and motion planning of autonomous robots (including but not limited to autonomous vehicles, robots, and unmanned aerial vehicles). There are many methods for generating point cloud maps from lidar, and such maps typically consist of large numbers of discrete points. A lidar point cloud map that lacks semantic information is therefore difficult for the human eye to interpret, which hinders manual post-processing. At the same time, the lack of semantic information in the map makes it difficult for engineers to formulate more complex motion planning for robots. At present, there has been research on the semantic segmentation of objec...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01S17/89, G06N3/04, G06T17/05
CPC: G06T17/05, G01S17/89, G06N3/045
Inventors: 黄凯 (Huang Kai), 张文权 (Zhang Wenquan), 杨俊杰 (Yang Junjie), 康德开 (Kang Dekai), 李博洋 (Li Boyang), 朱笛 (Zhu Di)
Owner: GUANGZHOU WEIMOU MEDICAL INSTR CO LTD