
A point cloud semantic map construction method based on deep learning and lidar

A technology relating to lidar and semantic maps, applied in neural architectures, 3D modeling, image data processing, etc. It addresses problems such as limited applicability, cumbersome workflows, and complicated calibration, and achieves accurate, robust, and simple map construction.

Active Publication Date: 2021-06-25
GUANGZHOU WEIMOU MEDICAL INSTR CO LTD

AI Technical Summary

Problems solved by technology

When multiple sensors are used to construct static road networks and dynamic obstacle maps, a series of sensor calibration operations must be performed in advance to fuse the sensors. This calibration process is relatively complicated and introduces errors.
At the same time, GPS sensors cannot be used indoors and high-definition satellite images are difficult to obtain, so building static road networks and obstacle maps with multiple sensors is cumbersome, unsuitable for indoor scenes, and hard to generalize.
In the prior art, semantic annotation of the map depends on manual labour: each map must be annotated separately by hand, which is a heavy workload when constructing large-scale semantic maps.

Method used




Embodiment Construction

[0034] The accompanying drawings are for illustrative purposes only and should not be construed as limiting this patent. To better illustrate this embodiment, certain components in the drawings are omitted, enlarged, or reduced and do not represent the size of the actual product. Those skilled in the art will understand that some well-known structures and their descriptions may be omitted from the drawings. The positional relationships described in the drawings are for illustrative purposes only and should not be construed as limiting this patent.

[0035] The purpose of the present invention is to provide a point cloud semantic map construction method based on deep learning and lidar. The present invention uses a lightweight deep convolutional network specially designed for semantic segmentation of lidar point clouds to perform real-time semantic annotation on the preprocessed lidar point cloud to obtain t...
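The published text does not disclose the concrete network architecture or the preprocessing steps. As a purely illustrative sketch, the following PyTorch snippet shows one common way such a pipeline is realised: the scan is projected to a range image and passed through a small fully convolutional network that predicts one class per pixel, which maps back to one label per point. The projection parameters, the `TinySegNet` layers, and the class count are assumptions, not the patented design.

```python
# Hypothetical sketch only: the patent does not publish its network or
# preprocessing. This projects a lidar scan to a range image and labels
# every point with a small (untrained) fully convolutional network.
import numpy as np
import torch
import torch.nn as nn


def spherical_projection(points, h=64, w=1024, fov_up=3.0, fov_down=-25.0):
    """Project an (N, 4) scan of x, y, z, intensity into a (5, H, W) range image."""
    x, y, z, remission = points[:, 0], points[:, 1], points[:, 2], points[:, 3]
    depth = np.linalg.norm(points[:, :3], axis=1) + 1e-6
    yaw = np.arctan2(y, x)                                   # azimuth
    pitch = np.arcsin(z / depth)                             # elevation
    fov_up_r, fov_down_r = np.radians(fov_up), np.radians(fov_down)
    u = 0.5 * (1.0 - yaw / np.pi) * w                        # column index
    v = (1.0 - (pitch - fov_down_r) / (fov_up_r - fov_down_r)) * h  # row index
    u = np.clip(np.floor(u), 0, w - 1).astype(np.int32)
    v = np.clip(np.floor(v), 0, h - 1).astype(np.int32)
    image = np.zeros((5, h, w), dtype=np.float32)
    image[:, v, u] = np.stack([depth, x, y, z, remission])   # later points overwrite earlier ones
    return image, (v, u)                                     # indices map pixel labels back to points


class TinySegNet(nn.Module):
    """Deliberately small encoder-decoder; a stand-in for the 'lightweight' network."""

    def __init__(self, n_classes=20):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(5, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 2, stride=2), nn.ReLU(),
            nn.Conv2d(32, n_classes, 1),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))


# Usage: label every point of one (here random, placeholder) scan.
scan = np.random.rand(120_000, 4).astype(np.float32)
image, (v, u) = spherical_projection(scan)
net = TinySegNet().eval()
with torch.no_grad():
    logits = net(torch.from_numpy(image).unsqueeze(0))       # (1, n_classes, H, W)
label_image = logits.argmax(dim=1).squeeze(0).numpy()        # per-pixel class ids
point_labels = label_image[v, u]                             # per-point semantic labels
```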



Abstract

The present invention relates to the technical field of intelligent semantic maps, and more specifically, to a point cloud semantic map construction method based on deep learning and lidar. The method includes: step 1: constructing a deep convolutional neural network; step 2: preprocessing the lidar point cloud; step 3: training the deep convolutional neural network model; step 4: inputting the lidar point cloud into the deep convolutional neural network model to obtain the label of each point in the lidar point cloud, i.e. a point cloud with semantic annotation; step 5: using the semantically annotated lidar point cloud to construct a real-time point cloud map, obtaining the point cloud semantic map; step 6: after the point cloud semantic map is constructed, globally optimizing and correcting the semantic information in the point cloud map using voting based on sliding windows.
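The abstract does not explain how the sliding-window voting in step 6 works; one plausible reading is a local majority vote over the accumulated map. The sketch below assumes a fixed-radius spatial window and a simple plurality vote; the function name `sliding_window_vote`, the radius, and the vote threshold are illustrative choices, not details from the patent.

```python
# Hypothetical sketch of the step-6 correction: a majority vote inside a
# sliding spatial window over the finished point cloud map. The window shape
# and vote rule are assumptions; the patent only names "voting based on
# sliding windows".
import numpy as np
from scipy.spatial import cKDTree


def sliding_window_vote(points, labels, window=0.5, min_votes=3):
    """points: (N, 3) map coordinates; labels: (N,) integer class ids.
    Each point adopts the most common label among neighbours within
    `window` metres, provided enough neighbours exist."""
    tree = cKDTree(points)
    corrected = labels.copy()
    for i, p in enumerate(points):
        idx = tree.query_ball_point(p, r=window)
        if len(idx) < min_votes:
            continue                          # too few neighbours: keep the original label
        votes = np.bincount(labels[idx])
        corrected[i] = votes.argmax()         # plurality vote inside the window
    return corrected


# Toy usage: an isolated mislabelled point is corrected by its neighbours.
pts = np.array([[0, 0, 0], [0.1, 0, 0], [0, 0.1, 0], [0.1, 0.1, 0]], dtype=float)
lbl = np.array([1, 1, 1, 2])
print(sliding_window_vote(pts, lbl))          # -> [1 1 1 1]
```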

Description

Technical Field
[0001] The present invention relates to the technical field of intelligent semantic maps, and more specifically, to a point cloud semantic map construction method based on deep learning and lidar.
Background Technology
[0002] Lidar point cloud maps play an important role in the perception and motion planning of autonomous robots (including but not limited to autonomous vehicles, mobile robots, and unmanned aerial vehicles). There are many methods for generating point cloud maps from lidar data. Lidar point cloud maps typically consist of large numbers of discrete points. A lidar point cloud map that lacks semantic information is therefore harder for the human eye to interpret, which is not conducive to manual post-processing. At the same time, the lack of semantic information in the map makes it difficult for engineers to formulate more complex motion planning for robots. At present, there have been studies on semantic segmentation of objec...

Claims


Application Information

Patent Type & Authority Patents(China)
IPC IPC(8): G01S17/894G06N3/04G06T17/05
CPCG06T17/05G01S17/89G06N3/045
Inventor 黄凯张文权杨俊杰康德开李博洋朱笛
Owner GUANGZHOU WEIMOU MEDICAL INSTR CO LTD