Three-dimensional laser radar navigation method and equipment in glass scene based on deep learning

A three-dimensional laser radar navigation technology, applied in the field of mobile robot navigation, which can solve problems such as difficulty in matching laser radar data with maps, inability to obtain glass information, and perceptual confusion caused by glass.

Active Publication Date: 2020-11-24
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

However, the normal distributions transform has difficulty matching lidar data to maps in glass environments due to the perceptual confusion caused by glass.
However, existing localization systems cannot efficiently obtain glass information from voxel maps, which degrades localization performance.




Embodiment Construction

[0084] In order to make the objectives, technical solutions and advantages of the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention, not to limit it. In addition, the technical features involved in the various embodiments of the present invention described below may be combined with each other as long as they do not conflict.

[0085] Figure 1 shows the implementation flow of the entire navigation algorithm of the present invention: in the offline training stage, the deep neural network for glass recognition and the optical-characteristic deep neural network must be built and trained, while in the online navigation stage, a spatial voxel map of the glass environment must be built, and real-tim...
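To make this two-stage structure concrete, the following is a minimal Python sketch of the pipeline described in paragraph [0085]. Every name in it (GlassRecognitionNet, OpticalCharacteristicNet, offline_training_stage, online_navigation_stage, and the voxel-map layout) is a hypothetical placeholder chosen for illustration, not the patent's actual implementation.

```python
# Minimal sketch of the two-stage pipeline outlined in paragraph [0085].
# All names below (GlassRecognitionNet, OpticalCharacteristicNet, the voxel-map
# layout, etc.) are hypothetical placeholders, not the patent's implementation.
import numpy as np


class GlassRecognitionNet:
    """Assumed network: 3D lidar scan -> per-point glass probability."""

    def predict(self, scan: np.ndarray) -> np.ndarray:
        raise NotImplementedError  # trained in the offline stage


class OpticalCharacteristicNet:
    """Assumed network: beam/glass incidence geometry -> transmission probability."""

    def predict(self, cos_incidence: np.ndarray) -> np.ndarray:
        raise NotImplementedError  # trained in the offline stage


def offline_training_stage(training_data):
    """Offline stage: build and train the two deep neural networks."""
    glass_net = GlassRecognitionNet()
    optical_net = OpticalCharacteristicNet()
    # ... training loops over `training_data` would go here ...
    return glass_net, optical_net


def online_navigation_stage(glass_net, optical_net, scan_stream):
    """Online stage: maintain a spatial voxel map of the glass environment and
    localize against it in real time (matching step sketched after the Abstract)."""
    # Each voxel stores an occupancy value, a glass probability, and a glass normal.
    voxel_map = {}
    for scan in scan_stream:
        glass_prob = glass_net.predict(scan)  # per-point glass probability
        # 1) fuse `scan`, `glass_prob`, and estimated glass normals into `voxel_map`;
        # 2) ray-cast a simulated point cloud from the map, screened by `optical_net`;
        # 3) register the real scan against it with NDT to obtain the absolute pose.
        ...
```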



Abstract

The invention discloses a three-dimensional laser radar navigation method and equipment in a glass scene based on deep learning. The navigation method comprises the following steps: performing offline training to obtain a glass recognition deep neural network and an optical characteristic deep neural network; in the navigation process, inputting the glass probability recognized by the glass recognition deep neural network and the calculated glass direction into a space voxel map in real time; in the space voxel map containing the glass probability and the glass normal vector, simulating the laser beams emitted by the laser radar by adopting a light projection method, obtaining the probability that each laser beam transmits the glass based on the optical characteristic deep neural network so as to screen radar points, and generating a simulated point cloud set; and finally, based on the real three-dimensional radar data and the simulated point cloud set, calculating the absolute pose of the robot in the space voxel map by adopting the normal distribution transformation. According to the navigation method, the perceptual confusion of laser-radar-based positioning can be avoided, and the three-dimensional navigation precision of a mobile robot in a glass scene is improved.
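As a sketch of the matching step summarized in this abstract, the code below shows one plausible reading of it: ray casting through a voxel map that stores glass probability and glass normals, screening each beam by the transmission probability from the optical-characteristic network, and registering the real scan against the simulated cloud with NDT. The `RayHit` record, the injected `cast_ray` and `ndt_register` callables, and the 0.5 thresholds are all assumptions for illustration, not the patent's actual data structures, APIs, or parameters.

```python
# Sketch of the online matching step: simulate the lidar by ray casting through a
# voxel map storing glass probability and glass normals, screen beams using the
# optical-characteristic network's transmission probability, then register the
# real scan against the simulated cloud with NDT. `cast_ray` and `ndt_register`
# are assumed callables; the 0.5 thresholds are illustrative, not from the patent.
from dataclasses import dataclass
from typing import Callable, List, Optional
import numpy as np


@dataclass
class RayHit:
    point: np.ndarray          # 3D hit point in map coordinates
    glass_probability: float   # glass probability stored in the hit voxel
    glass_normal: np.ndarray   # glass normal vector stored in the hit voxel


def simulate_point_cloud(pose: np.ndarray,
                         beam_directions: List[np.ndarray],
                         cast_ray: Callable[[np.ndarray, np.ndarray], Optional[RayHit]],
                         transmission_prob: Callable[[float], float],
                         transmit_threshold: float = 0.5) -> np.ndarray:
    """Simulate each lidar beam from `pose`; drop hits the glass likely transmits."""
    simulated = []
    for direction in beam_directions:
        hit = cast_ray(pose, direction)
        if hit is None:
            continue
        if hit.glass_probability > 0.5:
            # Incidence geometry between the beam and the stored glass normal.
            cos_incidence = abs(float(np.dot(direction, hit.glass_normal)))
            if transmission_prob(cos_incidence) > transmit_threshold:
                continue  # beam would pass through the glass: screen this point out
        simulated.append(hit.point)
    return np.asarray(simulated)


def estimate_absolute_pose(real_scan: np.ndarray,
                           initial_pose: np.ndarray,
                           beam_directions: List[np.ndarray],
                           cast_ray,
                           transmission_prob,
                           ndt_register: Callable[..., np.ndarray]) -> np.ndarray:
    """Register the real 3D scan against the simulated cloud with an NDT solver."""
    simulated = simulate_point_cloud(initial_pose, beam_directions,
                                     cast_ray, transmission_prob)
    return ndt_register(source=real_scan, target=simulated, initial_guess=initial_pose)
```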

Description

Technical field

[0001] The invention belongs to the field of mobile robot navigation, and relates to a deep learning-based three-dimensional laser radar navigation method and equipment in a glass scene, and more specifically, to a deep neural network-based mobile robot three-dimensional laser radar navigation method in a glass scene.

Background technique

[0002] LiDAR-based navigation is one of the most reliable and accurate ways to obtain the pose of a robot. Recently, glass walls have gradually become a main element of buildings such as museums and shopping malls in which mobile robots navigate. However, unlike opaque objects, glass has reflective properties that can lead to blurred or erroneous lidar perception. Therefore, it is difficult to apply lidar-based localization to scenes with glass walls. In order to improve the positioning accuracy of the robot under various environmental conditions, it is necessary to develop a reliable lidar-based positioning me...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G01C21/20; G01S17/931; G01S17/46; G06N3/04; G06N3/08
CPC: G01C21/20; G01S17/931; G01S17/46; G06N3/08; G06N3/045
Inventors: 孟杰, 王书亭, 谢远龙, 蒋立泉, 吴天豪, 孙浩东, 李鹏程, 林鑫, 刘伦洪, 吴昊
Owner: HUAZHONG UNIV OF SCI & TECH