Multi-sensor fusion sensing method and system for automatic driving under extreme working condition

A multi-sensor fusion technology for automatic driving, applied in the fields of instruments, character and pattern recognition, computer components, etc. It addresses the problem that the fault tolerance and robustness of fusion systems have not been well resolved.

Active Publication Date: 2021-05-14
NAT UNIV OF DEFENSE TECH

AI Technical Summary

Problems solved by technology

However, a unified fusion theory and an effective generalized fusion model and algorithm have not yet been established, the problem of fault tolerance and robustness in fusion systems has not been well solved, and many practical problems remain in the design of data fusion systems.




Embodiment Construction

[0048] In order to make the purpose, technical solution and advantages of the present application clearer, the present application will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present application, and are not intended to limit the present application.

[0049] In one embodiment, as shown in Figure 1, a multi-sensor fusion perception method for autonomous driving under extreme working conditions is provided, which includes the following steps:

[0050] Step 102: acquire the smoke and dust area in the preset image data, and acquire the corresponding laser point cloud data within the three-dimensional viewing cone (frustum) of the image data.

[0051] Since the visible light camera provides rich texture information, the smoke and dust area can be initially located from the apparent information in the acquired image data, b...
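The frustum association in Step 102 can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: the camera intrinsic matrix `K`, the lidar-to-camera transform `T_cam_from_lidar`, and the 2-D smoke bounding box `bbox` are hypothetical names introduced here for clarity.

```python
import numpy as np

def points_in_frustum(points_lidar, K, T_cam_from_lidar, bbox):
    """Keep lidar points whose camera projection falls inside a 2-D
    smoke/dust bounding box (u_min, v_min, u_max, v_max)."""
    # Transform the N x 3 lidar points into the camera frame (homogeneous).
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_from_lidar @ pts_h.T).T[:, :3]
    in_front = pts_cam[:, 2] > 0  # keep only points ahead of the camera
    # Pinhole projection to pixel coordinates.
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]
    u_min, v_min, u_max, v_max = bbox
    inside = (uv[:, 0] >= u_min) & (uv[:, 0] <= u_max) \
           & (uv[:, 1] >= v_min) & (uv[:, 1] <= v_max)
    return points_lidar[in_front & inside]
```

In this sketch the image-detected smoke region supplies the 2-D box, and the returned points are the lidar returns that lie inside the corresponding 3-D viewing cone.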



Abstract

The invention relates to a multi-sensor fusion sensing method and system for automatic driving under extreme working conditions. The method comprises the following steps: obtaining a smoke/fog area in image data; obtaining the corresponding laser point cloud data within the three-dimensional viewing cone of the image data; projecting the laser point cloud data into the image domain corresponding to the image data; and obtaining the image-domain smoke/fog point cloud data from the intersection of the two. In parallel, radar-domain obstacle point cloud data are obtained from the laser point cloud data. False-obstacle fusion sensing data are then obtained from the intersection of the image-domain smoke/fog point cloud data and the radar-domain obstacle point cloud data, and these are combined with the real obstacle data obtained by the millimeter-wave radar to remove false obstacles. In this method, the image data provide the laser point cloud with the viewing angles where smoke or fog may exist, the laser point cloud data are cross-validated between the image domain and the radar domain and then combined with the millimeter-wave radar, and high-precision automatic-driving scene data are finally obtained.
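The fusion logic described in the abstract can be sketched as follows. This is a simplified sketch, not the patented algorithm: the point-index interfaces, the `gate` distance threshold, and the nearest-neighbour check against millimeter-wave returns are all assumptions introduced for illustration.

```python
import numpy as np

def fuse_false_obstacles(smoke_idx_image, obstacle_idx_radar,
                         mmwave_targets, lidar_points, gate=1.0):
    """Sketch of the abstract's cross-validation:
    1. candidate false obstacles = intersection of image-domain smoke
       point indices and radar-domain obstacle point indices;
    2. a candidate is confirmed as false only if no millimeter-wave
       return (which penetrates smoke) lies within `gate` metres,
       since a nearby radar return indicates a real obstacle."""
    candidates = set(smoke_idx_image) & set(obstacle_idx_radar)
    false_idx = []
    for i in candidates:
        p = lidar_points[i]
        # mmWave radar sees through smoke: if it reports a target near
        # this point, the obstacle is real and must be kept.
        if all(np.linalg.norm(p[:2] - t[:2]) > gate for t in mmwave_targets):
            false_idx.append(i)
    keep = [i for i in range(len(lidar_points)) if i not in false_idx]
    return lidar_points[keep], sorted(false_idx)
```

The returned cleaned point cloud corresponds to the "high-precision automatic-driving scene data" of the abstract, with smoke/fog returns removed.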

Description

Technical field

[0001] The present application relates to the technical field of automatic driving decision-making and planning, and in particular to a multi-sensor fusion sensing method and system for automatic driving under extreme working conditions.

Background technique

[0002] Current self-driving vehicles usually make decisions and plans in a three-dimensional coordinate system, which requires the environment perception system to provide accurate geometric ranging information. Lidar has therefore become the mainstream sensor for current self-driving vehicles owing to its high-precision geometric measurements. However, because of its short wavelength, lidar can hardly penetrate particulate obscurants such as smoke and fog; this easily produces false obstacles that cause the environment perception system to fail, affecting the safety of autonomous vehicles. Compared with lidar, a visible light camera provides rich texture information, but it is difficult to give precise ...


Application Information

  • Patent Type & Authority: Application (China)
  • IPC(8): G06K9/62; G06K9/00
  • CPC: G06V20/56; G06F18/23; G06F18/25
  • Inventors: 方强, 呼晓畅, 孙毅, 徐昕, 张兴龙
  • Owner: NAT UNIV OF DEFENSE TECH