
Adaptive multi-sensor data fusion method and system based on mutual information

A multi-sensor data fusion technology, applied in image data processing, instruments, biological neural network models, etc., which addresses problems such as the lack of quantitative tools, insufficient adaptability to the data, and the inability to adjust fusion weights across different sensors.

Active Publication Date: 2021-01-26
TSINGHUA UNIV
View PDF · 9 Cites · 4 Cited by

AI Technical Summary

Problems solved by technology

However, they are either not adaptive enough to the data or computationally complex.
The introduction of mutual information overcomes several difficulties of existing methods: 1) the lack of a reasonable quantification tool backed by mathematical methods; 2) the tendency of the network not to converge during training; 3) the inability to adaptively adjust the fusion weights for inputs from different sensors, i.e. to adjust them according to data quality and information content.
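To make the "reasonable quantification tool" concrete, the following minimal sketch estimates mutual information I(X;Y) with a plug-in joint-histogram estimator. The patent does not specify its estimator; the histogram approach and the `bins` parameter are assumptions for illustration only.

```python
import numpy as np

def mutual_information(x, y, bins=32):
    """Estimate I(X;Y) between two 1-D signals via a joint histogram
    (a standard plug-in estimator; illustrative, not the patent's own)."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()              # joint distribution p(x, y)
    px = pxy.sum(axis=1, keepdims=True)    # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)    # marginal p(y)
    nz = pxy > 0                           # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

# A signal shares much more information with itself than with
# independent noise, which is what makes MI usable as a quality score:
rng = np.random.default_rng(0)
a = rng.normal(size=10_000)
b = rng.normal(size=10_000)
mi_self, mi_noise = mutual_information(a, a), mutual_information(a, b)
```

Here `mi_self` is large (close to the entropy of the binned signal) while `mi_noise` is near zero, so MI ranks informative inputs above degraded ones.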

Method used



Examples


Embodiment 1

[0100] As shown in Figure 1, Embodiment 1 of the present invention provides an adaptive multi-sensor data fusion method based on mutual information, taking RGB images and laser radar point cloud data as examples. The method specifically includes the following steps:

[0101] Step 1) Obtain the RGB image and lidar point cloud of the same road surface, specifically including:

[0102] Step 101) Obtain the RGB image G from the vehicle-mounted camera;

[0103] The road surface image information is collected by a forward-facing monocular camera installed on a moving vehicle. The camera captures the road surface directly ahead of the vehicle's driving direction, so the collected road surface image is a perspective view of the scene in front of the vehicle and above the road surface.

[0104] Step 102) O...

Embodiment 2

[0148] As shown in Figure 4, Embodiment 2 of the present invention provides an adaptive multi-sensor data fusion method based on mutual information for time-series data, taking RGB images and laser radar point cloud data as examples. The method specifically includes the following steps:

[0149] Step 1) Obtain the RGB image and lidar point cloud of the same road surface;

[0150] The RGB images are required to be sequential; that is, the previous frame must have a certain continuity with the current frame, and the expected features must be consistent. We consider the images temporal when the normalized mutual information between any two consecutive frames is greater than 0.5. The same requirement applies to the lidar point cloud data.
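The NMI > 0.5 continuity check above can be sketched as follows. The patent does not state which normalization it uses; this sketch assumes the common form NMI(A,B) = 2·I(A;B) / (H(A)+H(B)), which lies in [0, 1] and equals 1 for identical frames.

```python
import numpy as np

def normalized_mutual_information(img_a, img_b, bins=32):
    """NMI(A,B) = 2*I(A;B) / (H(A)+H(B)); the continuity criterion in
    the text treats NMI > 0.5 between consecutive frames as 'temporal'.
    The normalization choice is an assumption, not stated in the patent."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    hx = -np.sum(px[px > 0] * np.log(px[px > 0]))   # marginal entropy H(A)
    hy = -np.sum(py[py > 0] * np.log(py[py > 0]))   # marginal entropy H(B)
    hxy = -np.sum(pxy[pxy > 0] * np.log(pxy[pxy > 0]))  # joint entropy
    mi = hx + hy - hxy
    return 2 * mi / (hx + hy)

# A frame compared with a slightly perturbed copy of itself stays well
# above the 0.5 threshold; an unrelated frame falls far below it.
frame_t = np.random.default_rng(1).integers(0, 256, (64, 64)).astype(float)
frame_t1 = frame_t + np.random.default_rng(2).normal(0, 5, (64, 64))
frame_other = np.random.default_rng(3).integers(0, 256, (64, 64)).astype(float)
nmi_same = normalized_mutual_information(frame_t, frame_t)
nmi_next = normalized_mutual_information(frame_t, frame_t1)
nmi_other = normalized_mutual_information(frame_t, frame_other)
```

Consecutive real frames of the same road scene behave like `frame_t1` here, so the threshold cleanly separates a continuous sequence from a scene cut.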

[0151] All other steps are consistent with Embodiment 1.

[0152] Step 2) is consistent with Embodiment 1;

[0153] Step 3) Extract expected feature Y;

[0154] For time-series data, if it is assumed that the fusion network has a good in...

Embodiment 3

[0162] Based on the methods of Embodiments 1 and 2, Embodiment 3 of the present invention proposes an adaptive multi-sensor data fusion system based on mutual information. The system includes: a camera and a laser radar installed on a vehicle, a preprocessing module, a result output module, a time-series data adaptive adjustment module, and a fusion network; wherein,

[0163] The camera is used to collect RGB images of the road surface;

[0164] The laser radar is used to synchronously collect the point cloud data of the road surface;

[0165] The preprocessing module is used to preprocess point cloud data to obtain dense point cloud data;
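The patent does not detail how the sparse lidar returns are densified. As one crude, hypothetical sketch of such preprocessing, empty cells of a depth map (lidar points already projected onto an image-aligned grid) can be filled by iterated neighbourhood dilation; the function name, grid layout, and fill rule are all assumptions for illustration.

```python
import numpy as np

def densify_depth(sparse, iters=10):
    """Fill empty (zero) cells of a sparse depth map by repeatedly
    copying the maximum valid 3x3 neighbour. A crude densification
    sketch only; the patent's actual preprocessing is not specified."""
    dense = sparse.copy()
    h, w = dense.shape
    for _ in range(iters):
        padded = np.pad(dense, 1)  # zero border so edges have neighbours
        # stack the 9 shifted views of the 3x3 neighbourhood
        stacked = np.stack([padded[i:i + h, j:j + w]
                            for i in range(3) for j in range(3)])
        filled = stacked.max(axis=0)          # any valid neighbour value
        dense = np.where(dense > 0, dense, filled)  # only fill empty cells
    return dense

# One lone return at the grid centre spreads outward until the map is dense:
sparse = np.zeros((16, 16))
sparse[8, 8] = 5.0
dense = densify_depth(sparse)
```

Real pipelines typically use interpolation or learned depth completion instead; the point here is only that the fusion network receives a dense grid rather than scattered points.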

[0166] The result output module is used to input RGB images and dense point cloud data into a pre-established and trained fusion network, and output data fusion results;

[0167] The fusion network is used to calculate the mutual information of the feature tensor of the input data and the expected feature, assign the fusion weight of the in...



Abstract

The invention discloses an adaptive multi-sensor data fusion method based on mutual information. The method comprises the following steps: receiving an RGB image of a road surface acquired by a camera; receiving point cloud data of the road surface synchronously acquired by a laser radar; preprocessing the point cloud data to obtain dense point cloud data; and inputting the RGB image and the dense point cloud data into a pre-established and trained fusion network and outputting a data fusion result, wherein the fusion network is used for calculating the mutual information between the feature tensor of the input data and the expected feature, allocating a fusion weight to the input data according to the mutual information, and outputting a data fusion result according to the fusion weight. According to the method, the information-theoretic tool of mutual information is introduced, and the correlation between the extracted features of the input data and the expected features of the fusion network is calculated, so that data quality and information content can be reasonably and objectively quantified, with a rigorous mathematical method as theoretical support and a degree of interpretability.
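One plausible reading of "allocating a fusion weight according to the mutual information" is a softmax over per-sensor MI scores, so that the more informative sensor dominates the weighted fusion. The patent does not specify the allocation rule, so this sketch is illustrative only; `fusion_weights` and `temperature` are hypothetical names.

```python
import numpy as np

def fusion_weights(mi_scores, temperature=1.0):
    """Turn per-sensor mutual-information scores into normalized
    fusion weights via a softmax. One plausible allocation rule;
    the patent only states that the weights follow the MI values."""
    s = np.asarray(mi_scores, dtype=float) / temperature
    e = np.exp(s - s.max())   # subtract the max for numerical stability
    return e / e.sum()

# E.g. a clear camera feature (MI = 2.0) vs. a degraded lidar
# feature (MI = 0.5): the camera receives the larger weight,
# and the fused feature is the weight-blended combination.
w = fusion_weights([2.0, 0.5])
fused = w[0] * 1.0 + w[1] * 0.0   # toy scalar "features"
```

A `temperature` above 1 softens the allocation toward equal weights, while a small value makes the fusion nearly winner-take-all; either way the weights sum to 1.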

Description

Technical Field

[0001] The invention belongs to the technical field of intelligent driving, and in particular relates to an adaptive multi-sensor data fusion method and system based on mutual information.

Background

[0002] Intelligent driving vehicles need to sense and measure the surrounding environment through a variety of sensors to ensure safe driving. A fusion network can fuse data from different sensors to complete driving tasks, including lane line detection, object detection and tracking, etc. However, the data from different sensors may suffer from various problems. For example, the camera may be affected by image overexposure, rain and fog, or obstructions, while the lidar point cloud may be too sparse or the target too distant. In response to these problems, mutual information calculations can be used to evaluate data quality, and effective information can be weighted by assigning fusion weights to...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00, G06K9/46, G06K9/62, G06N3/02
CPC: G06N3/02, G06V20/588, G06V10/56, G06F18/253, G06F18/214, G06T5/50, G06T2207/10024, G06T2207/10028, G06T2207/20076, G06T2207/20221, G06V20/58, G06V10/811, G06V10/82, G06V10/454, G06T7/521, G06F18/2321
Inventors: 张新钰, 李骏, 李志伟, 邹镇洪, 杜昊宇, 张天雷
Owner TSINGHUA UNIV