
Sensor fusion depth reconstruction data driving method based on attention mechanism

A data-driven depth reconstruction technology, applied in image data processing, instruments, image analysis, etc. It addresses problems such as the inability of existing methods to meet real-time application requirements, and achieves the effects of improved data fusion quality, improved image visual quality, and reduced manual intervention.

Pending Publication Date: 2022-01-28
Owner: NANJING UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

The first and second methods use linear SPAD detectors and single-point SPAD detectors respectively, and must scan the scene to image it; obtaining the depth information of a scene often takes several hours, which cannot meet the real-time requirements of more and more applications. The third method mainly targets depth super-resolution: important features must be selected from the detector's raw data in advance, i.e. the raw data are first denoised and only then fed into the network, which adds preprocessing steps, and the network itself performs upsampling only.


Examples


Embodiment Construction

[0031] A sensor fusion depth reconstruction data-driven method based on an attention mechanism, suitable for photon-counting 3D imaging lidar systems, comprises the following steps:

[0032] The first step is to construct a sensor fusion network based on the attention mechanism and to train it;

[0033] In a further embodiment, the attention-mechanism-based sensor fusion network includes a feature extraction module and a fusion reconstruction module. Its main function is to denoise and upsample the input low-resolution, noisy depth data under the guidance of high-resolution intensity information. The feature extraction module extracts multi-scale features from the SPAD measurement data and the intensity data, so that the network can learn rich hierarchical features at different scales and better adapt to both fine and large-scale upsampling; from the multi-scale features, the feature extraction module obtains the corresponding intensity features and depth features...
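The paragraph above describes the two modules only at a functional level, so the following is a minimal, hypothetical PyTorch sketch of that kind of architecture rather than the network actually disclosed: multi-scale feature extraction from the low-resolution SPAD histogram cube and the high-resolution intensity image, channel-attention fusion of the two feature streams, and a reconstruction head that denoises and upsamples by a factor of four. All layer widths, the 64-bin histogram depth, and the names FusionDepthNet and ChannelAttentionFusion are illustrative assumptions.

```python
# Minimal PyTorch sketch of an attention-guided sensor fusion network.
# Hypothetical layer sizes; the patent does not disclose the exact architecture.
import torch
import torch.nn as nn

class ChannelAttentionFusion(nn.Module):
    """Re-weights concatenated depth/intensity features with channel attention."""
    def __init__(self, channels):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),               # global context per channel
            nn.Conv2d(channels, channels // 4, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // 4, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, depth_feat, intensity_feat):
        fused = torch.cat([depth_feat, intensity_feat], dim=1)
        return fused * self.gate(fused)            # attention-weighted fusion

class FusionDepthNet(nn.Module):
    def __init__(self, hist_bins=64, feat=32, scale=4):
        super().__init__()
        # Feature extraction from the low-res TCSPC histogram cube (32 x 32)
        self.depth_enc = nn.Sequential(
            nn.Conv2d(hist_bins, feat, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat, feat, 3, padding=2, dilation=2), nn.ReLU(inplace=True),
        )
        # Feature extraction from the high-res intensity image, brought to low-res scale
        self.intensity_enc = nn.Sequential(
            nn.Conv2d(1, feat, 3, stride=scale, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat, feat, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.fusion = ChannelAttentionFusion(2 * feat)
        # Fusion reconstruction module: denoise + 4x upsampling to a depth map
        self.reconstruct = nn.Sequential(
            nn.Conv2d(2 * feat, feat * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),                # 4x spatial upsampling
            nn.Conv2d(feat, 1, 3, padding=1),
        )

    def forward(self, histogram, intensity):
        # histogram: (B, hist_bins, 32, 32); intensity: (B, 1, 128, 128)
        d = self.depth_enc(histogram)
        i = self.intensity_enc(intensity)
        return self.reconstruct(self.fusion(d, i))  # (B, 1, 128, 128) depth map
```

With the shapes used in the abstract, `net = FusionDepthNet(); net(torch.randn(1, 64, 32, 32), torch.randn(1, 1, 128, 128))` returns a (1, 1, 128, 128) depth map.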



Abstract

The invention discloses a sensor fusion depth reconstruction data-driven method based on an attention mechanism. In the method, a convolutional neural network is designed for a SPAD array detector with a resolution of 32 × 32, and a low-resolution TCSPC histogram is mapped to a high-resolution depth map under the guidance of an intensity map; the network extracts input features in a multi-scale manner and fuses the depth data and intensity data with an attention model. In addition, a loss function combination suitable for a network that processes TCSPC histogram data is designed. The method improves the spatial resolution of the raw depth data by a factor of four; its depth reconstruction performance has been verified on both simulated and collected data, and it outperforms other algorithms in visual quality and quantitative metrics.
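For orientation on how a depth value is read out of TCSPC histogram data, the sketch below uses a differentiable soft-argmax over the time bins: each pixel's histogram is converted to an expected time of flight and then to depth. The 64-bin size, 100 ps bin width, and temperature parameter are illustrative assumptions, not values disclosed in the patent.

```python
# Illustration only: recovering depth from a TCSPC histogram via soft-argmax.
# Bin width and array sizes are example values, not the patent's parameters.
import torch

C = 3e8                      # speed of light, m/s
BIN_WIDTH = 100e-12          # example TCSPC bin width: 100 ps

def soft_argmax_depth(hist, tau=1.0):
    """hist: (B, T, H, W) photon counts -> (B, H, W) depth in metres.

    A differentiable alternative to argmax, so it can sit at the end of a
    network and be trained with depth-domain losses.
    """
    B, T, H, W = hist.shape
    weights = torch.softmax(hist / tau, dim=1)                   # (B, T, H, W)
    bins = torch.arange(T, dtype=hist.dtype, device=hist.device)
    expected_bin = (weights * bins.view(1, T, 1, 1)).sum(dim=1)  # (B, H, W)
    tof = expected_bin * BIN_WIDTH                               # round-trip time
    return tof * C / 2.0                                         # metres

# Example shapes matching the abstract: 32 x 32 SPAD histograms in, while a
# fusion network (e.g. the FusionDepthNet sketch above) would output 128 x 128.
hist = torch.poisson(torch.rand(1, 64, 32, 32) * 2.0)            # noisy counts
low_res_depth = soft_argmax_depth(hist)                          # (1, 32, 32)
```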

Description

Technical field

[0001] The invention belongs to data-driven technology, and in particular relates to a sensor fusion depth reconstruction data-driven method based on an attention mechanism.

[0002] Technical background

[0003] Inferring correct depth information from a perceived scene is critical for many applications, such as autonomous driving, virtual reality, augmented reality, and robotics. Lidar is the leading technology in depth imaging. At present, most lidar systems adopt a single-point / scanning scheme, using a coaxially aligned laser diode and a single-photon detector: the laser emits pulses, and the detector time-stamps the photons reflected back from the scene. Although a scanning lidar system can obtain relatively accurate depth information, its acquisition speed is slow, and it usually takes several hours to acquire one scene. However, more and more applications require fast scene acquisition. Under this demand, single pho...
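To make the time-stamping step above concrete, here is a small NumPy sketch of how per-pixel photon arrival times accumulate into the TCSPC histogram that the reconstruction network consumes. The bin count, bin width, and simulated signal and background levels are example values only.

```python
# Illustrative sketch of TCSPC histogram accumulation for one pixel.
# Bin count, bin width and the simulated photon stream are example values only.
import numpy as np

N_BINS = 64
BIN_WIDTH = 100e-12          # 100 ps bins -> ~0.96 m unambiguous range here

def accumulate_histogram(timestamps, n_bins=N_BINS, bin_width=BIN_WIDTH):
    """Bin photon time-of-arrival stamps (seconds) into a TCSPC histogram."""
    bins = np.clip((timestamps / bin_width).astype(int), 0, n_bins - 1)
    hist = np.zeros(n_bins, dtype=np.int64)
    np.add.at(hist, bins, 1)              # count photons per time bin
    return hist

# Simulate one pixel: a few signal photons jittered around the true return
# time, buried in uniformly distributed background/dark counts.
rng = np.random.default_rng(0)
true_tof = 3.2e-9                                          # ~0.48 m target
signal = rng.normal(true_tof, 60e-12, size=20)             # jittered returns
background = rng.uniform(0, N_BINS * BIN_WIDTH, size=200)  # noise photons
hist = accumulate_histogram(np.concatenate([signal, background]))
print("peak bin:", hist.argmax(),
      "-> depth ~", hist.argmax() * BIN_WIDTH * 3e8 / 2, "m")
```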


Application Information

IPC (IPC8): G06T7/521; G06T7/55; G06T7/30; G06T7/13; G06T5/00; G06T5/40; G06T5/50; G06N3/04
CPC: G06T7/521; G06T7/55; G06T7/30; G06T7/13; G06T5/40; G06T5/50; G06T2207/20221; G06N3/045; G06T5/70
Inventors: 何伟基, 蒋筱朵, 张闻文, 陈钱, 邹燕
Owner: NANJING UNIV OF SCI & TECH