
Time-of-flight depth image iterative optimization method based on convolutional neural network

A convolutional neural network combined with time-of-flight technology, applied in the field of 3D vision. It addresses problems such as the difficulty of encapsulating different error-removal principles in a single end-to-end network, severe deviations, and low reliability and accuracy, and achieves the effects of broadening application prospects, improving accuracy, and removing errors from multiple sources.

Active Publication Date: 2021-08-10
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

One is that an end-to-end CNN architecture cannot, in principle, realize the mapping from the multi-frequency correlation coefficient maps to the true depth map.
The second is that ToF depth map optimization must operate at millimeter precision; depth maps output entirely by a CNN have low reliability and accuracy at pixel-level millimeter precision and are prone to failure cases and severe deviations.
The third is that the error sources of a ToF depth map differ in nature, so different principles are needed to remove them; it is difficult for an end-to-end network to encapsulate these principles and accurately predict the nonlinearly coupled error in a single pass.


Detailed Description of the Embodiments

[0083] The present invention will be described in further detail below in conjunction with the accompanying drawings and specific embodiments.

[0084] As shown in Figure 1, the iterative optimization method for time-of-flight depth images based on a convolutional neural network of the present invention comprises the following steps:

[0085] Step 1: As shown in Figure 2, the correlation coefficient maps obtained by ToF camera imaging are processed with a basic trigonometric transformation and a multi-frequency phase deblurring algorithm to obtain the initial depth map and the reflection intensity map.

[0086] The ToF camera works in a dual-frequency, four-sampling mode. It emits amplitude-modulated continuous waves at two different frequencies, and applies the basic trigonometric transformation and the multi-frequency phase deblurring algorithm to the amplitude-modulated continuous wave at each frequency to obtain an initial depth map and a reflection intensity map.
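As a concrete illustration of Step 1, the sketch below converts dual-frequency, four-sample correlation measurements into a wrapped phase and an amplitude per frequency, and then resolves the wrap ambiguity by comparing candidate depths across the two frequencies. The modulation frequencies, image size, and the brute-force wrap search are illustrative assumptions, not the implementation specified in the patent.

```python
# Minimal sketch of Step 1 under assumed conventions (not the patent's exact code):
# dual-frequency, four-sample ToF correlations -> initial depth map + intensity map.
import numpy as np

C = 299_792_458.0  # speed of light (m/s)

def phase_and_amplitude(q0, q1, q2, q3):
    """Basic trigonometric transform of the four correlation samples
    (0, 90, 180, 270 degree offsets) into wrapped phase and amplitude."""
    phase = np.arctan2(q3 - q1, q0 - q2) % (2.0 * np.pi)      # wrapped to [0, 2*pi)
    amplitude = 0.5 * np.sqrt((q3 - q1) ** 2 + (q0 - q2) ** 2)
    return phase, amplitude

def unwrap_dual_frequency(phi1, phi2, f1, f2, max_wraps=8):
    """Multi-frequency phase deblurring: choose the integer wrap counts whose
    candidate depths agree best between the two modulation frequencies."""
    d1 = C * phi1 / (4.0 * np.pi * f1)      # wrapped depth at frequency 1
    d2 = C * phi2 / (4.0 * np.pi * f2)      # wrapped depth at frequency 2
    r1 = C / (2.0 * f1)                     # unambiguous range at frequency 1
    r2 = C / (2.0 * f2)                     # unambiguous range at frequency 2
    best_depth = np.zeros_like(phi1)
    best_err = np.full(phi1.shape, np.inf)
    for n1 in range(max_wraps):
        for n2 in range(max_wraps):
            cand1 = d1 + n1 * r1
            cand2 = d2 + n2 * r2
            err = np.abs(cand1 - cand2)
            better = err < best_err
            best_err = np.where(better, err, best_err)
            best_depth = np.where(better, 0.5 * (cand1 + cand2), best_depth)
    return best_depth

# Example usage on synthetic correlation samples (4 x H x W per frequency).
h, w = 240, 320
rng = np.random.default_rng(0)
samples_f1 = rng.uniform(0.0, 1.0, size=(4, h, w))
samples_f2 = rng.uniform(0.0, 1.0, size=(4, h, w))
phi1, amp1 = phase_and_amplitude(*samples_f1)
phi2, amp2 = phase_and_amplitude(*samples_f2)
init_depth = unwrap_dual_frequency(phi1, phi2, f1=40e6, f2=70e6)  # assumed frequencies
intensity = 0.5 * (amp1 + amp2)                                   # reflection-intensity map
```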



Abstract

The invention discloses an iterative optimization method for time-of-flight (ToF) depth images based on a convolutional neural network. The method comprises the following steps: applying a basic trigonometric transformation and a multi-frequency phase deblurring algorithm to the correlation coefficient maps obtained by imaging with a multi-frequency amplitude-modulated continuous-wave ToF camera, to obtain an initial depth map and a reflection intensity map of the scene; constructing a convolutional neural network based on iterative optimization, and building a data set by means of computer graphics and three-dimensional reconstruction techniques to train the network and search for its optimal parameters; and inputting the raw ToF correlation measurements, the initial depth map, and the reflection intensity map into the convolutional neural network, where iterative optimization by a multi-stage isomorphic network gradually reduces error influences of various sources and different characteristics, so that the quality of the depth map is improved from coarse to fine.
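The multi-stage isomorphic refinement described in the abstract can be pictured with the following minimal sketch: several stages share one convolutional design, and each stage takes the current depth estimate together with the raw correlation measurements and the reflection intensity map and predicts a residual correction. The layer widths, stage count, and channel layout are assumptions for illustration; the patent's actual network architecture is not reproduced here.

```python
# Illustrative sketch of a multi-stage, weight-isomorphic refinement CNN (assumed design).
import torch
import torch.nn as nn

class RefineStage(nn.Module):
    """One refinement stage; all stages share the same architecture."""
    def __init__(self, in_channels: int, width: int = 32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_channels, width, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(width, 1, 3, padding=1),        # residual depth correction
        )

    def forward(self, depth, feats):
        x = torch.cat([depth, feats], dim=1)
        return depth + self.body(x)                   # refined depth estimate

class IterativeToFRefiner(nn.Module):
    def __init__(self, corr_channels: int = 8, stages: int = 3):
        super().__init__()
        in_ch = 1 + 1 + corr_channels                 # depth + intensity + raw correlations
        self.stages = nn.ModuleList(RefineStage(in_ch) for _ in range(stages))

    def forward(self, init_depth, intensity, corr):
        feats = torch.cat([intensity, corr], dim=1)
        depth = init_depth
        for stage in self.stages:                     # coarse-to-fine iterations
            depth = stage(depth, feats)
        return depth

# Example usage with dummy tensors (batch of 2, 240 x 320 images, 8 correlation channels).
model = IterativeToFRefiner(corr_channels=8, stages=3)
depth0 = torch.rand(2, 1, 240, 320)
intensity = torch.rand(2, 1, 240, 320)
corr = torch.rand(2, 8, 240, 320)
refined = model(depth0, intensity, corr)              # shape (2, 1, 240, 320)
```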

Description

Technical field

[0001] The invention relates to the field of three-dimensional vision, and in particular to an iterative optimization method for time-of-flight depth images based on a convolutional neural network.

Background technique

[0002] Depth acquisition is not only the key to most 3D vision tasks, but also plays an increasingly important role in traditional RGB-based vision tasks such as semantic segmentation and gesture recognition. Previously popular structured light and stereo vision either have too small a ranging range or require scene texture. ToF technology overcomes these shortcomings and has become one of the most promising ways to acquire depth.

[0003] ToF cameras measure distance by measuring the time it takes for a light beam or pulse to travel from the transmitter to the object and back to the receiver. For certain amplitude-modulated continuous-wave ToF cameras, depth is obtained indirectly by measuring the phase difference between the outgoing and incoming waves. Modern ToF camer...
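For reference, the standard phase-to-depth relation behind the amplitude-modulated continuous-wave measurement described above (a textbook relation, not quoted from the patent) is

```latex
d = \frac{c\,\Delta\varphi}{4\pi f}, \qquad d_{\max} = \frac{c}{2f},
```

where c is the speed of light, f is the modulation frequency, Δφ is the measured phase difference, and d_max is the unambiguous range whose limitation motivates the multi-frequency phase deblurring step.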


Application Information

IPC (8): G06T5/00, G06N3/04, G06N3/08
CPC: G06N3/08, G06T2207/20016, G06T2207/20081, G06T2207/20084, G06T2207/10028, G06T2207/10024, G06T2207/10016, G06N3/045, G06T5/73, Y02T10/40
Inventors: 李东晓, 郑卓林, 张明, 唐啸天
Owner: ZHEJIANG UNIV