
Depth map recovery method

A depth map recovery method, applied in image communication, electrical components, stereo systems, and related areas. It addresses problems such as high computational complexity, slow convergence, and unsatisfactory recovery results in existing schemes, and achieves recovery of low-resolution, low signal-to-noise-ratio depth maps, a simplified acquisition procedure, and improved depth map quality.

Active Publication Date: 2016-06-08
Applicant: SHENZHEN INST OF FUTURE MEDIA TECH +1

AI Technical Summary

Problems solved by technology

Existing depth map restoration techniques include MRF-, IMLS-, edge-based, and JGF-based methods, but the recovery achieved by any single technique is not ideal.
[0004] Most current schemes for restoring the defective depth maps obtained from depth sensors are filter-based, but they suffer from high computational complexity and slow convergence.
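To make the filter-based family of schemes referred to above concrete, the following is a minimal sketch of a joint (guided) bilateral filter used to fill holes in a depth map with the help of an aligned color image. It illustrates the general technique only, not the patent's method; the window radius and sigma values are arbitrary assumptions.

```python
import numpy as np

def joint_bilateral_fill(depth, guide, radius=5, sigma_s=3.0, sigma_r=0.1):
    """Fill zero-valued holes in a depth map with a joint bilateral filter.

    depth : (H, W) float array; 0 marks missing samples.
    guide : (H, W) float array in [0, 1]; aligned intensity image.
    radius, sigma_s, sigma_r are illustrative values, not from the patent.
    """
    H, W = depth.shape
    out = depth.copy()
    # precompute the spatial Gaussian for the filter window
    dy, dx = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    w_spatial = np.exp(-(dy ** 2 + dx ** 2) / (2 * sigma_s ** 2))
    for y, x in zip(*np.where(depth == 0)):          # iterate over hole pixels
        y0, y1 = max(0, y - radius), min(H, y + radius + 1)
        x0, x1 = max(0, x - radius), min(W, x + radius + 1)
        d = depth[y0:y1, x0:x1]
        g = guide[y0:y1, x0:x1]
        ws = w_spatial[y0 - y + radius:y1 - y + radius,
                       x0 - x + radius:x1 - x + radius]
        # range weight from the guide image, restricted to valid depth samples
        wr = np.exp(-((g - guide[y, x]) ** 2) / (2 * sigma_r ** 2))
        w = ws * wr * (d > 0)
        if w.sum() > 1e-8:
            out[y, x] = (w * d).sum() / w.sum()
    return out
```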



Examples


Embodiment Construction

[0009] In this embodiment, a training set of depth maps is built and used to train the parameters of a convolutional neural network (CNN), so that the CNN can classify degraded depth maps. A kernel-decomposition method is used to initialize the hidden layers of the CNN, giving the network structure deconvolution-like characteristics; it therefore denoises and filters while it classifies, partially resolving the degradation of the depth map. An autoregressive (AR) model is then established, and its parameters are adjusted according to the main degradation models. The output layer of the CNN is connected to the input layer of the AR model, and the corresponding CNN output is fed into the AR model.
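As a rough illustration of the structure described in [0009], the sketch below pairs a few convolutional layers that act as a learned denoising/deconvolution filter with a small classification head for the degradation type. All layer counts, kernel sizes, and the number of degradation classes are assumptions for illustration; the patent's kernel-decomposition initialization is only indicated by a comment.

```python
import torch
import torch.nn as nn

class DepthCNN(nn.Module):
    """Illustrative CNN whose hidden layers denoise the degraded depth map
    while a small head predicts the degradation class (sizes are assumed)."""

    def __init__(self, num_degradations=4):
        super().__init__()
        # Hidden layers; in the described method these would be initialized
        # from a kernel decomposition so that they behave like a deconvolution.
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=9, padding=4), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv2d(32, 1, kernel_size=5, padding=2),   # denoised depth map
        )
        # Classification head for the dominant degradation type.
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(8), nn.Flatten(),
            nn.Linear(64, num_degradations),
        )

    def forward(self, x):
        denoised = self.features(x)           # filtered/denoised depth map
        logits = self.classifier(denoised)    # degradation class scores
        return denoised, logits

# The denoised map and the predicted class would then be handed to the
# autoregressive (AR) refinement stage described in the text.
model = DepthCNN()
denoised, logits = model(torch.rand(1, 1, 64, 64))
```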

[0010] A depth map restoration method based on a convolutional neural network and an autoregressive model proposed in this embodiment includes the following steps:

[0011] A1: The training set consists of a large numb...



Abstract

The invention discloses a depth map recovery method comprising the following steps. A1: construct a training set from the depth maps of a large number of different objects. A2: establish a convolutional neural network (CNN), obtain the parameters of its hidden layers by a kernel-separation method, build the convolutional network structure, and train the network and adjust its weights using the depth maps in the training set. A3: at the output layer of the CNN, establish an autoregressive (AR) model for each possible result and define an evaluation index. A4: input an original depth map acquired by a depth sensor into the CNN; after denoising and classification, recover it with the AR model, and if the result does not meet the requirements, feed the result map back into step A2 until a high-quality depth map is obtained or the loop terminates. With this depth map recovery method, the low-resolution, low signal-to-noise-ratio images acquired from a depth sensor can be recovered by the deep convolutional network. The method significantly improves the quality of the depth map while also simplifying its acquisition.
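The following is a minimal sketch of how the A2-A4 loop from the abstract could be wired together. The 3-tap AR coefficients, the blending factor, the quality threshold, and the iteration budget are all illustrative assumptions, and `cnn_denoise` / `quality` stand in for the trained CNN and the evaluation index, which are not specified here.

```python
import numpy as np

def ar_refine(depth, weights=(0.5, 0.3, 0.2)):
    """Tiny autoregressive (AR) refinement: re-predict each pixel from its
    causal neighbours (left, up, up-left) and blend with the observation.
    The coefficients are illustrative, not the patent's trained parameters."""
    out = depth.astype(float).copy()
    H, W = out.shape
    for y in range(1, H):
        for x in range(1, W):
            pred = (weights[0] * out[y, x - 1] +
                    weights[1] * out[y - 1, x] +
                    weights[2] * out[y - 1, x - 1])
            out[y, x] = 0.5 * out[y, x] + 0.5 * pred
    return out

def recover_depth(raw_depth, cnn_denoise, quality, target=35.0, max_iters=5):
    """Sketch of steps A2-A4: CNN denoising/classification, AR recovery,
    and a quality check that decides whether to loop again."""
    depth = raw_depth
    for _ in range(max_iters):
        depth = cnn_denoise(depth)       # A2/A4: denoise + classify with the CNN
        depth = ar_refine(depth)         # A3/A4: recover with the AR model
        if quality(depth) >= target:     # A4: stop once the evaluation index passes
            break
    return depth
```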

Description

Technical Field

[0001] The invention relates to the fields of computer vision and image processing, and in particular to a depth map restoration method.

Technical Background

[0002] The technology belongs to the fields of computer vision and image processing. A depth map encodes the depth information of the captured scene and plays a vital role in increasing realism and in 3D reconstruction and 3D TV applications. Depth maps are currently acquired in two ways, passively or actively; this technology is mainly aimed at depth maps actively obtained by depth sensors. However, the depth maps produced by mainstream active depth sensors (such as ToF cameras and Kinect) suffer from low resolution, low signal-to-noise ratio, and large holes, which hinders their use in applications.

[0003] In order to obtain high-...


Application Information

Patent Type & Authority: Application (China)
IPC (8): H04N 13/00; H04N 13/02
CPC: H04N 13/128; H04N 13/275
Inventors: 张永兵, 沈涛, 王兴政, 王好谦, 李莉华, 戴琼海
Owner: SHENZHEN INST OF FUTURE MEDIA TECH