
Multi-modal information-based light field depth estimation method

A depth estimation and multi-modal technology, applied in the field of neural learning methods, computing, and biological neural network models, which addresses the problems of low depth-map accuracy and the inability of simple fusion to capture cross-modal complementarity, and achieves the effects of broad applicability and compensation for lost detail and incomplete information.

Active Publication Date: 2021-05-07
DALIAN UNIV OF TECH

AI Technical Summary

Problems solved by technology

However, simple late fusion does not capture the complementarity between modalities well.
These shortcomings make the predicted depth map less accurate and leave considerable room for improvement in challenging scenes, which is a key issue for depth estimation based on focus stacks.
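To make this limitation concrete, the following is a minimal sketch, assuming PyTorch and illustrative layer sizes, of what such a "simple late fusion" baseline looks like: each modality is processed in isolation and only the final depth predictions are combined, so focus-stack and center-view features never interact. The module and parameter names are hypothetical and are not the patented architecture.

```python
# Illustrative sketch only: "simple late fusion" averages per-modality depth
# predictions, so focus-stack and center-view features never interact.
# All names and shapes are assumptions for illustration.
import torch
import torch.nn as nn

class LateFusionBaseline(nn.Module):
    def __init__(self, num_slices=12):
        super().__init__()
        # independent branch per modality, each ending in a 1-channel depth map
        self.stack_branch = nn.Sequential(
            nn.Conv2d(num_slices * 3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1))
        self.view_branch = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1))

    def forward(self, focus_stack, center_view):
        d_stack = self.stack_branch(focus_stack)   # depth from focus cues only
        d_view = self.view_branch(center_view)     # depth from appearance only
        # fusion happens only at the prediction level (late fusion)
        return 0.5 * (d_stack + d_view)
```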

Method used



Examples


Embodiment Construction

[0050] In order to enable those skilled in the art to better understand the solutions of the present invention, the technical solutions in the embodiments of the present invention are described below clearly and completely in conjunction with the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.

[0051] It should be noted that the terms "first" and "second" in the description, claims and drawings of the present invention are used to distinguish similar objects, and are not necessarily used to describe a specific order or sequence. It is to be understood that the data so used are interchangeable under appropriate ...



Abstract

The invention discloses a multi-modal information-based light field depth estimation method, which comprises the following steps: acquiring light field image data with a light field camera based on a micro-lens array to obtain a four-dimensional light field image array, extracting the centermost viewing-angle image as a central view, exporting a group of focus slices as a focus stack, and carrying out data augmentation; constructing a convolutional neural network, taking the focus stack and the corresponding central view as the input of the network model to obtain the input tensor of a focus stack stream and the input tensor of a central view stream; training the constructed convolutional neural network; and testing on the light field test set with the trained network and verifying on actual focus slices acquired by a mobile phone. The light field depth estimation method provided by the invention can make full use of the multi-modal information of the light field and achieves more accurate depth estimation on light field data sets; the obtained depth information is more complete and the edges are clearer; and practical application on a common consumer-grade mobile phone is enabled.
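As a rough sketch of the data flow the abstract describes (a focus-stack stream and a center-view stream feeding one convolutional network that predicts a depth map), the following PyTorch code builds a minimal two-stream model with feature-level fusion. The number of focus slices, the layer widths, and the concatenation-based fusion are assumptions for illustration, not the network claimed in the patent.

```python
# Minimal two-stream sketch of the pipeline described in the abstract.
# Assumptions (not from the patent): 12 focus slices stacked along the channel
# axis, a single concatenation-based fusion, and a plain convolutional decoder.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True))

class TwoStreamDepthNet(nn.Module):
    def __init__(self, num_slices=12):
        super().__init__()
        self.stack_encoder = conv_block(num_slices * 3, 64)  # focus-stack stream
        self.view_encoder = conv_block(3, 64)                # center-view stream
        # feature-level fusion followed by a small decoder to one depth channel
        self.decoder = nn.Sequential(conv_block(128, 64), nn.Conv2d(64, 1, 1))

    def forward(self, focus_stack, center_view):
        f_stack = self.stack_encoder(focus_stack)    # (B, 64, H, W)
        f_view = self.view_encoder(center_view)      # (B, 64, H, W)
        fused = torch.cat([f_stack, f_view], dim=1)  # cross-modal feature fusion
        return self.decoder(fused)                   # predicted depth map

# Example input shapes: 12 focus slices and one center view at 256x256.
stack = torch.randn(1, 12 * 3, 256, 256)
view = torch.randn(1, 3, 256, 256)
depth = TwoStreamDepthNet()(stack, view)             # -> (1, 1, 256, 256)
```

Feature-level fusion of this kind is one way to let cross-modal cues interact earlier than in the late-fusion baseline criticized above.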

Description

Technical Field

[0001] The present invention relates to the technical field of light field depth estimation, and in particular to a method for estimating light field depth based on multi-modal information.

Background

[0002] Depth estimation is a key problem in 3D reconstruction; its purpose is to obtain the distance between the target object and the photographer. Depth information of a scene helps people better understand its geometric structure and provides data support for other visual tasks, with important applications in scene restoration, action recognition and saliency detection. Depth estimation has therefore become a hot research topic in computer vision.

[0003] Common depth estimation methods usually extract scene depth information from single or multiple 2D images captured by conventional cameras. However, since the imaging process of traditional cameras only consider...
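For context on how the focus slices mentioned in the abstract can be produced from a four-dimensional light field, the sketch below implements standard shift-and-sum synthetic refocusing over sub-aperture views. This is a generic technique, not necessarily the export procedure used in the patent; the array layout, slope range, and function names are assumptions.

```python
# Standard shift-and-sum refocusing sketch (not the patent's export procedure):
# each sub-aperture view of the 4D light field L[u, v, y, x, c] is shifted in
# proportion to its angular offset and the views are averaged, giving one
# focus slice per slope value. Layout and names are illustrative assumptions.
import numpy as np

def refocus(light_field, slope):
    """light_field: (U, V, H, W, 3) sub-aperture array; slope: focus parameter."""
    U, V, H, W, C = light_field.shape
    uc, vc = (U - 1) / 2.0, (V - 1) / 2.0
    out = np.zeros((H, W, C), dtype=np.float64)
    for u in range(U):
        for v in range(V):
            dy, dx = slope * (u - uc), slope * (v - vc)
            # integer shifts for brevity; sub-pixel interpolation is usual in practice
            view = np.roll(light_field[u, v],
                           (int(round(dy)), int(round(dx))), axis=(0, 1))
            out += view
    return out / (U * V)

# A focus stack is then a set of slices over a range of slopes, e.g.:
# stack = [refocus(lf, s) for s in np.linspace(-1.5, 1.5, 12)]
```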

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06T7/50G06N3/04G06N3/08
CPCG06T7/50G06N3/084G06T2207/10052G06T2207/20081G06T2207/20084G06T2207/20221G06T2207/20192G06N3/044
Inventor: 朴永日, 张淼, 吉新新, 张玉坤
Owner: DALIAN UNIV OF TECH