
Full convolution neural network (FCN)-based monocular image depth estimation method

A convolutional neural network and image-depth technology, applied in the field of monocular image depth estimation, which addresses the low resolution and consequent insufficient precision of the resulting depth image, reduces the number of parameters, optimizes the network structure, and improves the otherwise low resolution of the output image.

Active Publication Date: 2018-01-12
NANJING UNIV OF POSTS & TELECOMM
Cites: 4 · Cited by: 106

AI Technical Summary

Problems solved by technology

[0010] The technical problem to be solved by the present invention is to overcome the deficiencies of the prior art by providing a monocular image depth estimation method based on a fully convolutional neural network (FCN), addressing the problem that the relatively low resolution of the resulting image in existing monocular depth estimation methods leads to insufficient precision.



Examples


Detailed Description of the Embodiments

[0025] Embodiments of the present invention will be described below in conjunction with the accompanying drawings.

[0026] As shown in Figure 1, the present invention designs a monocular image depth estimation method based on a fully convolutional neural network (FCN) and trains an end-to-end prediction system on that network, eliminating the post-processing steps required by traditional methods and making the approach simpler and more practical. The method comprises the following steps:

[0027] Step 1. Obtain training image data.

[0028] Since the network is relatively deep and a large number of parameters must be trained, the amount of training data to be prepared has to meet a certain order of magnitude. For indoor scenes, the NYU Depth V2 dataset is used as the basis for generating the final training data: the original image data of 249 scenes among the 464 indoor sce...
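To make this step concrete, here is a minimal data-loading sketch, assuming the NYU Depth V2 frames have already been exported as aligned RGB and depth image files; the rgb/ and depth/ directory layout, the 240×320 target size, and the class name NYUDepthPairs are hypothetical, not part of the patent.

```python
import os
import numpy as np
import torch
from PIL import Image
from torch.utils.data import Dataset

class NYUDepthPairs(Dataset):
    """Hypothetical loader for aligned RGB/depth training pairs exported
    from NYU Depth V2 (the patent's exact preprocessing is not shown)."""

    def __init__(self, root, size=(240, 320)):  # (height, width), illustrative
        self.rgb_dir = os.path.join(root, "rgb")
        self.depth_dir = os.path.join(root, "depth")
        self.names = sorted(os.listdir(self.rgb_dir))  # same file names in both dirs
        self.size = size

    def __len__(self):
        return len(self.names)

    def __getitem__(self, i):
        rgb = Image.open(os.path.join(self.rgb_dir, self.names[i])).convert("RGB")
        depth = Image.open(os.path.join(self.depth_dir, self.names[i]))
        # PIL resize takes (width, height); nearest keeps depth values unblended.
        rgb = rgb.resize(self.size[::-1], Image.BILINEAR)
        depth = depth.resize(self.size[::-1], Image.NEAREST)
        x = torch.from_numpy(np.asarray(rgb, dtype=np.float32) / 255.0).permute(2, 0, 1)
        y = torch.from_numpy(np.asarray(depth, dtype=np.float32)).unsqueeze(0)
        return x, y  # (3, H, W) image tensor, (1, H, W) depth tensor
```

A standard torch.utils.data.DataLoader can then batch these pairs for training.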



Abstract

The invention discloses a fully convolutional neural network (FCN)-based monocular image depth estimation method. The method comprises the steps of: acquiring training image data; inputting the training image data into a fully convolutional neural network and passing it through the pooling layers in sequence to obtain feature maps; enlarging the feature map output by the last pooling layer to obtain a new feature map with the same dimensions as the feature map output by the previous pooling layer, and fusing the two feature maps; repeating this fusion with the output feature map of each pooling layer from back to front to obtain the final predicted depth image; training the parameters of the network with stochastic gradient descent (SGD); and, at prediction time, acquiring the RGB image requiring depth estimation and inputting it into the trained network to obtain the corresponding predicted depth image. The method mitigates the loss of output-image resolution incurred during the convolution process. By adopting a fully convolutional form, the fully connected layer is removed, which effectively reduces the number of parameters in the network.
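To make the enlarge-and-fuse scheme in the abstract concrete, here is a minimal PyTorch sketch under assumptions the abstract does not fix: a small three-stage encoder, additive fusion through 1×1 convolutions, bilinear upsampling, and an MSE regression loss. Only the back-to-front fusion pattern, the absence of fully connected layers, and the SGD optimizer come from the text; everything else is illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FCNDepth(nn.Module):
    """Minimal FCN depth estimator: pooling stages emit feature maps, and
    the decoder enlarges the deepest map to the previous stage's size and
    fuses the two, repeating from back to front."""

    def __init__(self):
        super().__init__()
        # Hypothetical three-stage encoder; the patent's backbone is not shown here.
        self.stage1 = nn.Sequential(nn.Conv2d(3, 64, 3, padding=1), nn.ReLU())
        self.stage2 = nn.Sequential(nn.Conv2d(64, 128, 3, padding=1), nn.ReLU())
        self.stage3 = nn.Sequential(nn.Conv2d(128, 256, 3, padding=1), nn.ReLU())
        self.pool = nn.MaxPool2d(2)
        # 1x1 convolutions align channel counts before fusion.
        self.lat2 = nn.Conv2d(128, 256, 1)
        self.lat1 = nn.Conv2d(64, 256, 1)
        # Fully convolutional head: no fully connected layer anywhere.
        self.head = nn.Conv2d(256, 1, 3, padding=1)

    def forward(self, x):
        f1 = self.pool(self.stage1(x))   # 1/2 resolution
        f2 = self.pool(self.stage2(f1))  # 1/4 resolution
        f3 = self.pool(self.stage3(f2))  # 1/8 resolution
        # Enlarge the last pooling output to the previous layer's size,
        # fuse the two maps, and repeat from back to front.
        up3 = F.interpolate(f3, size=f2.shape[2:], mode="bilinear", align_corners=False)
        m2 = up3 + self.lat2(f2)
        up2 = F.interpolate(m2, size=f1.shape[2:], mode="bilinear", align_corners=False)
        m1 = up2 + self.lat1(f1)
        return self.head(m1)             # predicted depth map

model = FCNDepth()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
loss_fn = nn.MSELoss()  # illustrative loss; the patent's loss is not reproduced here

# Dummy batch standing in for the NYU Depth V2 loader sketched earlier.
rgb = torch.randn(4, 3, 240, 320)
depth = torch.rand(4, 1, 240, 320) * 10.0  # hypothetical depths in metres

pred = model(rgb)
# Bring the ground truth to the prediction's resolution before the loss.
target = F.interpolate(depth, size=pred.shape[2:], mode="nearest")
loss = loss_fn(pred, target)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Additive fusion with bilinear upsampling is just one plausible reading of the amplification-and-fusion step; channel concatenation followed by a convolution, or a learned deconvolution, would fit the same back-to-front scheme.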

Description

Technical Field

[0001] The invention relates to a monocular image depth estimation method based on a fully convolutional neural network (FCN), and belongs to the technical field of three-dimensional image reconstruction in computer vision.

Background

[0002] Recovering 3D depth information from 2D images is an important problem in computer vision and an essential part of understanding scene geometry. Image depth information has important applications in robotics, scene understanding, 3D reconstruction, and related areas; acquiring it aims to recover the spatial position relationships between the objects in an image. Currently, there are two main ways to obtain image depth information. One is to acquire depth directly through hardware devices such as the Kinect. The other, more widely used approach is to estimate depth from one or more RGB images of the same scene, including multi-view, binocular, and single-vie...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06T7/55G06K9/62G06N3/04
Inventor 朱沛贤霍智勇
Owner NANJING UNIV OF POSTS & TELECOMM