Single-image-based global depth estimation method

A depth estimation technique based on a single image, applied in the field of depth estimation. It addresses the problems of heavy computation, a complex calculation process, and discontinuous depth information, and achieves a small amount of calculation, a simple calculation process, and an accurate global depth map.

Active Publication Date: 2012-06-20
SHENZHEN GRADUATE SCHOOL TSINGHUA UNIV


Problems solved by technology

The depth map obtained by this method assigns depth values in units of pixel macroblocks and uses only a subset of the pixels in the image to estimate the global depth information. Because it does not make full use of the information at every pixel, the depth information in the resulting map is discontinuous, so it is not a complete global depth map in the true sense.
At the same time, the calculation process of the algorithm involved in this method is relatively complicated, and the amount of computation is large.

Method used



Examples


Specific Embodiment 1

[0012] As shown in Figure 1, which is a flow chart of the global depth estimation method in this embodiment, the method includes the following steps:

[0013] U1) Perform Gaussian blur processing on the single original image to be processed to obtain a blurred image. As shown in Figure 1, in this embodiment the Gaussian blur is performed directly along one direction; this direction can be set arbitrarily by the user.
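The patent does not publish source code; the single-direction Gaussian blur of step U1 can be sketched in Python with NumPy as follows. The function names, the kernel radius of 3σ, and the default sigma are illustrative assumptions, not values fixed by the patent:

```python
import numpy as np

def gaussian_kernel_1d(sigma, radius):
    """1-D Gaussian kernel, normalized to sum to 1."""
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2.0 * sigma**2))
    return k / k.sum()

def blur_along_axis(img, sigma=1.5, axis=1):
    """Blur a 2-D image along a single axis (axis=1 is horizontal).

    Edge values are replicated before convolving so the output
    keeps the input shape.
    """
    k = gaussian_kernel_1d(sigma, radius=int(3 * sigma))
    pad = len(k) // 2
    return np.apply_along_axis(
        lambda line: np.convolve(np.pad(line, pad, mode="edge"), k, mode="valid"),
        axis, img.astype(np.float64))
```

Because the kernel is normalized, a region of constant intensity passes through unchanged; only edges and texture are smoothed, which is what makes the later variance ratio informative about focus.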

[0014] U2) Convert the original image and the blurred image obtained after the directional blurring from the RGB three-channel space to the single-channel gray-value space, respectively.

[0015] In this specific embodiment, the conversion formula shown in formula 5 is used for grayscale conversion, and formula 5 is:

[0016] G(x,y)=0.3×R(x,y)+0.59×G’(x,y)+0.11×B(x,y);

[0017] where G(x, y) is the converted gray value of the pixel at coordinates (x, y) on the image, and R(x, y), G’(x, y), and B(x, y) respectively repr...
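A minimal sketch of this grayscale conversion, assuming the standard luma weights (0.3 for R, 0.59 for G, 0.11 for B); the function name is hypothetical:

```python
import numpy as np

def rgb_to_gray(rgb):
    """Convert an H x W x 3 RGB array to single-channel gray values
    using luma weights 0.3*R + 0.59*G + 0.11*B (weights sum to 1)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.3 * r + 0.59 * g + 0.11 * b
```

Since the three weights sum to exactly 1, a pure white pixel (255, 255, 255) maps to a gray value of 255, and the gray image stays in the same value range as the input channels.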

Specific Embodiment 2

[0039] This embodiment differs from Embodiment 1 in that the Gaussian blur processing is performed along both the horizontal x direction and the vertical y direction. When the variance ratio is calculated, the x-direction variance ratio Rx and the y-direction variance ratio Ry are weighted to obtain the corresponding variance ratio R at each pixel of the original image.
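The weighted combination described above can be sketched as below. The patent excerpt does not state the weights, so the equal default weights here are an assumption for illustration:

```python
import numpy as np

def combine_variance_ratios(Rx, Ry, wx=0.5, wy=0.5):
    """Per-pixel weighted combination of the x- and y-direction
    variance ratios into a single ratio map R.

    wx and wy are illustrative weights (not fixed by the excerpt);
    they would normally be chosen to sum to 1.
    """
    return wx * np.asarray(Rx, dtype=np.float64) + wy * np.asarray(Ry, dtype=np.float64)
```

Using both directions makes the ratio sensitive to edges of any orientation, whereas a single blur direction (Embodiment 1) responds mainly to edges perpendicular to that direction.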

[0040] As shown in Figure 2, which is a flow chart of the global depth estimation method in this embodiment, the method includes the following steps:

[0041] W1) Perform Gaussian blur processing on the single original image to be processed to obtain blurred images. As shown in Figure 2, in this embodiment Gaussian blur is performed along the horizontal x direction and the vertical y direction, yielding two blurred images, which ...



Abstract

The invention discloses a single-image-based global depth estimation method, which comprises the following steps: 1) performing Gaussian blurring on a single original image to be processed to obtain a blurred image; 2) converting the original image and the blurred image from the red, green and blue (RGB) three-channel space into a single-channel gray-value space, respectively; 3) acquiring a gray variance value for each pixel in the original image and the blurred image according to the gray values; 4) calculating a variance ratio corresponding to each pixel in the original image according to the gray variance values; 5) performing a normalization operation on the depth information value corresponding to each pixel in the original image, and performing conversion to obtain an initial global depth map of the original image; and 6) bilaterally filtering the initial global depth map obtained in step 5) to obtain a final global depth map of the original image. By the single-image-based global depth estimation method, information in a global range can be effectively obtained, so that the accuracy of the obtained final global depth map can be ensured.
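Steps 3) to 5) of the abstract can be sketched as follows. The window size, the epsilon guard against division by zero, and the min-max normalization are illustrative assumptions; the excerpt does not specify these details, and step 6) would then apply a bilateral filter to the returned map:

```python
import numpy as np

def local_variance(gray, win=3):
    """Gray-value variance in a win x win neighborhood of each pixel
    (edges handled by reflective padding), per step 3)."""
    p = win // 2
    padded = np.pad(gray.astype(np.float64), p, mode="reflect")
    h, w = gray.shape
    # stack every window offset, then take the variance across offsets
    stack = np.stack([padded[dy:dy + h, dx:dx + w]
                      for dy in range(win) for dx in range(win)])
    return stack.var(axis=0)

def depth_from_variance_ratio(orig_gray, blur_gray, win=3, eps=1e-8):
    """Per-pixel variance ratio (step 4) and min-max normalization
    into an initial depth map in [0, 1] (step 5)."""
    v_orig = local_variance(orig_gray, win)
    v_blur = local_variance(blur_gray, win)
    ratio = v_orig / (v_blur + eps)   # step 4: variance ratio
    d = ratio - ratio.min()           # step 5: normalize to [0, 1]
    rng = d.max()
    return d / rng if rng > 0 else d
```

The intuition is that blurring changes sharp (in-focus, near) regions much more than already-smooth (defocused, far) regions, so the per-pixel variance ratio carries depth information at every pixel rather than only at a sampled subset.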

Description

Technical field

[0001] The invention relates to a depth estimation method in the field of computer vision, in particular to a global depth estimation method based on a single image.

Background technique

[0002] A depth estimation method estimates the depth information of each pixel in the image to be processed and obtains the global depth map of that image, which plays an important role in application fields of computer vision and computer graphics. Some depth estimation methods depend on processing multiple defocused images: a set of images of the same scene is taken under different camera parameters, and various re-blurring processes are then applied to measure the blur parameters. However, these methods have practical problems, such as occlusion and the requirement that the scene in the picture be still, which greatly limit their range of application in actual scenes.

[0003] There are also some methods to extract depth maps based on a...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/00
Inventors: 王好谦, 吴畏, 徐秀兵, 戴琼海
Owner: SHENZHEN GRADUATE SCHOOL TSINGHUA UNIV