
Two-dimensional single-view image depth estimation method based on DCT coefficient entropy

A technology relating to image depth and coefficient entropy, applied in image communication, image analysis, and image data processing. It addresses the problem of low accuracy and achieves the effect of high accuracy.

Inactive Publication Date: 2014-02-05
HARBIN UNIV OF COMMERCE

AI Technical Summary

Problems solved by technology

[0008] The present invention aims to solve the problem of low accuracy in existing two-dimensional single-view image depth estimation methods by providing a two-dimensional single-view image depth estimation method based on DCT coefficient entropy.



Examples


Specific Embodiment 1

[0024] In Specific Embodiment 1, the two-dimensional single-view image depth estimation method based on DCT coefficient entropy is realized by the following steps:

[0025] Step 1. For each pixel (i, j) in the image to be processed, select an N×N window centered on the pixel as a sub-image, where N is a positive integer and i and j are both positive integers; then apply a type-II DCT transform to the sub-image.

[0026] Step 2. Set a quantization step size, quantize the DCT coefficients of the sub-image, compute the entropy of the quantized coefficients, and use this entropy as a measure of the blur degree of pixel (i, j).
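The blur measure of Steps 1 and 2 can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's reference implementation: the orthonormal DCT scaling, the default quantization step of 8, and the function names are ours.

```python
import numpy as np
from collections import Counter

def dct2(block):
    # Orthonormal 2-D type-II DCT built from the 1-D DCT-II basis matrix:
    # C[k, n] = s_k * cos(pi * k * (n + 0.5) / N), with rows orthonormal.
    N = block.shape[0]
    k = np.arange(N)
    C = np.cos(np.pi * k[:, None] * (k[None, :] + 0.5) / N)
    C *= np.sqrt(2.0 / N)
    C[0] /= np.sqrt(2.0)
    return C @ block @ C.T

def coefficient_entropy(block, q_step=8.0):
    # Quantize the DCT coefficients, then compute the Shannon entropy (bits)
    # of their empirical distribution; low entropy indicates a blurred patch.
    coeffs = np.round(dct2(np.asarray(block, dtype=float)) / q_step)
    counts = np.array(list(Counter(coeffs.ravel().tolist()).values()))
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())
```

A perfectly flat patch quantizes to a single repeated coefficient value and so has entropy 0, while a sharp, textured patch spreads energy over many coefficient values and yields a higher entropy, matching the use of entropy as a blur measure.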

[0027] Step 3. Traverse every pixel in the image using the method of Steps 1 and 2 to obtain the DCT coefficient entropy corresponding to each pixel, then linearly map the entropy values to the 8-bit depth value range to obtain a pixel-level depth map, thereby completing the two-dimensional single-view image depth estimation based on DCT coefficient entropy.
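The linear mapping of Step 3 can be sketched as follows. The helper name `entropy_to_depth` is ours, and mapping higher entropy (a sharper patch) to a larger depth value is an assumption; the patent only specifies that the mapping is linear onto the 8-bit range.

```python
import numpy as np

def entropy_to_depth(entropy_map):
    # Linearly rescale a per-pixel entropy map onto the 8-bit range [0, 255].
    e = np.asarray(entropy_map, dtype=float)
    e_min, e_max = e.min(), e.max()
    if e_max == e_min:
        # A constant entropy map carries no depth cue; return a flat depth map.
        return np.zeros(e.shape, dtype=np.uint8)
    return np.round(255.0 * (e - e_min) / (e_max - e_min)).astype(np.uint8)
```

For example, an entropy map with values {0, 1, 2, 4} maps its minimum to depth 0 and its maximum to depth 255, with intermediate values placed proportionally in between.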



Abstract

The invention discloses a two-dimensional single-view image depth estimation method based on DCT coefficient entropy, and relates to two-dimensional single-view image depth estimation. The method solves the problem that existing two-dimensional single-view image depth estimation methods have low accuracy. It comprises the following steps. Step 1: for each pixel (i, j) in the image to be processed, an N×N window centered on the pixel is selected as a sub-image, where N is a positive integer and i and j are both positive integers, and a DCT transform is applied to the sub-image. Step 2: a quantization step size is set, the DCT coefficients of the sub-image are quantized, the coefficient entropy of the sub-image is computed, and this entropy is used as a measure of the blur degree of pixel (i, j). Step 3: every pixel in the image is traversed using Steps 1 and 2 to obtain the DCT coefficient entropy corresponding to each pixel, and the entropy values are linearly mapped to the 8-bit depth range to obtain a pixel-level depth map, thereby completing two-dimensional single-view image depth estimation based on DCT coefficient entropy. The method is suitable for two-dimensional single-view image depth estimation.

Description

Technical field

[0001] The invention relates to a method for estimating the depth of a two-dimensional single-view image.

Background technique

[0002] Three-dimensional display is an important future form of image information. Compared with two-dimensional images, three-dimensional images have clear layers, vivid colors, longer dwell time, and deeper impression. They carry far more information than two-dimensional images, have strong visual impact and high artistic value, and give the audience a stronger visual experience.

[0003] With the emergence of 3D displays, consumers face a serious problem: the scarcity of 3D media resources. Since 3D display has only just entered the stage of popularization, the 3D media currently available are obtained with special stereo shooting equipment or carefully produced by 3D studios. Therefore, t...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/00; H04N13/00
Inventor: 孙华东金雪松赵志杰潘庆和牛连丁陈铭张立志范智鹏
Owner: HARBIN UNIV OF COMMERCE