Depth map treatment method and device

A depth map processing method and device, applied in the field of depth map processing, which addresses the problems of inaccurate edges, unstable depth values, and the inability of depth maps to realistically reflect the distances between objects in a scene, achieving the effect of precise object outlines.

Inactive Publication Date: 2010-01-20
TSINGHUA UNIV
Cites: 0 | Cited by: 23


Problems solved by technology

[0005] Depth maps obtained through existing technologies usually have many defects, such as unstable depth values and inaccurate edges, so that the depth map cannot realistically reflect the distances between objects in the scene...



Examples


Embodiment 1

[0053] Referring to Figure 1, Embodiment 1 of the present invention provides a method for processing a depth map, the method comprising:

[0054] 101: Extract the object contour in the original depth map;

[0055] Specifically, the extracted object contour can be expressed as a sequence of contour points: discrete point sampling is performed on the extracted contour to obtain N contour points v_i, i = 1, 2, ..., N. The intervals between these contour points are not uniform; for example, for contour segments in the horizontal, vertical, or diagonal directions, only the contour points at both ends are retained.
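As a rough illustration of this sampling rule (a sketch, not the patent's actual implementation; the function name and point representation are assumptions), the following drops the interior points of straight horizontal, vertical, or diagonal runs from an ordered contour chain, keeping only each run's endpoints, so the surviving points v_i are non-uniformly spaced:

```python
def decimate_contour(points):
    """Keep only the endpoints of straight (horizontal, vertical, or
    diagonal) runs in an ordered 8-connected contour chain.

    `points` is a list of (x, y) tuples tracing the contour in order.
    An interior point whose incoming and outgoing step directions are
    identical lies on a straight run and is dropped.
    """
    if len(points) < 3:
        return list(points)

    def step(a, b):
        # Unit step direction from a to b, e.g. (1, 0), (1, 1), (0, -1).
        dx, dy = b[0] - a[0], b[1] - a[1]
        return ((dx > 0) - (dx < 0), (dy > 0) - (dy < 0))

    kept = [points[0]]
    for prev, cur, nxt in zip(points, points[1:], points[2:]):
        if step(prev, cur) != step(cur, nxt):
            kept.append(cur)  # direction changes here: keep the corner
    kept.append(points[-1])
    return kept


# A horizontal run followed by a vertical run collapses to three points.
print(decimate_contour([(0, 0), (1, 0), (2, 0), (3, 0), (3, 1), (3, 2)]))
# → [(0, 0), (3, 0), (3, 2)]
```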

[0056] Referring to Figure 2: Figure 2(a) is the original image of one frame in the planar video sequence, and Figure 2(b) is the original depth map corresponding to Figure 2(a). Contour 1 in Figure 3 is the contour of the person extracted directly from the original depth map. Obviously, the directly extracted contour 1 is the...

Embodiment 2

[0120] Referring to Figure 7, Embodiment 2 of the present invention provides a depth map processing device, the device comprising: an extraction module 201, a correction module 202, a filling module 203, and a first filtering module 204;

[0121] The extraction module 201 is used to extract the object outline in the original depth map;

[0122] Specifically, the extraction module 201 may include: an extraction unit and a sampling unit;

[0123] The extraction unit is used to extract the object outline in the original depth map;

[0124] The sampling unit is used to perform discrete point sampling on the object contour extracted by the extraction unit to obtain N contour points v_i, i = 1, 2, ..., N. The intervals between these contour points are not uniform; for contour segments in the horizontal, vertical, or diagonal directions, only the contour points at both ends are kept.

[0125] Referring to Figure 2: Figure 2(a) is the original image of one frame in the planar video sequence, ...



Abstract

The embodiment of the invention discloses a depth map processing method and device, belonging to the field of computer multimedia technology. The method comprises the steps of: extracting the contour of an object in a depth map; correcting the object contour; filling the inner and outer regions of the corrected contour with the original depth values of those regions; and performing Gaussian filtering on the filled depth map. The device comprises an extraction module, a correction module, a filling module, and a first filtering module. By correcting the contour of the depth map and filtering the depth map, the technical scheme provided by the embodiment of the invention makes the object contour in the depth map precise and makes the depth values transition smoothly at the object contour, thereby greatly reducing jitter in the synthesized three-dimensional video.
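The final stage of the pipeline above, Gaussian filtering of the filled depth map, can be sketched with a separable kernel as follows. This is a minimal NumPy illustration of that step, not the patent's implementation; the sigma and radius choices are assumptions:

```python
import numpy as np


def gaussian_kernel1d(sigma, radius):
    """Normalized 1-D Gaussian kernel of length 2*radius + 1."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()


def gaussian_smooth(depth, sigma=1.5):
    """Separable Gaussian filtering of a depth map: filter rows, then
    columns. Edge padding keeps the output the same size as the input."""
    radius = int(3 * sigma)
    k = gaussian_kernel1d(sigma, radius)
    pad = np.pad(depth.astype(float), radius, mode="edge")
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="valid"), 1, pad)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="valid"), 0, rows)
```

Because the kernel is normalized and the borders are edge-padded, a constant depth region passes through unchanged; only the transitions at object contours are smoothed.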

Description

Technical Field

[0001] The invention relates to the technical field of computer multimedia, and in particular to a depth map processing method and device.

Background Technique

[0002] Stereoscopic video is a new type of video technology that provides a three-dimensional sense. It enables users to see, through video, scenes that are almost identical to the real world, producing a strong sense of reality and presence, and it has therefore become the development direction of future multimedia technology. Currently, the commonly used stereoscopic video architecture is as follows: two video streams are transmitted simultaneously, one of which is the planar video sequence to be converted, and the other is the corresponding depth map sequence, which contains the depth information of each pixel in each frame. Through DIBR (Depth-Image-Based Rendering) technology, it is possible to obtain virtual views of the real-world scene from one or more directions, and ...
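As background, DIBR synthesizes a virtual view by shifting each pixel of the planar frame horizontally by a disparity derived from its depth value. A deliberately naive sketch (an assumption-laden illustration, not the patent's method: it assumes an 8-bit depth map where larger values mean nearer, a fixed maximum disparity, and it leaves occlusion holes at zero rather than filling them as real DIBR does):

```python
import numpy as np


def render_virtual_view(image, depth, max_disparity=8):
    """Naive DIBR: shift each pixel horizontally by a disparity
    proportional to its normalized depth (nearer pixels shift more).
    Pixels shifted out of frame are dropped; unfilled targets stay 0."""
    h, w = depth.shape
    out = np.zeros_like(image)
    disp = (depth.astype(float) / 255.0 * max_disparity).round().astype(int)
    for y in range(h):
        for x in range(w):
            nx = x + disp[y, x]
            if 0 <= nx < w:
                out[y, nx] = image[y, x]
    return out
```

Unstable depth values or inaccurate contours in the depth map translate directly into per-frame disparity jumps in such warping, which is the jitter the patent's contour correction and filtering aim to reduce.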

Claims


Application Information

IPC(8): G06T7/00; H04N13/00
Inventors: 戴琼海, 晏希, 曹汛, 季向阳
Owner TSINGHUA UNIV