
Depth map encoding method and apparatus thereof, and depth map decoding method and apparatus thereof

A depth map encoding and decoding technology, applied in the field of encoding and decoding of video data, that solves the problem of a depth map frame decoding apparatus unnecessarily performing an operation of obtaining differential information, and achieves the effect of efficiently decoding a depth map image.

Active Publication Date: 2016-03-10
SAMSUNG ELECTRONICS CO LTD
Cites: 1 · Cited by: 11

AI Technical Summary

Benefits of technology

This patent focuses on methods and devices for creating and viewing 3D videos. In particular, it describes methods and devices for efficiently encoding depth map images into digital video to achieve efficient 3D video production.

Problems solved by technology

Without this technology, a depth map frame decoding apparatus may unnecessarily perform an operation of obtaining differential information from a bitstream even though the differential information is unused.
With the disclosed technology, the depth map frame decoding apparatus does not perform the unnecessary operation of obtaining the differential information.




Embodiment Construction

[0048] An optimally encoded prediction mode is selected based on cost. In this regard, differential information may include information indicating a difference between a representative value of a partition corresponding to an original depth map and a representative value of a partition predicted from neighboring blocks of a current prediction unit. The differential information may include a delta constant partition value (CPV), also written delta DC. The delta CPV (delta DC) is the difference between the DC value of the original depth map partition and the DC value of the predicted partition. For example, the DC value of the original depth map partition may be an average of depth values of blocks included in the partition. The DC value of the predicted partition may be an average of depth values of neighboring blocks of the partition or an average of depth values of partial neighboring blocks of the partition.
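As a minimal sketch of the computation described above (function names and sample values are illustrative, not taken from the patent), the DC value of a partition is the average of its depth samples, and the delta DC is the difference between the original and predicted DC values:

```python
def dc_value(depth_values):
    """DC value of a partition: the average of its depth samples."""
    return sum(depth_values) / len(depth_values)

def delta_dc(original_partition, predicted_partition):
    """Delta CPV (delta DC): difference between the DC value of the
    original depth map partition and that of the predicted partition."""
    return dc_value(original_partition) - dc_value(predicted_partition)

# Hypothetical samples: an original partition vs. one predicted from neighbors.
original = [120, 122, 118, 124]   # DC = 121.0
predicted = [110, 112, 114, 116]  # DC = 113.0
print(delta_dc(original, predicted))  # 8.0
```

An encoder would transmit this delta DC so the decoder can correct the predicted partition's representative value toward the original.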

[0049]A depth map frame encoder 16 intra prediction decodes the current prediction unit by usi...


PUM

No PUM

Abstract

Disclosed is a depth map frame decoding method including:
  • reconstructing a color frame obtained from a bitstream based on encoding information of the color frame;
  • splitting a largest coding unit of a depth map frame obtained from the bitstream into one or more coding units based on split information of the depth map frame;
  • splitting the one or more coding units into one or more prediction units for prediction decoding;
  • determining whether to split a current prediction unit into at least one partition and decode it, by obtaining information indicating whether to split the current prediction unit into the at least one partition from the bitstream;
  • if the current prediction unit is to be decoded by being split into the at least one partition, obtaining prediction information of the one or more prediction units from the bitstream and determining whether to decode the current prediction unit by using differential information indicating a difference between a depth value of the at least one partition corresponding to an original depth map frame and a depth value of the at least one partition predicted from neighboring blocks of the current prediction unit; and
  • decoding the current prediction unit by using the differential information, based on whether the current prediction unit is split into the at least one partition and whether the differential information is used.
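The flag-gated parsing at the heart of the abstract can be sketched as follows (all names, including the mock bitstream, are hypothetical and not from the patent or any standard): the decoder reads the differential information from the bitstream only when both the partition flag and the differential-information flag signal that it is needed, which is how the unnecessary read described in "Problems solved by technology" is avoided.

```python
def decode_prediction_unit(bitstream):
    """Sketch: parse flags before reading delta DC, so the decoder never
    performs an unnecessary read of unused differential information."""
    split_into_partitions = bitstream.read_flag()
    if not split_into_partitions:
        return {"partitioned": False, "delta_dc": None}

    use_differential_info = bitstream.read_flag()
    # The delta DC value is read from the bitstream only when signaled.
    delta_dc = bitstream.read_value() if use_differential_info else None
    return {"partitioned": True, "delta_dc": delta_dc}

class MockBitstream:
    """Tiny stand-in for a real entropy-coded bitstream reader."""
    def __init__(self, symbols):
        self.symbols = list(symbols)
    def read_flag(self):
        return bool(self.symbols.pop(0))
    def read_value(self):
        return self.symbols.pop(0)

# Partitioned, differential information present: delta DC is read.
print(decode_prediction_unit(MockBitstream([1, 1, 8])))
# Partitioned, but differential information unused: no extra read occurs.
print(decode_prediction_unit(MockBitstream([1, 0])))
```

A real decoder would of course use entropy decoding rather than whole-symbol reads, but the control flow (flag first, payload only if flagged) is the point being illustrated.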

Description

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application is a U.S. national stage application under 35 U.S.C. §371 of International Application No. PCT/KR2014/003010, filed on Apr. 7, 2014, in the Korean Intellectual Property Office, which claims priority from U.S. Provisional Application No. 61/808,876, filed on Apr. 5, 2013, in the United States Patent and Trademark Office, the disclosures of which are incorporated herein by reference in their entireties.

1. FIELD

[0002] Methods and apparatuses consistent with exemplary embodiments relate to encoding and decoding of video data including a depth image.

2. DESCRIPTION OF RELATED ART

[0003] As hardware for reproducing and storing high resolution or high quality video content has been developed, there is an increasing need for a video codec capable of effectively encoding or decoding the high resolution or high quality video content. According to a conventional video codec, video is encoded according to a limited encoding method based ...

Claims


Application Information

Patent Type & Authority: Application (United States)
IPC(8): H04N19/593, H04N19/172, H04N19/176, H04N19/186, H04N19/44, H04N19/597
CPC: H04N19/593, H04N19/597, H04N19/172, H04N19/176, H04N19/186, H04N19/44, H04N19/119, H04N19/70
Inventors: LEE, JIN-YOUNG; PARK, MIN-WOO; CHOI, BYEONG-DOO; WEY, HO-CHEON; YOON, JAE-WON; CHO, YONG-JIN
Owner SAMSUNG ELECTRONICS CO LTD