
Multi-description video coding method based on transformation and data fusion

A technology of data fusion and video coding, applied in the field of video coding, aimed at the shortcomings of conventional scalable coding described below.

Status: Inactive | Publication Date: 2005-06-29
HISENSE
Cites: 0 | Cited by: 13

AI Technical Summary

Problems solved by technology

It is worth noting that the first property is fundamentally different from conventional scalable (or layered) coding: the base layer produced by conventional methods is crucial, and if the base layer is lost, the remaining bitstreams are useless.

Method used


Image

  • Multi-description video coding method based on transformation and data fusion (three drawing views)

Examples

Experimental program
Comparison scheme
Effect test

Embodiment 1

[0025] Embodiment 1. A multi-description video coding method based on transformation and data fusion, as shown in figure 2. It comprises the following steps (a minimal code sketch follows this embodiment):

[0026] ① Apply transforms 1 to n (2) to the signal to be encoded (1);

[0027] ② Quantize and entropy-code (3) each of the signals produced by transforms 1 to n (2);

[0028] ③ Decode (5) the quantized and entropy-coded (3) signals 1 to n, each received over its respective path 1 to n (4);

[0029] ④ Inverse-transform (6) the decoded signals 1 to n respectively;

[0030] ⑤ After inverse transformation (6), obtain side descriptions 1 to n, and fuse (7) the inverse-transformed data 1 to n into a central description.

[0031] Each description is a code stream conforming to the MPEG video standard, and the positions and order of the I frames in all side descriptions are exactly the same; multiple description coding based on transformation and data fusion is thus applied to ...
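The following is a minimal sketch of steps ① to ⑤ above for n = 2 descriptions. The choice of n, the use of an orthonormal DCT-II and a Walsh-Hadamard matrix as stand-ins for transforms 1 and 2, the uniform quantization step, and simple averaging as the fusion rule are all assumptions made for illustration and are not fixed by the patent; entropy coding is lossless and therefore omitted, and each side description here is a reconstructed block rather than an MPEG-compliant stream as in the embodiment.

```python
import numpy as np

# Minimal sketch of Embodiment 1 with n = 2 descriptions (assumptions: n = 2,
# one 8-sample block, uniform quantization, averaging as the fusion rule).

def dct_matrix(N):
    # Orthonormal DCT-II matrix, used here as "transform 1".
    k = np.arange(N)
    M = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * N))
    M[0, :] *= 1 / np.sqrt(2)
    return M * np.sqrt(2 / N)

def hadamard_matrix(N):
    # Orthonormal Walsh-Hadamard matrix, used as a stand-in for "transform 2"
    # (N must be a power of 2).
    H = np.array([[1.0]])
    while H.shape[0] < N:
        H = np.block([[H, H], [H, -H]])
    return H / np.sqrt(N)

def encode_description(block, T, step):
    # Steps 1-2: transform, then quantize (entropy coding is lossless and omitted).
    return np.round(T @ block / step).astype(int)

def decode_description(indices, T, step):
    # Steps 3-4: dequantize and inverse-transform (T is orthonormal, so T.T inverts it).
    return T.T @ (indices * step)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    block = rng.normal(size=8)                     # signal to be encoded (one block)
    transforms = [dct_matrix(8), hadamard_matrix(8)]
    step = 0.5                                     # quantization step (assumption)

    # Side descriptions 1..n: each path is encoded and decoded independently.
    sides = [decode_description(encode_description(block, T, step), T, step)
             for T in transforms]

    # Step 5: fuse the inverse-transformed data into the central description
    # (averaging here; the patent's fusion rule may differ).
    central = np.mean(sides, axis=0)

    for i, s in enumerate(sides, 1):
        print(f"side {i} MSE:", np.mean((s - block) ** 2))
    print("central MSE:", np.mean((central - block) ** 2))
```

With coarse quantization, the fused central description typically shows lower error than either side description alone, which is the behaviour the abstract below describes.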


PUM

No PUM

Abstract

A multi-description video coding method based on transformation and data fusion applies transforms to the signal to be encoded, quantizes and entropy-codes the transformed signals, decodes them over independent paths, inverse-transforms the decoded signals to obtain the side descriptions, and fuses the inverse-transformed data into the central description, thereby combining transformation-and-data-fusion multiple description coding with video coding. The method can generate multiple MPEG code streams from a single video sequence: a video sequence with larger distortion can be recovered from any single stream, and when multiple code streams are received, a video sequence with smaller distortion can be recovered.

Description

technical field

[0001] The invention belongs to the technical field of video coding, and more specifically relates to the design of a multi-description video coding method based on transformation and data fusion.

Background technique

[0002] Multiple description coding encodes a single signal into two (or more) independent bit streams, and these independent bit streams are called descriptions. Multiple description coding has two main properties: 1. each description can be independently decoded and reconstructed into a usable approximation of the original signal; 2. the descriptions carry complementary information, so as the number of correctly received descriptions increases, the quality of the decoded signal gradually improves. It is worth noting that the first property is fundamentally different from conventional scalable (or layered) coding. The base layer produced by conventional methods is crucial...
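The two properties can be seen in the classic even/odd-sample split, shown below. This is not the patent's transform-based method, only an illustration of the same behaviour: either description decodes on its own to a usable but coarser signal, and receiving both recovers the signal exactly. The splitting and interpolation choices are illustrative assumptions.

```python
import numpy as np

# Classic even/odd-sample illustration of the two multiple-description properties
# (not the patent's transform-based method): each description alone yields a usable
# but coarser reconstruction; both together recover the signal exactly.

def split_descriptions(x):
    # Description 1: even-indexed samples, description 2: odd-indexed samples.
    return x[0::2], x[1::2]

def reconstruct_from_one(desc, which, n):
    # Decode a single description by placing its samples and interpolating the gaps.
    y = np.full(n, np.nan)
    y[which::2] = desc
    idx = np.arange(n)
    known = ~np.isnan(y)
    return np.interp(idx, idx[known], y[known])

def reconstruct_from_both(d_even, d_odd, n):
    # Decode both descriptions: simply interleave the samples back together.
    y = np.empty(n)
    y[0::2], y[1::2] = d_even, d_odd
    return y

if __name__ == "__main__":
    t = np.linspace(0, 1, 16)
    x = np.sin(2 * np.pi * 2 * t)                   # toy signal
    d_even, d_odd = split_descriptions(x)

    r1 = reconstruct_from_one(d_even, 0, len(x))    # only description 1 received
    r2 = reconstruct_from_one(d_odd, 1, len(x))     # only description 2 received
    r12 = reconstruct_from_both(d_even, d_odd, len(x))

    print("MSE, description 1 only:", np.mean((r1 - x) ** 2))
    print("MSE, description 2 only:", np.mean((r2 - x) ** 2))
    print("MSE, both descriptions :", np.mean((r12 - x) ** 2))  # ~0: exact recovery
```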

Claims


Application Information

IPC(8): H04N19/625, H04N19/91
Inventor: 田树民
Owner: HISENSE