Method and apparatus for update step in video coding using motion compensated temporal filtering

A temporal filtering and motion compensation technique applied in the field of video coding; it addresses problems such as drift and has the effect of simplifying the interpolation process and eliminating the need for an update motion vector derivation process.

Publication Date: 2008-07-02 (Inactive)
NOKIA CORP
Cites: 0, Cited by: 84

AI Technical Summary

Problems solved by the technology

Since the original frame is available only at the encoder and not at the decoder, the prediction process in the open-loop architecture may suffer from drift.

Examples

Embodiments

[0073] According to one embodiment of the present invention, block diagrams for MCTF decomposition (analysis) and MCTF synthesis are shown in Fig. 7 and Fig. 8, respectively. Block diagrams of an encoder and a decoder incorporating the MCTF module are shown in Fig. 9 and Fig. 10, respectively. Prediction-step motion compensation is required whether or not the MCTF technique is used; MCTF additionally requires a module for update-step motion compensation. The sign inverters in Fig. 7 and Fig. 8 change the sign of the motion vector components in order to obtain the inverse of the motion vector.
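
The lifting structure in [0073] can be illustrated with a small numerical sketch. The code below is not the patent's implementation: it assumes whole-pel, purely translational motion on a single frame pair (A, B), and the helper names (motion_compensate, mctf_analysis, mctf_synthesis) are hypothetical. It shows how the prediction step forms the high-pass frame, how the update step reuses the sign-inverted motion vector, and why the synthesis of Fig. 8 exactly inverts the analysis of Fig. 7.

```python
import numpy as np

def motion_compensate(frame, mv):
    """Shift a frame by an integer-pel motion vector (dy, dx).

    Toy whole-frame translation; real MCTF applies per-block motion with
    interpolation for fractional-pel vectors."""
    dy, dx = mv
    return np.roll(frame, shift=(dy, dx), axis=(0, 1))

def mctf_analysis(frame_a, frame_b, mv):
    """One lifting step of MCTF decomposition (Fig. 7, sketched).

    Prediction step: high-pass H = B - MC(A, mv).
    Update step:     low-pass  L = A + MC(H, -mv) / 2,
    where -mv is the sign-inverted motion vector mentioned in [0073]."""
    high = frame_b - motion_compensate(frame_a, mv)
    low = frame_a + motion_compensate(high, (-mv[0], -mv[1])) / 2.0
    return low, high

def mctf_synthesis(low, high, mv):
    """Inverse lifting (Fig. 8, sketched): undo the update, then the prediction."""
    frame_a = low - motion_compensate(high, (-mv[0], -mv[1])) / 2.0
    frame_b = high + motion_compensate(frame_a, mv)
    return frame_a, frame_b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = rng.integers(0, 256, size=(16, 16)).astype(np.float64)
    mv = (1, 2)                              # the "true" motion between A and B
    b = motion_compensate(a, mv)             # B is A shifted by mv
    low, high = mctf_analysis(a, b, mv)
    a_rec, b_rec = mctf_synthesis(low, high, mv)
    # Lifting is perfectly invertible regardless of the motion model used.
    assert np.allclose(a, a_rec) and np.allclose(b, b_rec)
    print("high-pass energy:", float(np.abs(high).sum()))
```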

[0074] Fig. 9 shows a block diagram of an MCTF-based encoder according to an embodiment of the present invention. The MCTF decomposition module includes both a prediction step and an update step. This module generates prediction residuals and some side information including block segmentation, reference f...
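
Paragraph [0074] is truncated here, but the visible part says the MCTF decomposition module outputs both the prediction residuals and side information such as block segmentation and reference data. As a purely illustrative sketch (the field names and mode table are hypothetical, not the patent's syntax), such side information could be carried per macroblock as below, with the macroblock mode determining the coding-block sizes that the update step later operates on.

```python
from dataclasses import dataclass, field
from typing import Any, List, Tuple

# Hypothetical macroblock modes: how a 16x16 macroblock is segmented into
# coding blocks for the prediction step (and hence for the update step).
MB_MODES = {
    "16x16": [(16, 16)],
    "16x8":  [(16, 8), (16, 8)],
    "8x16":  [(8, 16), (8, 16)],
    "8x8":   [(8, 8)] * 4,
}

@dataclass
class MacroblockSideInfo:
    """Side information a decoder needs to repeat the update step (illustrative)."""
    mb_mode: str                           # key into MB_MODES
    ref_idx: int                           # reference frame index
    motion_vectors: List[Tuple[int, int]]  # one (dy, dx) per coding block

    def block_sizes(self) -> List[Tuple[int, int]]:
        return MB_MODES[self.mb_mode]

@dataclass
class MctfDecompositionOutput:
    """What the MCTF decomposition module hands to the rest of the encoder."""
    high_pass_residual: Any                # prediction residue frame
    low_pass_frame: Any                    # low-pass frame after the update step
    side_info: List[MacroblockSideInfo] = field(default_factory=list)

if __name__ == "__main__":
    mb = MacroblockSideInfo("16x8", ref_idx=0, motion_vectors=[(1, -2), (0, 3)])
    # The update step walks the coding blocks implied by the macroblock mode and
    # reuses the sign-inverted motion vectors, here (-1, 2) and (0, -3).
    print(mb.block_sizes(), [(-dy, -dx) for dy, dx in mb.motion_vectors])
```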

Abstract

The present invention provides a method and module for performing the update operation in motion compensated temporal filtering for video coding. The update operation is performed on the coding blocks of the prediction residue frame. Depending on the macroblock mode used in the prediction step, coding blocks can have different sizes; macroblock modes specify how a macroblock is segmented into blocks. The reversed motion vectors from the prediction step are used directly as update motion vectors, so no motion vector derivation process is performed. Motion vectors that deviate significantly from their neighboring motion vectors are considered unreliable and are excluded from the update step. An adaptive filter, formed as an adaptive combination of a short filter and a long filter, is used to interpolate the prediction residue block for the update operation.
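
The abstract bundles three mechanisms: a block-wise update driven by the macroblock mode, reuse of the sign-inverted prediction motion vectors with a reliability test against neighboring vectors, and an adaptive interpolation filter blended from a short and a long filter. The sketch below is only one way these ideas could fit together; the median-plus-threshold reliability rule, the 2-tap/6-tap filter sets, and the fixed blending weight are assumptions, not the claimed design.

```python
import numpy as np

# Assumed tap sets: a short (bilinear, 2-tap) and a long (6-tap) half-pel filter.
SHORT_TAPS = np.array([1.0, 1.0]) / 2.0
LONG_TAPS = np.array([1.0, -5.0, 20.0, 20.0, -5.0, 1.0]) / 32.0

def mv_is_reliable(mv, neighbour_mvs, threshold=4):
    """Drop motion vectors that deviate strongly from their neighbours.

    The patent only states that strongly deviating vectors are excluded from
    the update step; this median-plus-threshold rule is one simple stand-in."""
    if not neighbour_mvs:
        return True
    med = np.median(np.asarray(neighbour_mvs, dtype=np.float64), axis=0)
    return float(np.abs(np.asarray(mv, dtype=np.float64) - med).max()) <= threshold

def interp_half_pel_rows(block, weight_long):
    """Interpolate horizontal half-pel samples with an adaptively blended filter.

    In practice the blend weight would come from local signal statistics; here
    it is simply passed in."""
    taps = (1.0 - weight_long) * np.pad(SHORT_TAPS, (2, 2)) + weight_long * LONG_TAPS
    padded = np.pad(block, ((0, 0), (2, 3)), mode="edge")
    out = np.zeros_like(block, dtype=np.float64)
    for k, t in enumerate(taps):
        out += t * padded[:, k:k + block.shape[1]]
    return out

def update_block(low_band, residue, block_pos, block_size, update_mv,
                 neighbour_mvs, weight_long=0.5, strength=0.5):
    """Apply the update step to one coding block (a sketch, not the spec).

    update_mv is simply the sign-inverted prediction motion vector, so no
    separate derivation step is needed; unreliable vectors skip the update."""
    if not mv_is_reliable(update_mv, neighbour_mvs):
        return
    y, x = block_pos
    h, w = block_size
    dy, dx = update_mv
    # Toy motion compensation: integer shift of the residue frame, then a
    # half-pel horizontal phase produced by the blended filter.
    shifted = np.roll(residue, shift=(dy, dx), axis=(0, 1))
    interpolated = interp_half_pel_rows(shifted, weight_long)
    low_band[y:y + h, x:x + w] += strength * interpolated[y:y + h, x:x + w]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    low = rng.standard_normal((16, 16))
    residue = rng.standard_normal((16, 16))
    # The prediction MV for this 8x8 block was (2, -1), so the update MV is (-2, 1).
    update_block(low, residue, block_pos=(0, 0), block_size=(8, 8),
                 update_mv=(-2, 1), neighbour_mvs=[(-2, 0), (-1, 1), (-2, 2)])
    print(low[:2, :2])
```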

Description

Technical Field
[0001] The present invention relates generally to video coding, and in particular to video coding using motion compensated temporal filtering.
Background Art
[0002] Digital video is compressed for storage and broadcasting purposes so that the resulting compressed video can be stored in a smaller space.
[0003] Digital video is a sequence of digital images, like ordinary motion pictures recorded on film: a sequence of still images in which the illusion of motion is produced by displaying the images one after another at a relatively rapid frame rate, usually 15 to 30 frames per second. A common approach to compressing digital video is to exploit the redundancy between these successive images (i.e., temporal redundancy). In typical video, a few moving objects combined with slow or no camera movement mean that successive images have similar content. It is therefore beneficial to transmit only the differences between successive images. i...
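
As a quick numerical illustration of the temporal-redundancy argument in [0003] (not part of the patent), the sketch below builds a smooth synthetic frame, pans it by one pixel to imitate slow camera motion, and compares the energy of the raw frame with the energy of the frame difference that a differential coder would transmit.

```python
import numpy as np

# A smooth synthetic frame and a copy panned one pixel to the right.
y, x = np.mgrid[0:64, 0:64]
prev = 128.0 + 100.0 * np.sin(2 * np.pi * x / 64.0) * np.cos(2 * np.pi * y / 64.0)
curr = np.roll(prev, shift=1, axis=1)

residual = curr - prev  # what transmitting "only the differences" amounts to
print("mean squared value of the frame:     ", float(np.square(curr).mean()))
print("mean squared value of the difference:", float(np.square(residual).mean()))
# For slowly moving, smooth content the difference is far smaller, and hence
# far cheaper to code, than the frame itself.
```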

Application Information

Patent Type & Authority: Application (China)
IPC(8): H04N7/26
CPC: H04N19/00781, H04N19/00278, H04N19/00121, H04N19/00818, H04N7/364, H04N19/00145, H04N19/0063, H04N19/00066, H04N19/00896, H04N19/00739, H04N19/00787, H04N19/00684, H04N19/00072, H04N19/00703, H04N19/117, H04N19/119, H04N19/13, H04N19/137, H04N19/176, H04N19/513, H04N19/521, H04N19/523, H04N19/61, H04N19/615, H04N19/63, H04N19/82
Inventors: Xianglin Wang, M. Karczewicz, Yiliang Bao, J. Ridge
Owner: NOKIA CORP