
Method of motion vector derivation for video coding

A motion vector and motion information technology applied in the field of video coding, addressing problems such as the difficulty of accurately describing actual motion and the inability to capture complex motion.

Active Publication Date: 2017-03-22
HFI INNOVATION INC
Cites: 11 · Cited by: 52

AI Technical Summary

Problems solved by technology

However, in a typical video sequence, such translational motion can hardly describe the complex actual motion in the content accurately. Therefore, block-based motion estimation cannot capture complex motions of moving objects, such as rotation, scaling, and deformation.



Examples


Embodiment Construction

[0019] The following description is of the best mode contemplated for carrying out the invention. This description is intended to illustrate the general principles of the invention and should not be construed as limiting the invention. The scope of the invention is best determined by reference to the appended claims.

[0020] In order to provide an improved motion description for a block, according to an embodiment of the present invention, a current block (e.g., a PU in HEVC) is partitioned into multiple sub-blocks (e.g., sub-PUs). An MV (also referred to as the derived sub-block MV) is derived from the motion model for each sub-block. For example, each PU can be partitioned into multiple sub-PUs, and the MV for each sub-PU is derived from the motion-model function F(x, y), where (x, y) is the position of the sub-PU and F is a function representing the motion model.
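As a minimal illustration of this partitioning step, the following C sketch derives one MV per sub-block by evaluating a motion-model function F(x, y) at each sub-block position. The type names, the function-pointer signature, and the caller-chosen sub-block size are assumptions made for illustration; they are not taken from the patent text.

```c
#include <stdint.h>

/* Illustrative sketch only: derive one motion vector per sub-block of a
 * current block (e.g., a PU) by evaluating a motion-model function F(x, y)
 * at each sub-block position. Names and signatures are assumptions. */

typedef struct {
    int16_t x;
    int16_t y;
} MotionVector;

/* F(x, y): returns the model-based MV at sub-block position (x, y),
 * given opaque model parameters (e.g., affine coefficients). */
typedef MotionVector (*MotionModelFn)(int x, int y, const void *params);

/* Fill mv_out (row-major, one entry per sub-block) with the derived MVs. */
void derive_sub_block_mvs(int blk_width, int blk_height,
                          int sub_w, int sub_h,
                          MotionModelFn F, const void *params,
                          MotionVector *mv_out)
{
    int idx = 0;
    for (int y = 0; y < blk_height; y += sub_h) {
        for (int x = 0; x < blk_width; x += sub_w) {
            /* Evaluate the motion model at the sub-block position
             * (here its top-left sample) to obtain the sub-block MV. */
            mv_out[idx++] = F(x, y, params);
        }
    }
}
```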

[0021] In one embodiment, the derived sub-block motion vector MV_cur for a sub-PU is derived by an affine motion model:...
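The equation itself is truncated in this excerpt. For orientation only, a commonly used six-parameter affine motion model (an assumed form, not necessarily the patent's exact formulation) evaluates the sub-block MV at position (x, y) as:

```latex
% Assumed standard six-parameter affine motion model (illustration only;
% the patent's exact equation is not shown in this excerpt).
\mathrm{MV\_cur}(x, y) =
\begin{pmatrix} v_x \\ v_y \end{pmatrix} =
\begin{pmatrix} a x + b y + e \\ c x + d y + f \end{pmatrix}
```

where a, b, c, d are the affine coefficients, e and f are the translational components, and (x, y) is the sub-PU position within the block.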



Abstract

A method and apparatus for deriving a sub-block motion vector for the current sub-block based on a motion-model function depending on the current sub-block location are disclosed. The derived sub-block motion vector is then used for encoding or decoding the sub-block. The motion-model function may correspond to an affine motion-model function or a bilinear motion-model function. In one embodiment, a new Merge mode can be used to predict a current block by applying prediction on a sub-block basis using the sub-block motion vectors derived from the motion-model function. In another embodiment, an additional inter prediction mode can be used in the same way, applying prediction on a sub-block basis using the sub-block motion vectors derived from the motion-model function.
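The abstract mentions a bilinear motion-model function as an alternative to the affine model. One common bilinear form (again an assumption for illustration, not the patent's stated equation) interpolates the four corner motion vectors of a W x H block at position (x, y):

```latex
% Assumed bilinear motion-model form (illustration only): interpolate the
% four corner MVs of a W x H block at sub-block position (x, y).
\mathrm{MV}(x, y) = (1-u)(1-v)\,\mathrm{MV}_{TL} + u(1-v)\,\mathrm{MV}_{TR}
                  + (1-u)\,v\,\mathrm{MV}_{BL} + u\,v\,\mathrm{MV}_{BR},
\qquad u = x / W, \quad v = y / H
```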

Description

[0001] 【CROSS-REFERENCE TO RELATED APPLICATIONS】

[0002] The present invention claims priority to the PCT patent application with serial number PCT/CN2014/082523, filed on July 18, 2014. This PCT patent application is incorporated herein by reference.

[0003] 【Technical Field】

[0004] The present invention relates to video coding. In particular, the present invention relates to motion vector derivation of sub-blocks based on motion-model functions for video coding.

[0005] 【Background Art】

[0006] Motion estimation is an efficient inter-frame coding technique that exploits temporal redundancy in video sequences. Motion-compensated inter-frame coding has been widely used in various international video coding standards. The motion estimation employed in various coding standards is generally a block-based technique, where motion information (e.g., coding mode and motion vectors) is determined for each macroblock or similar block configuration. Furthermore, in...


Application Information

IPC (8): H04N19/176
CPC: H04N19/46; H04N19/96; H04N19/527; H04N19/537; H04N19/52; H04N19/513; H04N19/70
Inventor: 黄晗
Owner: HFI INNOVATION INC