
Video affine motion estimation method of adaptive factors

An adaptive-factor affine motion technology, applied in the fields of digital video signal modification, electrical components, image communication, etc., which addresses problems such as the inability to achieve real-time motion estimation/compensation and the limited practicality of existing affine methods.

Active Publication Date: 2019-04-16
LIAONING NORMAL UNIVERSITY

AI Technical Summary

Problems solved by technology

However, the solution processes of the above three methods all involve a large number of sub-pixel-precision interpolation operations, and their computational complexity can even far exceed that of a full search under the block-translation model. They therefore cannot achieve real-time motion estimation/compensation, which greatly limits their practicality.



Examples


Embodiment Construction

[0056] The video affine motion estimation method with adaptive factors of the present invention is carried out according to the following steps:

[0057] Step 1. If all frames of the current Group of Pictures (GOP) have been processed, the algorithm ends; otherwise, select an unprocessed frame in the current GOP as the current frame, and use its previous frame as the reference frame;

[0058] Step 2. If all macroblocks of the current frame have been processed, go to Step 1; otherwise, select an unprocessed macroblock of the current frame as the current macroblock. Its side length in pixels is a preset constant, and the abscissa and ordinate of the pixel in its upper-left corner are recorded;

[0059] Step 3. According to the definition of formula (1), use the diamond search method within a search window of preset size to calculate the translational motion vector of the current macroblock...
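Step 3's diamond search is a standard fast block-matching pattern. The patent's formula (1) (its matching-cost definition) is not reproduced on this page, so the sketch below assumes a plain sum-of-absolute-differences (SAD) cost; function and variable names are illustrative, not taken from the patent.

```python
import numpy as np

def sad(block, candidate):
    """Sum of absolute differences between two equally sized blocks."""
    return np.abs(block.astype(np.int64) - candidate.astype(np.int64)).sum()

def diamond_search(cur, ref, x, y, size, search_range=7):
    """Estimate the translational motion vector of the size x size
    macroblock whose top-left corner is (x, y) in `cur`, by diamond
    search over `ref` within +/- search_range pixels.

    Returns the (dx, dy) found by the large-diamond / small-diamond
    two-stage pattern, minimizing the SAD cost.
    """
    block = cur[y:y + size, x:x + size]
    h, w = ref.shape

    def cost(dx, dy):
        rx, ry = x + dx, y + dy
        if rx < 0 or ry < 0 or rx + size > w or ry + size > h:
            return np.inf  # candidate falls outside the reference frame
        return sad(block, ref[ry:ry + size, rx:rx + size])

    # Large diamond search pattern (LDSP) and small refinement pattern (SDSP).
    large = [(0, 0), (2, 0), (-2, 0), (0, 2), (0, -2),
             (1, 1), (1, -1), (-1, 1), (-1, -1)]
    small = [(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)]

    cx, cy = 0, 0  # current search center (motion-vector candidate)
    # LDSP stage: move the center until it is the best point of its diamond.
    while True:
        best = min(large, key=lambda d: cost(cx + d[0], cy + d[1]))
        if best == (0, 0):
            break
        nx, ny = cx + best[0], cy + best[1]
        if abs(nx) > search_range or abs(ny) > search_range:
            break
        cx, cy = nx, ny
    # SDSP stage: one small-diamond refinement around the converged center.
    best = min(small, key=lambda d: cost(cx + d[0], cy + d[1]))
    return cx + best[0], cy + best[1]
```

For example, if `cur` is `ref` shifted left by two columns, the search recovers the motion vector (2, 0) without visiting all (2·7+1)² candidate positions.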



Abstract

The invention discloses a video affine motion estimation method with adaptive factors. The method comprises the following steps: first, judge the zoom factor of the current macroblock to be predicted by using the linear weighted prediction error corresponding to the translational motion vector and the 2-D weighted autocorrelation of the reference frame; second, keeping the zoom factor unchanged, express the affine motion compensation error as a quadratic function of the rotation angle, calculate the adaptive rotation angle under affine motion through Vieta's theorem, and then obtain the affine motion vector of the current macroblock to be predicted. The method avoids the brute-force parameter search or iterative solution of traditional affine motion estimation, directly computes the optimal zoom factor and rotation angle, and significantly reduces the number of sub-pixel interpolation operations, thereby improving the compensation quality of traditional block-matching motion estimation while preserving real-time performance.
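The closed-form rotation-angle step in the abstract can be illustrated abstractly. With the zoom factor held fixed, the compensation error is modeled as a quadratic E(θ) = aθ² + bθ + c. Vieta's formulas state that the two roots of such a quadratic sum to −b/a, and a parabola is symmetric about the midpoint of its roots, so the minimizing angle is −b/(2a). How the coefficients a, b, c are derived from the video data is the substance of the patent and is not reproduced on this page; the helper below is only a generic sketch of the final minimization.

```python
def optimal_rotation_angle(a, b):
    """Minimizer of E(theta) = a*theta**2 + b*theta + c, with a > 0.

    By Vieta's formulas the roots of the quadratic sum to -b/a; the
    parabola is symmetric about their midpoint, so the minimum lies
    at theta* = -b / (2*a). The constant term c does not affect it.
    """
    if a <= 0:
        raise ValueError("error surface must be convex (a > 0)")
    return -b / (2.0 * a)
```

For instance, E(θ) = 2θ² − 4θ + 5 is minimized at θ* = 1, and no iterative search over candidate angles is needed.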

Description

Technical field

[0001] The invention relates to the field of video coding and compression, and in particular to a video affine motion estimation method that runs fast, delivers high motion compensation quality, and can effectively predict the adaptive factors of the affine motion present in a video.

Background technique

[0002] Motion estimation is an effective temporal prediction technique, and most of the improvement in video coding efficiency over the years has come from better motion estimation algorithms. However, the computing resources consumed by the motion estimation stage often account for more than 50% of the encoder's total resources, and sometimes up to 80%. To achieve a better compromise between complexity and prediction accuracy, existing video coding standards generally use block matching motion estimation algorithms based on translation models, and a variety of fast block matching strategies have been proposed, such as downsamp...
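The translation-model baseline mentioned in the background is exhaustive (full-search) block matching, whose cost the fast strategies are designed to cut. A minimal sketch, assuming a SAD cost and illustrative names:

```python
import numpy as np

def full_search(cur, ref, x, y, size=16, search_range=7):
    """Exhaustive translational block matching: test every integer
    displacement within +/- search_range and return the (dx, dy)
    minimizing the sum of absolute differences (SAD).

    This (2*search_range + 1)**2 candidate sweep is the baseline that
    fast strategies such as diamond search approximate cheaply.
    """
    block = cur[y:y + size, x:x + size].astype(np.int64)
    h, w = ref.shape
    best, best_cost = (0, 0), None
    for dy in range(-search_range, search_range + 1):
        for dx in range(-search_range, search_range + 1):
            rx, ry = x + dx, y + dy
            if rx < 0 or ry < 0 or rx + size > w or ry + size > h:
                continue  # candidate block falls outside the frame
            cand = ref[ry:ry + size, rx:rx + size].astype(np.int64)
            cost = np.abs(block - cand).sum()
            if best_cost is None or cost < best_cost:
                best, best_cost = (dx, dy), cost
    return best
```

With a 7-pixel range this evaluates 225 candidates per macroblock, which is why the background stresses the resource cost of the motion estimation stage.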

Claims


Application Information

IPC (IPC(8)): H04N19/51; H04N19/176; H04N19/523; H04N19/177; H04N19/59
CPC: H04N19/176; H04N19/177; H04N19/51; H04N19/523; H04N19/59
Inventors: 宋传鸣, 闫小红, 葛明博, 王相海
Owner LIAONING NORMAL UNIVERSITY