
Method for selecting reference field and acquiring time-domain motion vector

A motion vector prediction technology, applied to television, electrical components, and digital video signal modification, that solves the problem of large motion vector prediction errors and achieves the effects of eliminating jumping, improving coding efficiency significantly, and raising subjective evaluation scores.

Active Publication Date: 2012-05-09
GUANGZHOU KUVISION DIGITAL TECH COMPANY

AI Technical Summary

Problems solved by technology

[0005] In view of this, it is necessary to provide a method for selecting a reference field and obtaining a time-domain motion vector that addresses the problem of large motion vector prediction errors in the direct and skip modes of B-frame bottom-field coding in the AVS video encoder.

Method used




Embodiment Construction

[0047] The purpose of the present invention is to provide a method that effectively avoids the large motion vector prediction errors in the direct and skip modes of B-frame bottom-field coding in the AVS video encoder. While keeping objective quality unchanged, the method effectively reduces the coding bit rate and noticeably improves subjective quality.

[0048] In order to achieve the above purpose, the present invention provides a method for obtaining time-domain motion vectors.

[0049] The specific implementation steps are as follows:

[0050] If the current macroblock type is B_Skip or B_Direct_16*16, or the current subblock type is SB_Direct_8*8, perform the following operations for each 8*8 block:

[0051] First step:

[0052] (1) If, in the backward reference image, the macroblock containing the sample co-located with the top-left sample of the current 8*8 block is coded as "I_8*8", then the forward and backward r...
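The remainder of the derivation is truncated in this excerpt, but the general shape of temporal direct/skip motion vector derivation can be sketched. Below is a minimal C sketch, assuming plain proportional scaling of the co-located motion vector by forward/backward block distances and a zero-motion fallback when the co-located macroblock is intra-coded; the helper names, rounding, and constants are illustrative assumptions, not the exact AVS1-P2 fixed-point formulas.

```c
/* Minimal sketch of temporal direct/skip MV derivation for one 8x8 block.
 * Simplified illustration only: helper names, rounding, and the zero-motion
 * fallback convention are assumptions, not the AVS1-P2 specification. */
#include <stdio.h>

typedef struct { int x, y; } MV;

/* Scale the co-located reference MV to the current block's forward and
 * backward references by the ratio of temporal block distances. */
static void derive_direct_mv(MV mv_ref, int dist_ref, int dist_fw, int dist_bw,
                             int colocated_is_intra, MV *mv_fw, MV *mv_bw)
{
    if (colocated_is_intra || dist_ref == 0) {
        /* Common convention when the co-located block has no usable MV:
         * fall back to zero motion (assumption; the exact rule is in the
         * truncated text above). */
        mv_fw->x = mv_fw->y = 0;
        mv_bw->x = mv_bw->y = 0;
        return;
    }
    /* Plain integer proportional scaling; the real standard expresses the
     * same ratio in a fixed-point multiply/shift form. */
    mv_fw->x = mv_ref.x * dist_fw / dist_ref;
    mv_fw->y = mv_ref.y * dist_fw / dist_ref;
    mv_bw->x = -(mv_ref.x * dist_bw / dist_ref);
    mv_bw->y = -(mv_ref.y * dist_bw / dist_ref);
}

int main(void)
{
    MV mv_ref = { 8, -6 }, mv_fw, mv_bw;
    /* Example field distances: reference span 4, forward 1, backward 3. */
    derive_direct_mv(mv_ref, 4, 1, 3, 0, &mv_fw, &mv_bw);
    printf("mvFw=(%d,%d) mvBw=(%d,%d)\n", mv_fw.x, mv_fw.y, mv_bw.x, mv_bw.y);
    return 0;
}
```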



Abstract

The invention relates to a method for selecting a reference field and a time-domain motion vector in skip or direct mode. The method comprises the steps of: first judging the current macroblock type and the current sub-block type, and then performing the following operations according to the type: assigning values to the relevant variables of the current block according to its image structure; defining the opposite-field vertical vector offset compensation; defining block-distance formulas between the backward reference block and itself / the forward reference block / the backward reference block; assigning a value to mvRef_y; and calculating the forward / backward motion vector of the current block from the formulas and the assigned values. The backward motion vector mvBw of the current block can be calculated in the same way. For typical interlaced video sequences in which the direct and skip modes are well suited, the method can raise coding efficiency by 10% to 20% overall, a very significant effect. In addition, the method eliminates severe jumping distortion in bottom-field video, and the subjective evaluation score for bitstreams produced by original AVS (Audio Video coding Standard) reference-grade field coding can be greatly increased.
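The "opposite-field vertical vector offset compensation" mentioned in the abstract can be pictured with a small sketch. This is a hedged illustration, assuming quarter-sample vertical motion units and a half-field-line shift between opposite-parity fields; the magnitude (2) and the sign convention are assumptions, not values taken from the patent.

```c
/* Sketch of opposite-field vertical offset compensation: top and bottom
 * fields are offset by half a frame line, so a motion vector that points
 * to an opposite-parity field needs a vertical correction. The +/-2
 * quarter-sample magnitude and sign convention are assumptions. */
#include <stdio.h>

static int compensate_vertical(int mv_y, int cur_is_bottom, int ref_is_bottom)
{
    if (cur_is_bottom == ref_is_bottom)
        return mv_y;                  /* same parity: no offset needed */
    /* Opposite parity: correct by half a field line (2 quarter samples),
     * the direction depending on which field lies higher. */
    return cur_is_bottom ? mv_y + 2 : mv_y - 2;
}

int main(void)
{
    printf("bottom->top: %d\n", compensate_vertical(0, 1, 0));
    printf("top->bottom: %d\n", compensate_vertical(0, 0, 1));
    printf("same parity: %d\n", compensate_vertical(0, 1, 1));
    return 0;
}
```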

Description

Technical field

[0001] The present invention relates to a method for selecting a reference field and time-domain motion vector prediction in skip or direct mode.

Background technique

[0002] AVS1-P2 is the video part of the advanced audio and video coding standard approved in China in February 2006. In video coding, interlaced signals are usually coded with modes different from those used for progressive signals in order to improve coding efficiency. These interlaced coding modes are generally classified into macroblock-based and field-based interlaced coding modes. AVS1-P2 provides a field-based interlaced coding mode.

[0003] In video coding, B frames achieve higher coding efficiency than P frames by using both forward and backward reference frames. The direct and skip modes of B frames in particular can provide very high compression ratios, but their effectiveness depends directly on the accuracy of motion vector prediction. Generally, moti...
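Since direct/skip-mode scaling divides by temporal block distances, and field coding produces two fields per frame, the distances that enter the scaling are counted in field periods. A small sketch of that bookkeeping follows, with an index convention assumed purely for illustration (not taken from AVS1-P2).

```c
/* Toy bookkeeping for field-based temporal distances: each frame yields a
 * top and a bottom field, so distances used for MV scaling are counted in
 * field periods. The 2*frame + parity indexing is an assumed convention. */
#include <stdio.h>

static int field_index(int frame, int is_bottom) { return 2 * frame + is_bottom; }

int main(void)
{
    int cur    = field_index(2, 1);  /* bottom field of frame 2 */
    int fw_ref = field_index(1, 0);  /* top field of frame 1    */
    int bw_ref = field_index(3, 0);  /* top field of frame 3    */
    printf("forward distance:  %d field periods\n", cur - fw_ref);
    printf("backward distance: %d field periods\n", bw_ref - cur);
    return 0;
}
```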

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04N7/26, H04N7/32, H04N7/36, H04N19/176, H04N19/513
Inventor: 曾志华 (Zeng Zhihua)
Owner: GUANGZHOU KUVISION DIGITAL TECH COMPANY