
Motion estimation block matching method based on H.265 video coding

A motion estimation technology for video coding, applied to digital video signal modification, electrical components, image communication, etc. It addresses the time-consuming nature of motion estimation and achieves the effect of reducing computing time and coding time while maintaining image quality and transmission bit rate.

Active Publication Date: 2017-02-22
HARBIN INST OF TECH

AI Technical Summary

Problems solved by technology

However, motion estimation is also one of the most complex and time-consuming parts of the H.265 video coding standard.



Examples


Specific Embodiment 1

[0032] Specific embodiment 1: This embodiment is described with reference to Figure 9. The specific process of a motion estimation block matching method based on H.265 video coding in this embodiment is:

[0033] Figure 1 shows a schematic diagram of the preliminary selection.

[0034] Step 1, preliminary selection stage: according to the division characteristics of the inter prediction unit, select the corresponding down-sampling scheme, and select the candidate matching group from all matching blocks according to the set adaptive threshold;

[0035] Step 2, fine selection stage: screen the candidate matching group obtained in the preliminary selection stage using the rate-distortion optimization criterion, select the final matching block, and complete the block matching process in motion estimation.

[0036] In step 1, preliminary selection effectively reduces the number of pixels to be matched while still selecting an appropriate matching block. In t...
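The two-stage procedure of steps 1 and 2 can be sketched in Python. This is a minimal illustration, not the patented implementation: `threshold` stands in for the adaptive threshold, and `rd_cost` is assumed to implement the rate-distortion optimization criterion, neither of which is fully specified in this excerpt.

```python
def block_match(current_block, candidates, threshold, rd_cost):
    """Two-stage block matching sketch.

    Stage 1 (preliminary selection): keep candidates whose SAD against
    the current block falls below the adaptive threshold.
    Stage 2 (fine selection): among the survivors, pick the candidate
    with the minimum rate-distortion cost.

    rd_cost(candidate) is an assumed callable implementing the
    rate-distortion optimization criterion.
    """
    def sad(a, b):
        # sum of absolute differences between two pixel sequences
        return sum(abs(x - y) for x, y in zip(a, b))

    group = [c for c in candidates if sad(current_block, c) < threshold]
    if not group:  # fall back to all candidates if none pass the threshold
        group = candidates
    return min(group, key=rd_cost)
```

For example, with blocks stored as flat pixel lists, `block_match([10, 10], [[50, 50], [11, 9], [10, 12]], 5, lambda c: sum(c))` first discards `[50, 50]` in the preliminary stage and then selects `[11, 9]` by the (here trivial) cost function.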

Specific Embodiment 2

[0037] Specific embodiment 2: This embodiment differs from specific embodiment 1 in the preliminary selection stage of step 1: according to the division characteristics of the inter prediction unit, select the corresponding down-sampling scheme, and select the candidate matching group from all matching blocks; the specific process is:

[0038] Step 1.1: The division modes of the inter prediction unit are shown in Figures 2a, 2b, 2c, 2d, 2e, 2f, 2g, and 2h; there are eight kinds of prediction unit divisions in the coding standard, with square and rectangular shapes;

[0039] The down-sampling templates for prediction blocks of different shapes are shown in Figures 3 and 4. For a square prediction unit, determine whether it is a 32x32 or 64x64 prediction unit, as shown in Figure 5; if so, use the 米-shaped ("rice character") down-sampling scheme; if not, use the down-sampli...
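The 米-shaped (rice-character) template can be read as sampling the two diagonals plus the central row and column of a square block. The excerpt does not give the exact pixel pattern, so the following Python sketch is an illustrative interpretation under that assumption:

```python
def rice_template(n):
    """Return sorted (row, col) indices of an n x n down-sampling
    template shaped like the character 米: both diagonals plus the
    central row and central column.

    Assumption: the patent excerpt names the 米 scheme but does not
    specify its pixels; this is one plausible reading.
    """
    c = n // 2
    pts = set()
    for i in range(n):
        pts.add((i, i))          # main diagonal
        pts.add((i, n - 1 - i))  # anti-diagonal
        pts.add((c, i))          # central row
        pts.add((i, c))          # central column
    return sorted(pts)
```

For large square prediction units (32x32, 64x64) such a template matches only a small fraction of the pixels, which is what makes the preliminary selection cheap.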

Specific Embodiment 3

[0045] Specific embodiment 3: This embodiment differs from embodiment 1 or 2 in that the selection process of the down-sampling template in steps 1 and 2 follows Figure 5, from which the appropriate template is selected. Calculate the absolute value of the single-pixel difference between the previous frame (time t) and the current frame (time t+1) in the prediction unit, using the following formula:

[0046] SAD_c=abs(piOrg[i]-piCur[i])

[0047] where piOrg denotes a pixel in the current frame, piCur denotes the corresponding pixel in the previous frame, abs is the absolute value operator, SAD_c is the absolute value of the single-pixel difference between the previous frame (time t) and the current frame (time t+1) in the prediction unit, and i indexes the i-th pixel in the prediction unit.
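Summing the per-pixel term SAD_c over the (possibly down-sampled) prediction unit gives the SAD used for matching. A minimal Python sketch follows; the function name and the `template` parameter are illustrative, not taken from the patent:

```python
def sad(piOrg, piCur, template=None):
    """Sum of absolute differences between the current-frame block
    (piOrg) and the candidate block in the previous frame (piCur),
    both given as flat pixel sequences.

    template: optional list of pixel indices from a down-sampling
    template; if None, every pixel in the prediction unit is matched.
    """
    indices = range(len(piOrg)) if template is None else template
    total = 0
    for i in indices:
        total += abs(piOrg[i] - piCur[i])  # SAD_c for pixel i
    return total
```

With a down-sampling template, only the listed pixel positions contribute, which is the source of the computation savings claimed for the preliminary selection stage.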



Abstract

The invention discloses a motion estimation block matching method based on H.265 video coding, relates to the motion estimation block matching method and aims at reducing the computing complexity of motion estimation in an H.265 video standard coding process and reducing the coding time. The motion estimation block matching method based on the H.265 video coding comprises the specific process of S1, an initial selection stage of selecting a corresponding downsampling scheme according to a division feature of an interframe prediction unit, and selecting candidate matching groups from all matching blocks according to a set threshold value; and S2, a fine selection stage of carrying out rate-distortion optimization criterion-based fine selection on the candidate matching groups obtained in the initial selection stage, thereby selecting the final matching blocks and finishing the block matching process in motion estimation. The method is applicable to the field of motion estimation block matching based on video coding.

Description

Technical field

[0001] The present invention relates to a motion estimation block matching method.

Background technique

[0002] In recent years, with the popularization of smart mobile terminals, video applications have become increasingly diverse, and the volume of video data has grown at an alarming rate. The amount of high-definition image data is huge. To meet the needs of new video applications, ITU-T cooperated with ISO/IEC and released a new-generation high-efficiency video coding standard, H.265, in 2013. H.265 adopts the traditional hybrid video coding framework and introduces technical innovations in each module of the framework, including support for more intra-frame and inter-frame prediction modes, combined transformation and quantization, sample adaptive offset, CABAC entropy coding, etc. These new technologies enable H.265 to save about 50% of the bit rate compared with H.264/AVC at the same encoding quality. H.265 brings great comp...

Claims


Application Information

IPC(8): H04N19/147, H04N19/182, H04N19/51, H04N19/567
CPC: H04N19/147, H04N19/182, H04N19/51, H04N19/567
Inventor 王进祥蔡祎炜付方发徐伟哲王瑶唐润龙
Owner HARBIN INST OF TECH