Training method and device of interpolation filter, video image encoding and decoding method, and codec

A technology for training interpolation filters, applied in the field of video encoding and decoding, which addresses the poor prediction accuracy and poor encoding/decoding performance of existing methods, and achieves the effects of improved encoding and decoding performance, a reduced code stream, and higher prediction accuracy.

Pending Publication Date: 2020-04-14
HUAWEI TECH CO LTD +1


Problems solved by technology

[0005] However, in the inter prediction mode, when the motion vector points to a sub-pixel position, sub-pixel interpolation must be performed on the best-matching reference block. In the prior art, interpolation filters with fixed coefficients are usually used for sub-pixel interpolation. For today's diverse and non-stationary video signals, the prediction accuracy of such filters is poor, resulting in poor encoding and decoding performance for the video image.
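As context for the fixed-coefficient approach criticized above, the sketch below applies an 8-tap half-sample filter of the kind used in conventional codecs. The coefficients shown are the HEVC luma half-pel taps; the function name and sample data are illustrative assumptions, not part of the patent.

```python
# Fixed-coefficient sub-pixel interpolation (1-D, half-pel position).
# Taps are the HEVC luma half-sample filter; they sum to 64.
HALF_PEL_TAPS = [-1, 4, -11, 40, 40, -11, 4, -1]

def interpolate_half_pel(row, x):
    """Interpolate the half-sample position between row[x] and row[x + 1].

    `row` holds integer-position luma samples; the filter needs
    3 samples to the left of x and 4 to the right.
    """
    acc = sum(c * row[x - 3 + i] for i, c in enumerate(HALF_PEL_TAPS))
    return (acc + 32) >> 6  # round and normalise by 64

samples = [10, 12, 20, 40, 80, 120, 140, 150, 152, 155]
print(interpolate_half_pel(samples, 4))  # -> 102, between samples[4] and [5]
```

Because the taps never change, the same weighting is applied to every block regardless of its content, which is exactly the limitation the patent's learnable second filter is meant to address.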



Examples


Embodiment 1

[0449] In a non-target inter prediction mode (such as a non-merge mode), the entropy decoding unit 1601 is specifically configured to parse the index of the motion information of the image block to be decoded from the code stream;

[0450] The inter prediction unit 1602 is further configured to determine the motion information of the currently decoded image block based on the index of the motion information of the currently decoded image block and the candidate motion information list of the currently decoded image block.
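A minimal sketch of the lookup described in [0449]–[0450]: the index parsed from the code stream selects the motion information directly from the candidate list. All names and the candidate tuple format are illustrative assumptions, not taken from the patent.

```python
# Merge-style selection: the parsed index picks one entry from the
# decoder-built candidate motion information list.
def motion_info_from_index(candidate_list, index):
    # candidate_list: (mv_x, mv_y, ref_idx) tuples built for the
    # current block; index: parsed from the code stream.
    return candidate_list[index]

candidates = [(4, -2, 0), (0, 8, 1), (-6, 3, 0)]
print(motion_info_from_index(candidates, 1))  # -> (0, 8, 1)
```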

Embodiment 2

[0452] In a non-target inter prediction mode (such as a non-merge mode), the entropy decoding unit 1601 is specifically configured to: parse out the index and motion vector difference of the motion information of the image block to be decoded from the code stream;

[0453] The inter prediction unit 1602 is further configured to: determine the motion vector predictor of the currently decoded image block based on the index of the motion information of the currently decoded image block and the candidate motion information list of the currently decoded image block; and obtain the motion vector of the currently decoded image block based on the motion vector predictor and the motion vector difference.

Embodiment 3

[0454] In a non-target inter prediction mode (such as a non-merge mode), the entropy decoding unit 1601 is specifically configured to parse the index of the motion information and the motion vector difference of the image block to be decoded from the code stream;

[0455] The inter prediction unit 1602 is further configured to determine the motion vector predictor of the currently decoded image block based on the index of the motion information of the currently decoded image block and the candidate motion information list of the currently decoded image block; and further, to obtain the motion vector of the currently decoded image block based on the motion vector predictor and the motion vector difference.
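The non-merge reconstruction in [0452]–[0455] can be sketched as follows: the parsed index selects a motion vector predictor from the candidate list, and the parsed motion vector difference is added to it. Names and the quarter-pel candidate values are illustrative assumptions.

```python
# Non-merge mode: MV = MVP + MVD.
def reconstruct_mv(candidate_list, mvp_index, mvd):
    mvp = candidate_list[mvp_index]            # motion vector predictor
    return (mvp[0] + mvd[0], mvp[1] + mvd[1])  # add the parsed difference

candidates = [(16, -8), (0, 4)]  # quarter-pel MV candidates (illustrative)
mv = reconstruct_mv(candidates, mvp_index=1, mvd=(3, -2))
print(mv)  # -> (3, 2)
```

If the resulting motion vector points to a sub-pixel position, the decoder then invokes the target interpolation filter described elsewhere in the application.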

[0456] In a possible implementation of the embodiment of this application, if the target filter is the second interpolation filter obtained by any of the training methods of the interpolation fi... shown in Figures 6A-6D above



Abstract

The embodiment of the invention discloses a training method and device for an interpolation filter, a video image encoding and decoding method, and a codec. In the training method, a traditional interpolation filter is used for interpolation to obtain a first sub-pixel image, which serves as label data for training a second interpolation filter; the trained second interpolation filter can then be used directly to interpolate the pixel value at the first sub-pixel position. Because the label data is more accurate, the encoding and decoding performance of the video image is improved. In the encoding method, during inter prediction, a target interpolation filter for the currently encoded image block is determined from a set of candidate interpolation filters, so that the encoder selects a suitable interpolation filter for the interpolation operation according to the content of the currently encoded image block. The resulting prediction block is therefore more accurate, the code stream is reduced, and the compression ratio of the video image is improved.
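A toy sketch of the training idea in the abstract, under heavy simplification: here the "traditional" filter is a bilinear half-pel filter whose output serves as the label data, and the "second" filter is a learnable 2-tap filter fitted by stochastic gradient descent. The real scheme trains a neural-network interpolation filter; everything below is illustrative.

```python
# Train a learnable filter to reproduce the output of a fixed
# (traditional) interpolation filter, which provides the labels.
import random

random.seed(0)
ref_taps = [0.5, 0.5]  # traditional bilinear half-pel filter
samples = [random.uniform(0, 255) for _ in range(200)]

# Label data: half-pel values produced by the traditional filter.
labels = [ref_taps[0] * samples[i] + ref_taps[1] * samples[i + 1]
          for i in range(len(samples) - 1)]

taps = [0.2, 0.9]  # learnable "second" filter, poor initial guess
lr = 1e-5
for _ in range(500):
    for i, y in enumerate(labels):
        pred = taps[0] * samples[i] + taps[1] * samples[i + 1]
        err = pred - y
        taps[0] -= lr * err * samples[i]      # gradient of squared error
        taps[1] -= lr * err * samples[i + 1]

print([round(t, 3) for t in taps])  # converges toward [0.5, 0.5]
```

The point of the sketch is only the data flow: the fixed filter supplies supervision, so the learned filter can later replace it and be further specialized to block content.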

Description

Technical Field

[0001] The present application relates to the technical field of video encoding and decoding, and in particular to a training method and device for an interpolation filter, a video image encoding and decoding method, and a codec.

Background Technique

[0002] Digital video capabilities can be incorporated into a wide variety of devices, including digital televisions, digital broadcast systems, wireless broadcast systems, personal digital assistants (PDAs), laptop or desktop computers, tablet computers, electronic book readers, digital cameras, digital recording devices, digital media players, video game devices, video game consoles, cellular or satellite radiotelephones (so-called "smart phones"), video teleconferencing devices, video streaming devices, and the like. Digital video devices implement video compression techniques, such as those defined in MPEG-2, MPEG-4, ITU-T H.263, ITU-T H.264/MPEG-4 Part 10 Advanced Video Coding (AVC), video compression technique...


Application Information

Patent Type & Authority: Application (China)
IPC (8): H04N19/176, H04N19/42, H04N19/503, H04N19/80
CPC: H04N19/176, H04N19/42, H04N19/503, H04N19/80, H04N19/59, H04N19/117, H04N19/147, H04N19/109, H04N19/513, H04N19/587, H04N19/184, H04N19/82, H04N19/51
Inventor: Wu Feng, Yan Ning, Liu Dong, Li Houqiang, Yang Haitao
Owner HUAWEI TECH CO LTD