
A method and device for adaptive inverse quantization in video coding

A video coding and adaptive inverse quantization technology, applied in the field of data processing, which addresses the problem that signalled quantization side information limits coding efficiency

Active Publication Date: 2019-04-19
HUAWEI TECH CO LTD

AI Technical Summary

Problems solved by technology

[0024] The present invention provides an adaptive inverse quantization method and device in video coding. The method and device solve the prior-art problem that the side information corresponding to the quantization parameter limits, to a certain extent, the improvement of coding efficiency.



Examples


Embodiment

[0122] As shown in Figure 1, an embodiment of the present invention provides an adaptive inverse quantization method in video coding. The method includes:

[0123] Step 101: determine the image area corresponding to a first transform coefficient set in the current decoded image. The first transform coefficient set includes N transform coefficients, each of which is a transform coefficient of any color space component in the current decoded image, where N is a positive integer.

[0124] In the embodiment of the present invention, the first transform coefficient set may include N transform coefficients A(i), i = 1, 2, ..., N, where N is a positive integer, for example N = 1, 2, 4, or 16. The transform coefficients may be transform coefficients of any component of a color space such as RGB (for example, the R component).

[0125] The image area corresponding to the first transform coefficient set is the area corresponding to the first transform coefficient set in the current decoded image. Fo...
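As a rough illustration of Step 101 only (a hypothetical Python sketch, not the patented implementation; the class and function names are invented for the example), a coefficient set of one color component can carry the position and size of the transform block it came from, and the corresponding image area is then simply that block's rectangle:

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class TransformCoefficientSet:
    coeffs: List[float]      # A(i), i = 1..N, for one color space component
    component: str           # e.g. "R", "G", "B" or "Y", "Cb", "Cr"
    block_x: int             # top-left corner of the transform block in the image
    block_y: int
    block_size: int          # e.g. 2 or 4 (so N = block_size * block_size)

def image_area(cs: TransformCoefficientSet) -> Tuple[int, int, int, int]:
    """Return (x, y, width, height) of the area in the current decoded image
    that corresponds to this coefficient set (here: the transform block itself)."""
    return cs.block_x, cs.block_y, cs.block_size, cs.block_size

# Example: N = 16 coefficients of the R component from a 4x4 block at (32, 48)
cs = TransformCoefficientSet(coeffs=[0.0] * 16, component="R",
                             block_x=32, block_y=48, block_size=4)
print(image_area(cs))   # (32, 48, 4, 4)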



Abstract

The invention discloses an adaptive inverse quantization method and device in video coding. The method uses the spatial neighborhood information, the temporal neighborhood information, or both, of a transform block to estimate the statistical characteristics of the background area in which the current transform block is located, adaptively derives the quantization adjustment factor for inverse quantization, and flexibly adjusts the inverse quantization process. Compared with existing schemes that transmit quantization adjustment information in the bitstream, the method and device provided by the present invention do not require additional bit overhead to transmit the quantization adjustment information, thereby further improving coding efficiency.
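To make the idea in the abstract concrete, the following Python sketch (assumed names and a toy statistic; the actual derivation used by the invention may differ) derives a quantization adjustment factor from already-reconstructed neighboring pixels and applies it during inverse quantization, so encoder and decoder can derive the same factor without any signalled side information:

import numpy as np

def adjustment_factor(neighbor_pixels: np.ndarray) -> float:
    """Toy heuristic: a flat (low-variance) background gets a smaller effective
    quantization step, a textured background a larger one. The threshold and
    the factors here are illustrative only."""
    activity = float(np.var(neighbor_pixels))
    return 0.8 if activity < 25.0 else 1.2

def inverse_quantize(levels: np.ndarray, qstep: float,
                     neighbor_pixels: np.ndarray) -> np.ndarray:
    """Reconstruct transform coefficients; the adjustment factor is derived
    locally instead of being read from the bitstream."""
    return levels * (qstep * adjustment_factor(neighbor_pixels))

levels = np.array([3, -1, 0, 2], dtype=np.float64)
neighbors = np.full((8, 8), 128.0)     # a flat background patch around the block
print(inverse_quantize(levels, qstep=10.0, neighbor_pixels=neighbors))  # scaled by 8.0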

Description

Technical field

[0001] The invention relates to the field of data processing, and in particular to an adaptive inverse quantization method and device in video coding.

Background technique

[0002] Current video coding technologies include various video coding standards, such as H.264 / AVC, H.265 / HEVC, the Audio Video Coding Standard (AVS), and others. These standards usually adopt a hybrid coding framework, which mainly includes the following links:

[0003] prediction, transform, quantization, entropy coding, and other links.

[0004] In the prediction process, the reconstructed pixels of the already coded region are used to generate predicted pixels for the original pixels of the current coding block. Prediction methods mainly include intra prediction and inter prediction. Intra prediction uses the reconstructed pixels of the current coding block i...
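For context, the quantization and inverse-quantization links of such a hybrid framework are commonly realized as a uniform scalar quantizer; the short Python sketch below shows that textbook behaviour with an assumed fixed step size, not the adaptive scheme of the invention:

import numpy as np

def quantize(coeffs: np.ndarray, qstep: float) -> np.ndarray:
    """Encoder side: map transform coefficients to integer levels (the lossy step)."""
    return np.round(coeffs / qstep).astype(np.int32)

def dequantize(levels: np.ndarray, qstep: float) -> np.ndarray:
    """Decoder side: reconstruct approximate coefficients from the levels."""
    return levels.astype(np.float64) * qstep

coeffs = np.array([37.2, -12.9, 4.1, 0.3])
levels = quantize(coeffs, qstep=8.0)       # -> [ 5 -2  1  0]
print(dequantize(levels, qstep=8.0))       # -> [ 40. -16.   8.   0.]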


Application Information

Patent Type & Authority: Patent (China)
IPC (8): H04N19/124, H04N19/48, H04N19/182, H04N19/186, H04N19/593
CPC: H04N19/126, H04N19/136, H04N19/176, H04N19/18, H04N19/60, H04N19/119, H04N19/124, H04N19/182, H04N19/186, H04N19/184, H04N19/30
Inventors: 赵寅 (Zhao Yin), 杨海涛 (Yang Haitao), 吕卓逸 (Lyu Zhuoyi)
Owner: HUAWEI TECH CO LTD