
Context-based adaptive variable length coding for adaptive block transforms

A technology relating to transform coefficients and image coding, applied to image coding, code conversion, image data processing, etc., which addresses the problem that the existing block partitioning scheme is not an optimal solution.

Active Publication Date: 2008-02-27
NOKIA TECHNOLOGIES OY

AI Technical Summary

Problems solved by technology

[0032] Therefore, the existing block partitioning scheme is not an optimal solution in terms of coding efficiency and quantization accuracy.


Embodiment Construction

[0082] The block partitioning method according to the present invention divides the transform-coefficient ABT block (an 8x8, 4x8, or 8x4 block) into 4x4 blocks, which are coded using the standard 4x4 CAVLC algorithm. The partitioning of the coefficients among the 4x4 blocks is based on the energy of the coefficients, so that the statistical distribution of the coefficients in each 4x4 block is similar. The energy of a coefficient depends on the frequency of its corresponding transform function, which can be indicated, for example, by its position in the zigzag scan of the ABT block. As a result of this partitioning, not all coefficients assigned to a given 4x4 block are spatially adjacent to each other in the ABT block.
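The Python sketch below is a minimal illustration of this interleaved assignment, assuming a simple round-robin distribution of the zigzag-ordered coefficients (as also suggested by the abstract); the function name and the exact assignment rule are illustrative, not taken verbatim from the patent.

# Illustrative sketch: distribute the zigzag-ordered coefficients of an ABT
# block among 4x4-sized groups in an interleaved (round-robin) fashion, so
# that every group receives coefficients spanning the whole frequency range
# and therefore has a similar energy/statistical distribution.

def partition_abt_coefficients(zigzag_coeffs, block_w, block_h):
    """Split the zigzag-ordered coefficients of an ABT block (e.g. 8x8, 4x8
    or 8x4) into groups of 16 coefficients for standard 4x4 CAVLC coding."""
    num_groups = (block_w * block_h) // 16      # 4 groups for 8x8, 2 for 4x8 or 8x4
    groups = [[] for _ in range(num_groups)]
    for position, coeff in enumerate(zigzag_coeffs):
        # The position in the zigzag scan is a proxy for the frequency (and
        # hence the expected energy) of the coefficient.
        groups[position % num_groups].append(coeff)
    return groups

# Example: an 8x4 block yields two groups, each holding every other
# coefficient of the 32-element zigzag-ordered vector.
group_a, group_b = partition_abt_coefficients(list(range(32)), 8, 4)

Note that, exactly as paragraph [0082] states, the coefficients collected into one group are not spatially adjacent in the ABT block.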

[0083] The method proposed by the present invention operates on a block of coefficients produced using an 8x8, 4x8 or 8x4 transform, which has been scanned in a zigzag pattern (or any other pattern) to produce a sorted coefficient vector.
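For concreteness, the following Python sketch shows one common way to generate such a zigzag scan order for a rectangular block and to apply it to a 2-D coefficient array; since the patent allows "any other pattern", this particular ordering is an illustrative assumption rather than the normative scan.

# Illustrative sketch: visit the block along anti-diagonals, alternating the
# traversal direction, which orders the coefficients roughly from low to
# high frequency.

def zigzag_order(width, height):
    """Return the (row, col) positions of a width x height block in zigzag order."""
    positions = []
    for s in range(width + height - 1):         # one anti-diagonal per index sum s = row + col
        diagonal = [(r, s - r) for r in range(height) if 0 <= s - r < width]
        if s % 2 == 0:
            diagonal.reverse()                  # alternate the direction on every diagonal
        positions.extend(diagonal)
    return positions

def scan_block(coeff_block):
    """Flatten a 2-D coefficient block into the ordered (zigzag-scanned) vector."""
    height, width = len(coeff_block), len(coeff_block[0])
    return [coeff_block[r][c] for r, c in zigzag_order(width, height)]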

[00...


Abstract

A method and system for coding an image using context-based adaptive VLC, in which transform coefficients are partitioned into blocks having dimensions of 4n x 4m (with n and m being positive integers). Each block is scanned in a zigzag manner to produce an ordered vector of coefficients having a length of 16·n·m. The ordered vector is sub-sampled in an interleaved manner to produce n·m sub-sampled sequences of transform coefficients prior to encoding the transform coefficients using an entropy encoder.
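Under one possible reading of "sub-sampled in an interleaved manner" (the exact indexing convention is an assumption, not fixed by the excerpt above), the j-th sub-sampled sequence s_j of the ordered vector v can be written as

    s_j[k] = v[k·(n·m) + j],  for j = 0, ..., n·m - 1 and k = 0, ..., 15,

so each sequence contains 16 coefficients taken at a stride of n·m from the ordered vector. For an 8x8 block (n = m = 2) this gives four sequences, the j-th holding the coefficients at zigzag positions j, j + 4, j + 8, ..., j + 60.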

Description

[0001] This application is a divisional application of an application with a filing date of August 19, 2003, an application number of 03823595.1, and the invention title "Context-Based Adaptive Variable Length Coding for Adaptive Block Transformation".

Technical Field

[0002] The present invention relates generally to the field of video coding and compression, and in particular to a method and system for context-based adaptive variable-length coding.

Background Art

[0003] A typical video encoder divides each frame of the original video sequence into contiguous rectangular regions called "blocks". These blocks are coded either in "intra mode" (I mode) or in "inter mode" (P mode). For P mode, the encoder first searches a previously transmitted "reference frame", denoted F_ref, for a block similar to the block being encoded. The search is generally limited to no more than a certain spatial shift from the block to be coded. When a best match, or "prediction", has been ...
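The block-matching search described in this background section can be sketched as follows; this is generic textbook block matching consistent with the description above, not a procedure specific to the present invention, and the parameter names (block_size, search_range) are illustrative assumptions.

# Illustrative sketch: for the block at (bx, by) in the current frame, find
# the displacement (dx, dy) within a limited search range of the reference
# frame F_ref that minimises the sum of absolute differences (SAD).

def find_prediction(cur, ref, bx, by, block_size=8, search_range=16):
    """Return the motion vector (dx, dy) of the best-matching block in ref."""
    best_sad, best_mv = float("inf"), (0, 0)
    for dy in range(-search_range, search_range + 1):
        for dx in range(-search_range, search_range + 1):
            rx, ry = bx + dx, by + dy
            # Skip candidate positions that fall outside the reference frame.
            if rx < 0 or ry < 0 or ry + block_size > len(ref) or rx + block_size > len(ref[0]):
                continue
            sad = sum(
                abs(cur[by + r][bx + c] - ref[ry + r][rx + c])
                for r in range(block_size)
                for c in range(block_size)
            )
            if sad < best_sad:
                best_sad, best_mv = sad, (dx, dy)
    return best_mv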


Application Information

IPC(8): H04N7/26; H04N7/50; G06T9/00; H03M7/40; H04N1/413
CPC: H04N19/00781; H04N7/50; H04N7/2625; H04N7/26244; H04N19/00084; H04N19/00296; H04N19/00139; H04N7/26122; H04N7/26101; H04N19/00121; H04N19/00278; H04N7/26106; H04N19/176; H04N19/13; H04N19/122; H04N19/61; H04N19/136; H04N19/18; H04N19/60
Inventor M. Karczewicz, J. Ridge
Owner NOKIA TECHNOLOGIES OY