Selecting macroblock coding modes for video encoding

A coding mode and macroblock technology, applied in the field of video coding, that solves the problem of the very intensive computation required by rate-distortion optimized coding mode decisions.

Status: Inactive
Publication Date: 2005-12-15
Assignee: MITSUBISHI ELECTRIC RES LAB INC

AI Technical Summary

Benefits of technology

[0026] A rate required to code the quantized difference is determined. A distortion is determined according to the difference and the reconstructed difference. Then, a cost is determined for each candidate coding mode based on the rate and the distortion, and the candidate coding mode that yields a minimum cost is selected as the optimal coding mode for the macroblock.

Problems solved by technology

Therefore, the computation of the rate-distortion optimized coding mode decision is very intensive.




Embodiment Construction

[0030] Our invention provides a method for determining a Lagrange cost, which leads to an efficient, rate-distortion optimized macroblock mode decision.
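For reference, rate-distortion optimized mode decision is conventionally formulated with a Lagrangian cost of the form below. This is the generic textbook expression, and the quoted multiplier is the value commonly used in H.264 reference software; it is not necessarily the exact formulation of this patent.

```latex
% Generic Lagrangian mode-decision cost (illustrative form, not
% necessarily the exact expression used in this patent):
\[
  J(m) \;=\; D(m) + \lambda\, R(m), \qquad
  m^{*} \;=\; \arg\min_{m \in \mathcal{M}} J(m)
\]
% D(m): distortion of candidate mode m;  R(m): rate of candidate mode m;
% \mathcal{M}: set of candidate coding modes.  A common choice in H.264
% reference software is \lambda_{\mathrm{mode}} = 0.85 \cdot 2^{(QP-12)/3}.
```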

[0031] Method and System Overview

[0032] FIG. 3 shows the method and system 300, according to the invention, for selecting an optimal coding mode from multiple available candidate coding modes for each macroblock in a video. The selection is based on a Lagrange cost for a coding mode of a macroblock partition.

[0033] Both the input macroblock partition 101 and a macroblock partition prediction 322, produced by a prediction step 312, are subjected to HT-transforms 311 and 313, respectively. Each transform produces respective input 301 and predicted 302 HT-coefficients. Then, a difference 303 between the input HT-coefficients 301 and the predicted HT-coefficients 302 is determined 314. The difference 303 is quantized 315 to produce a quantized difference 304, from which a coding rate R 306 is determined 317.
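A minimal sketch of this transform-domain residual path is shown below, assuming a 4×4 Hadamard-like kernel for the HT-transform, a uniform scalar quantizer, and a crude bit-count rate estimate. All function names and parameters here (ht4x4, quantize, estimate_rate, q_step) are illustrative stand-ins, not the patent's implementation.

```python
import numpy as np

# Illustrative 4x4 Hadamard matrix standing in for the HT-transform kernel
# (the actual kernel used by the encoder is not specified in this excerpt).
H4 = np.array([[1,  1,  1,  1],
               [1,  1, -1, -1],
               [1, -1, -1,  1],
               [1, -1,  1, -1]], dtype=np.int64)

def ht4x4(block):
    """2-D separable transform of a 4x4 block: H * X * H^T (assumption)."""
    return H4 @ block @ H4.T

def quantize(coeffs, q_step):
    """Uniform scalar quantization of transform coefficients (illustrative)."""
    return np.round(coeffs / q_step).astype(np.int64)

def estimate_rate(q_coeffs):
    """Very rough rate estimate in bits; a real encoder would use its
    entropy coder (e.g. CAVLC or CABAC) or a table-based model instead."""
    mags = np.abs(q_coeffs)
    return int(np.sum(np.where(mags > 0, np.floor(np.log2(mags + 1)) + 2, 1)))

def transform_domain_residual(input_block, predicted_block, q_step):
    """Difference, quantization, and rate, computed entirely on HT-coefficients."""
    c_in = ht4x4(input_block)        # input HT-coefficients
    c_pred = ht4x4(predicted_block)  # predicted HT-coefficients
    diff = c_in - c_pred             # transform-domain difference
    q_diff = quantize(diff, q_step)  # quantized difference
    rate = estimate_rate(q_diff)     # coding rate R for this partition
    return diff, q_diff, rate
```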

[0034] The quantized difference HT-coefficients are also subjected to an inverse quantization to produce a reconstructed difference, and a distortion D is determined from the difference and the reconstructed difference. A cost for the candidate coding mode is then determined from the rate and the distortion.
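Continuing the sketch above, the reconstructed difference and a transform-domain distortion could be computed as follows. The squared-error measure and the direct use of HT-coefficients for distortion are illustrative assumptions, not necessarily the exact metric of the patent; this block reuses transform_domain_residual and the quantizer step from the previous sketch.

```python
import numpy as np  # reuses the helpers defined in the previous sketch

def inverse_quantize(q_coeffs, q_step):
    """Reconstruct transform coefficients from their quantized values."""
    return q_coeffs * q_step

def transform_domain_distortion(diff, reconstructed_diff):
    """Sum of squared errors between the original and the reconstructed
    transform-domain difference (illustrative distortion measure)."""
    err = diff - reconstructed_diff
    return float(np.sum(err * err))

def evaluate_partition(input_block, predicted_block, q_step):
    """Rate R and distortion D for one partition, without leaving the transform domain."""
    diff, q_diff, rate = transform_domain_residual(input_block, predicted_block, q_step)
    recon_diff = inverse_quantize(q_diff, q_step)
    distortion = transform_domain_distortion(diff, recon_diff)
    return rate, distortion
```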



Abstract

A method selects an optimal coding mode for each macroblock in a video. Each macroblock can be coded according to a number of candidate coding modes. A difference between an input macroblock and a predicted macroblock is determined in the transform domain. The difference is quantized to yield a quantized difference. An inverse quantization is performed on the quantized difference to yield a reconstructed difference. A rate required to code the quantized difference is determined. A distortion is determined according to the difference and the reconstructed difference. Then, a cost is determined for each candidate mode based on the rate and the distortion, and the candidate coding mode that yields a minimum cost is selected as the optimal coding mode for the macroblock.
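Putting the steps of the abstract together, a hedged sketch of the per-macroblock mode decision loop might look like the following. Here predict and candidate_modes are placeholders for the encoder's actual prediction paths, and evaluate_partition is the illustrative helper sketched in the description above.

```python
def select_coding_mode(input_mb, candidate_modes, predict, q_step, lam):
    """Choose the candidate mode with the minimum Lagrangian cost D + lam * R.

    Assumptions: predict(mode) returns the predicted macroblock for a mode,
    and evaluate_partition() returns (rate, distortion) in the transform domain.
    """
    best_mode, best_cost = None, float("inf")
    for mode in candidate_modes:
        predicted_mb = predict(mode)
        rate, distortion = evaluate_partition(input_mb, predicted_mb, q_step)
        cost = distortion + lam * rate   # Lagrangian cost for this candidate
        if cost < best_cost:
            best_mode, best_cost = mode, cost
    return best_mode, best_cost
```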

Description

RELATED APPLICATION

[0001] This application is related to U.S. patent application Ser. No. ______, “Transcoding Videos Based on Different Transformation Kernels,” co-filed herewith by Xin et al. on Jun. 1, 2004, and incorporated herein by reference.

FIELD OF THE INVENTION

[0002] The invention relates generally to video coding, and more particularly to selecting macroblock coding modes for video encoding.

BACKGROUND OF THE INVENTION

[0003] International video coding standards, including MPEG-1, MPEG-2, MPEG-4, H.261, H.263 and H.264/AVC, are all based on a basic hybrid coding framework that uses motion-compensated prediction to remove temporal correlations and transforms to remove spatial correlations.

[0004] MPEG-2 is a video coding standard developed by the Moving Picture Experts Group (MPEG) of ISO/IEC. It is currently the most widely used video coding standard. Its applications include digital television broadcasting, direct satellite broadcasting, DVD, video surveillance, etc. The ...


Application Information

IPC(8): H04N7/32, G06K9/36, G06K9/46, H04N7/26
CPC: H04N19/176, H04N19/147, H04N19/40, H04N19/19, H04N19/122
Inventors: XIN, JUN; VETRO, ANTHONY; SUN, HUIFANG
Owner: MITSUBISHI ELECTRIC RES LAB INC