
Method for calculating motion vector, dynamic image coding, decoding method and device

A motion vector calculation technology, applied in the field of predictive coding, which solves the problems of being unable to use a referenced motion vector directly and of inconsistency in motion vector accuracy.

Inactive Publication Date: 2006-12-13
PANASONIC CORP

AI Technical Summary

Problems solved by technology

[0015] However, temporal prediction by the direct mode has the following problem: when motion compensation is performed by the direct mode on a block that has been inter-picture predictively coded, if the block whose motion vector is referred to belongs to a B picture such as B6 in FIG. 1, that block has a plurality of motion vectors, so the motion vector calculated by scaling according to Formula 1 cannot be used directly.
In addition, since a division operation is performed in calculating the motion vector, the accuracy of the resulting motion vector value (for example, 1/2-pixel or 1/4-pixel accuracy) may not match the predetermined accuracy.
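A minimal sketch of what such temporal scaling looks like, assuming a generic distance-ratio formula (the patent's Formula 1 is not reproduced in this excerpt, so the function below is an illustration, not the patented formula). The final rounding step shows exactly where the division can push the result off the predetermined 1/2- or 1/4-pel grid:

```python
from fractions import Fraction

def scale_motion_vector(mv, t_ref, t_col, t_cur):
    """Scale a co-located block's motion vector by the ratio of temporal
    distances (a generic temporal-direct scaling sketch, assumed to match
    the spirit of the patent's Formula 1; not the patented formula itself).

    mv    : (x, y) motion vector of the co-located block, in sub-pel units
    t_ref : display time of the picture the co-located MV points to
    t_col : display time of the picture containing the co-located block
    t_cur : display time of the current picture
    """
    # Ratio of temporal distances: (current - reference) / (co-located - reference)
    ratio = Fraction(t_cur - t_ref, t_col - t_ref)
    # The division embodied in `ratio` is why the scaled vector can fall off
    # the predetermined 1/2- or 1/4-pel grid; rounding forces it back on.
    return tuple(int(round(c * ratio)) for c in mv)
```

For example, a co-located vector (8, 4) spanning four time units, scaled to a picture two units away, halves to (4, 2); with odd components, the rounding step decides which grid point is kept.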


Examples


Embodiment 1

[0092] Using the block diagram shown in FIG. 6, the video coding method according to Embodiment 1 of the present invention will be described.

[0093] The moving images to be encoded are input to the frame memory 101 in units of pictures in display order, and are rearranged into encoding order. Each picture is divided into groups called blocks, for example 16 horizontal × 16 vertical pixels, and the following processing is performed in units of blocks.
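The block partitioning of paragraph [0093] can be sketched as a raster-order tiling (the clipping of edge tiles is an assumption, since the excerpt does not say how picture sizes that are not multiples of 16 are handled):

```python
def split_into_macroblocks(width, height, block=16):
    """Yield (x, y, w, h) tiles covering a width x height picture in raster
    order, 16x16 by default as in paragraph [0093]. Edge tiles are clipped
    when the picture size is not a multiple of the block size (an assumed
    policy for illustration)."""
    for y in range(0, height, block):
        for x in range(0, width, block):
            yield (x, y, min(block, width - x), min(block, height - y))
```

A 64×48 picture, for instance, yields a 4×3 grid of twelve full 16×16 blocks.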

[0094] The blocks read from the frame memory 101 are input to the motion vector detection unit 106 . Here, a decoded image stored in the frame memory 105 is used as a reference image to detect a motion vector of the block to be encoded. At this time, the mode selection unit 107 determines the optimal prediction method by referring to the motion vector obtained by the motion vector detection unit 106 and to the motion vectors of already-encoded pictures stored in the motion vector storage unit 108 . The prediction metho...
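The excerpt does not specify the search algorithm inside motion vector detection unit 106; the sketch below uses the common exhaustive block-matching search minimizing the sum of absolute differences (SAD) over a small window, purely as an illustration of what "detecting a motion vector against a reference image" involves:

```python
def detect_motion_vector(cur, ref, bx, by, bsize=4, search=2):
    """Exhaustive block-matching motion search minimizing SAD.
    The patent does not state the algorithm used by unit 106; full search
    over a small window is shown only as an illustrative stand-in.

    cur, ref : 2-D lists of samples (current and reference pictures)
    bx, by   : top-left corner of the block in `cur`
    """
    h, w = len(ref), len(ref[0])
    best, best_sad = (0, 0), float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            # Skip candidates that would read outside the reference picture.
            if not (0 <= bx + dx and bx + dx + bsize <= w and
                    0 <= by + dy and by + dy + bsize <= h):
                continue
            sad = sum(abs(cur[by + j][bx + i] - ref[by + dy + j][bx + dx + i])
                      for j in range(bsize) for i in range(bsize))
            if sad < best_sad:
                best_sad, best = sad, (dx, dy)
    return best, best_sad
```

A mode selector such as unit 107 would then compare the cost of this detected vector against other prediction modes before committing.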

Embodiment 2

[0117] The outline of the encoding process shown in FIG. 6 is entirely the same as in Embodiment 1. Here, the bidirectional prediction operation of the direct mode is explained in detail with reference to FIG. 9.

[0118] FIG. 9 shows the operation when the block referred to for determining a motion vector in the direct mode has two motion vectors that refer to two pictures located behind in display order. The picture P43 is the current picture to be encoded, and bidirectional prediction is performed using the picture P42 and the picture P44 as reference pictures. If the block to be coded is the block MB41, the two required motion vectors are determined using the motion vector of the block MB42, which lies at the same position in the already-coded rear reference picture (the picture P44 specified by the second reference index). Since this block MB42 has two motion vectors MV45 and ...
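For context, the classic single-vector temporal-direct derivation places both of the current block's vectors on the line defined by the co-located block's vector; the embodiment's handling of a co-located block with two rear-referencing vectors builds on the same idea. The sketch below shows the classic derivation only (assumed background, not quoted from the patent; the picture-time parameters are named after FIG. 9 for orientation):

```python
from fractions import Fraction

def direct_mode_vectors(mv_col, t_cur, t_fwd_ref, t_col):
    """Derive forward and backward motion vectors for a direct-mode block
    from ONE motion vector of the co-located block (classic temporal-direct
    derivation, shown as assumed background).

    mv_col    : (x, y) vector of the co-located block, pointing from the
                picture at t_col to the picture at t_fwd_ref
    t_cur     : display time of the current picture (e.g. P43)
    t_fwd_ref : display time of the forward reference (e.g. P42)
    t_col     : display time of the picture holding the co-located block (e.g. P44)
    """
    td = t_col - t_fwd_ref   # temporal span of the co-located vector
    tb = t_cur - t_fwd_ref   # current picture's distance to the forward reference
    mv_fwd = tuple(int(round(c * Fraction(tb, td))) for c in mv_col)
    mv_bwd = tuple(int(round(c * Fraction(tb - td, td))) for c in mv_col)
    return mv_fwd, mv_bwd
```

Note that the forward and backward vectors differ by exactly the co-located vector scaled to the full span, which is what keeps both on one motion trajectory.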

Embodiment 3

[0139] The video decoding method according to Embodiment 3 of the present invention will be described using the block diagram shown in FIG. 11. Here, the coded sequence generated by the video coding method of Embodiment 1 is input.

[0140] First, various information such as a prediction method, motion vector information, and prediction residual error coded data are extracted from the input coded sequence by the coded sequence analyzer 601 .

[0141] The prediction method and motion vector information are output to the prediction method / motion vector decoding unit 608 , and the prediction residual error encoded data is output to the prediction residual error decoding unit 602 . The prediction method / motion vector decoding unit 608 decodes the prediction method and decodes the motion vector used in the prediction method. When decoding the motion vector, the decoded motion vector stored in the motion vector storage unit 605 is used. The decoded prediction method and...
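The routing described in paragraphs [0140] and [0141] can be summarized in a small sketch; the field names and the differential motion-vector coding are assumptions for illustration (the excerpt names the units but not the bitstream syntax):

```python
from dataclasses import dataclass

@dataclass
class ParsedBlock:
    """What the coded-sequence analyzer (unit 601) extracts per block,
    per paragraphs [0140]-[0141]. Field names are illustrative."""
    prediction_method: str    # routed to prediction/MV decoding unit 608
    motion_vector_diff: tuple # decoded against stored MVs (unit 605)
    residual_data: bytes      # routed to residual decoding unit 602

def decode_motion_vector(mv_diff, predictor):
    """Motion vectors are commonly coded as differences from a predictor
    taken from already-decoded vectors (here, motion vector storage unit
    605); reconstruction is then a component-wise addition. The use of
    differential coding is an assumption, not stated in this excerpt."""
    return tuple(d + p for d, p in zip(mv_diff, predictor))
```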



Abstract

When a current macroblock pair (hereinafter MBP) to be coded is coded in a frame structure referring to a frame preceding in display order, the mode selection unit (1109) treats a neighbouring MBP coded in a field structure as follows. When both of its two macroblocks (hereinafter MBs) refer to the frame with the smallest first reference index, the motion vector of the MB to be coded is calculated from the average of the motion vector of the MB forming the top field of the MBP and the motion vector of the MB forming the bottom field. When there is an MB referring to a frame whose first reference index is not the smallest, the motion vector of the current MB to be coded is calculated by substituting "0" for that MB's motion vector.
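The abstract's two cases can be sketched directly; component-wise integer averaging and the exact representation of "substituting 0" are assumptions for illustration:

```python
def neighbour_mbp_motion_vector(top_mv, top_ref_idx, bot_mv, bot_ref_idx,
                                smallest_ref_idx=0):
    """Motion vector contributed by a field-coded neighbouring macroblock
    pair when the current MBP is frame-coded, per the abstract:
    - both field MBs refer to the frame with the smallest first reference
      index -> average of the top-field and bottom-field motion vectors;
    - otherwise -> substitute 0.
    (Component-wise averaging is an assumption for illustration.)"""
    if top_ref_idx == smallest_ref_idx and bot_ref_idx == smallest_ref_idx:
        # Average the two field motion vectors component-wise.
        return tuple((a + b) // 2 for a, b in zip(top_mv, bot_mv))
    # A field MB refers to a frame whose first reference index is not the
    # smallest: substitute "0" for this neighbour's contribution.
    return (0, 0)
```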

Description

Technical Field

[0001] The present invention relates to a coding method and a decoding method for dynamic images, and in particular to a method of performing predictive coding by referring to a plurality of already-coded pictures that precede in display order, follow in display order, or lie on both sides in display order.

Background Technique

[0002] Generally, in the encoding of dynamic images, the amount of information is compressed by reducing redundancy in the temporal and spatial directions. In inter-picture predictive coding, which aims at reducing temporal redundancy, motion detection and motion compensation are performed in units of blocks with reference to preceding or following pictures, and the difference between the obtained predicted picture and the current pictur...
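The difference mentioned in [0002] is the prediction residual, computed per block; a minimal component-wise sketch:

```python
def prediction_residual(cur_block, pred_block):
    """Temporal redundancy is removed by coding only the difference between
    the current block and its motion-compensated prediction ([0002]).
    Shown as a plain component-wise subtraction over 2-D sample arrays."""
    return [[c - p for c, p in zip(cur_row, pred_row)]
            for cur_row, pred_row in zip(cur_block, pred_block)]
```

A well-predicted block yields a small, easily compressible residual; the decoder adds the residual back to the same prediction to reconstruct the block.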

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): H04N7/32, H03M7/36, H04N19/51, G06T9/00, H04N7/12, H04N7/26, H04N7/36, H04N7/46, H04N7/50, H04N19/103, H04N19/105, H04N19/109, H04N19/127, H04N19/137, H04N19/176, H04N19/503, H04N19/61, H04N19/70
CPC: H04N19/00781, H04N19/00278, H04N19/00721, H04N19/00018, H04N19/00024, H04N19/00224, H04N19/00715, H04N19/00884, H04N19/00484, H04N19/00545, H04N19/00103, H04N19/00587, H04N19/00139, H04N19/0003, H04N19/00145, H04N19/00727, H04N19/00266, H04N19/105, H04N19/176, H04N19/70, H04N19/172, H04N19/46, H04N19/51, H04N19/61, H04N19/103, H04N19/107, H04N19/127, H04N19/136, H04N19/137, H04N19/16, H04N19/423, H04N19/583, H04N19/573, H04N19/58, H04N19/577, H04N19/52, H04N19/513, H04N19/109, H04N19/159, H04N19/184, H04N19/30
Inventor: 近藤敏志, 野真也, 羽饲诚, 安倍清史
Owner PANASONIC CORP