
Joint estimation method for movement and parallax error in multi-view video coding

A multi-view video coding technology, applied in the field of joint motion and disparity estimation algorithms. It addresses the problems that existing fast algorithms fail to jointly reduce the complexity of motion estimation and disparity estimation, waste most of the coding time on these two steps, and do not make full use of the characteristics of multi-view video, and it achieves the effect of reliable initial prediction values with guaranteed accuracy.

Inactive Publication Date: 2009-12-09
BEIJING UNIV OF TECH

AI Technical Summary

Problems solved by technology

Ding et al. [5] exploited the high similarity between corresponding blocks in adjacent views: disparity estimation is used to locate the corresponding block in the reference frame of an adjacent, already-coded view, so that its coded information, such as the coding mode and motion vector, can be reused. However, this method accelerates only motion estimation; the disparity estimation itself still uses a full search.
[0005] To sum up, current fast algorithms for motion estimation and disparity estimation are largely independent of each other; they neither make full use of the characteristics of multi-view video nor exploit the relationship between adjacent viewpoints to design a joint motion and disparity estimation algorithm that reduces the complexity of both at the same time. As a result, most of the encoding time is still spent on motion estimation and disparity estimation.
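
For illustration only, the sketch below shows the prior-art idea described above: a full-search disparity estimation locates the corresponding block in an already-coded adjacent view so that its stored motion vector can be reused as a starting point. The block size, search range, SAD cost and frame contents are assumptions made for this example, not details taken from the patent.

```python
# Illustrative sketch (not the patent's algorithm): locate the corresponding block
# in an already-coded adjacent view via full-search disparity estimation, then
# reuse that block's stored motion vector as an initial guess.
import numpy as np

BLOCK = 8          # assumed macroblock size
SEARCH_RANGE = 8   # assumed search window (+/- pixels)

def sad(a, b):
    """Sum of absolute differences between two equally sized blocks."""
    return np.abs(a.astype(np.int32) - b.astype(np.int32)).sum()

def full_search(cur, ref, bx, by, search_range=SEARCH_RANGE):
    """Exhaustive block matching; returns (best_vector, best_cost)."""
    h, w = ref.shape
    block = cur[by:by + BLOCK, bx:bx + BLOCK]
    best = ((0, 0), float("inf"))
    for dy in range(-search_range, search_range + 1):
        for dx in range(-search_range, search_range + 1):
            x, y = bx + dx, by + dy
            if 0 <= x <= w - BLOCK and 0 <= y <= h - BLOCK:
                cost = sad(block, ref[y:y + BLOCK, x:x + BLOCK])
                if cost < best[1]:
                    best = ((dx, dy), cost)
    return best

rng = np.random.default_rng(0)
cur_view_t   = rng.integers(0, 256, (48, 64), dtype=np.uint8)  # current view, time t
adj_view_t   = rng.integers(0, 256, (48, 64), dtype=np.uint8)  # coded adjacent view, time t
adj_view_mvs = {}                                              # its stored motion vectors

bx, by = 16, 16
# 1) Disparity estimation (full search here) finds the corresponding block
#    in the adjacent, already-coded view.
(dvx, dvy), _ = full_search(cur_view_t, adj_view_t, bx, by)
# 2) Reuse the coded information of that corresponding block, e.g. its motion
#    vector, as the starting point for the current block's motion estimation.
reused_mv = adj_view_mvs.get((bx + dvx, by + dvy), (0, 0))
print("disparity vector:", (dvx, dvy), "reused motion vector:", reused_mv)
```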




Embodiment Construction

[0062] Multi-view video refers to k+1 video sequences {S_0, S_1, S_2, …, S_k}, where each sequence contains n+1 frames, from time T_0 to time T_n. Figure 1 shows the coding structure of multi-view video: the vertical direction is the view direction and the horizontal direction is the time direction. The first frame of each sequence is an anchor frame; for example, the B_0 frame of view S_i at time T_0 is an anchor frame. The remaining frames are coded in groups of pictures. Each group of pictures consists of one anchor frame and several non-anchor frames. Let N_GOP denote the number of frames in a group of pictures; N_GOP is an integer power of 2, 12 or 15. Within a group of pictures, the frame at the end of the group is usually the anchor frame; for example, with N_GOP = 12 in the figure, the B_0 frame of view S_1 at time T_12 is an anchor frame. During encoding, the anchor frame is encoded independently first, and then each non-anchor frame is encoded according ...
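
As a rough illustration of the frame structure just described, the following sketch classifies frames into anchor and non-anchor frames for an assumed configuration (3 views, 25 frames per view, N_GOP = 12). The parameters are chosen only for the example and are not taken from the patent.

```python
# Minimal sketch of the described coding structure: k+1 views S_0..S_k, frames
# T_0..T_n grouped into GOPs of N_GOP frames, with the GOP-boundary frames
# (T_0, T_12, T_24, ...) serving as anchor frames that are coded independently.
N_VIEWS = 3    # k + 1 views (assumed)
N_FRAMES = 25  # n + 1 frames per view, T_0 .. T_24 (assumed)
N_GOP = 12     # frames per group of pictures (an integer power of 2, 12 or 15)

def is_anchor(t):
    """T_0 and every GOP-closing frame (T_12, T_24, ...) are anchor frames."""
    return t % N_GOP == 0

print(f"{N_VIEWS} views S_0..S_{N_VIEWS - 1}, {N_FRAMES} frames per view, N_GOP = {N_GOP}")
for t in range(N_FRAMES):
    # The same classification applies to every view S_0 .. S_k at time T_t.
    kind = "anchor (coded independently first)" if is_anchor(t) else "non-anchor (coded within its GOP)"
    print(f"T_{t}: {kind}")
```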



Abstract

A joint estimation method for motion and disparity in multi-view video coding comprises the following steps: 1. taking the disparity vector and motion vector of the corresponding block in the image of the same view at an adjacent moment as the initial value of the disparity vector of the current coding block; 2. comparing this initial value with the prediction vectors of the coded adjacent blocks and selecting the best prediction vector as the search starting point according to the minimum matching error rule; 3. using the geometric relationship between the motion vectors and disparity vectors of adjacent images, deriving a candidate for the next motion/disparity estimation from the previous one and continuously refining the current motion and disparity vectors until the optimal motion vector and disparity vector of the current coding block are obtained. The method determines the optimal motion vector and disparity vector simultaneously in a single search process. Compared with the full-search algorithm, the peak signal-to-noise ratio decreases by no more than 0.09 dB, the bit rate is slightly reduced (changing between -14.20% and 0.60%), and the coding time is reduced by more than 90%.
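
The sketch below is a simplified rendering of the three steps summarized in this abstract. The SAD matching cost, the small fixed-radius refinement search and all frame contents are assumptions made for the example, and the geometric motion/disparity relation is stood in for by a simple alternating local refinement rather than the patent's actual derivation.

```python
# Hedged sketch of the three-step joint motion/disparity estimation; illustrative only.
import numpy as np

BLOCK = 8  # assumed block size

def sad(a, b):
    return np.abs(a.astype(np.int32) - b.astype(np.int32)).sum()

def cost(cur, ref, bx, by, vec):
    """Matching error of the current block displaced by `vec` in `ref`."""
    dx, dy = vec
    x, y = bx + dx, by + dy
    h, w = ref.shape
    if not (0 <= x <= w - BLOCK and 0 <= y <= h - BLOCK):
        return float("inf")
    return sad(cur[by:by+BLOCK, bx:bx+BLOCK], ref[y:y+BLOCK, x:x+BLOCK])

def local_refine(cur, ref, bx, by, start, radius=2):
    """Small local search around `start`; returns the refined vector."""
    best = (start, cost(cur, ref, bx, by, start))
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            v = (start[0] + dx, start[1] + dy)
            c = cost(cur, ref, bx, by, v)
            if c < best[1]:
                best = (v, c)
    return best[0]

def joint_estimate(cur, temporal_ref, view_ref, bx, by,
                   prev_block_dv, prev_block_mv, neighbour_predictors):
    # Step 1: initial values come from the corresponding block of the same view
    # at the adjacent moment (passed in here as prev_block_dv / prev_block_mv).
    dv, mv = prev_block_dv, prev_block_mv
    # Step 2: compare the initial value with predictors from coded adjacent
    # blocks and keep the one with minimum matching error as the search start.
    dv = min([dv] + neighbour_predictors,
             key=lambda v: cost(cur, view_ref, bx, by, v))
    # Step 3: alternately refine motion and disparity, each pass deriving the
    # next candidate from the previous estimate (a stand-in for the geometric
    # relation between motion and disparity vectors of adjacent images).
    for _ in range(3):  # assumed small, fixed number of refinement passes
        mv = local_refine(cur, temporal_ref, bx, by, mv)
        dv = local_refine(cur, view_ref, bx, by, dv)
    return mv, dv

rng = np.random.default_rng(1)
cur  = rng.integers(0, 256, (48, 64), dtype=np.uint8)  # current frame
tref = rng.integers(0, 256, (48, 64), dtype=np.uint8)  # same view, previous moment
vref = rng.integers(0, 256, (48, 64), dtype=np.uint8)  # adjacent view, same moment
mv, dv = joint_estimate(cur, tref, vref, 16, 16,
                        prev_block_dv=(2, 0), prev_block_mv=(0, 1),
                        neighbour_predictors=[(1, 0), (3, -1)])
print("motion vector:", mv, "disparity vector:", dv)
```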

Description

Technical field

[0001] The invention relates to the field of H.264-based multi-view video coding, and in particular to a joint motion and disparity estimation algorithm for multi-view video coding.

Background technology

[0002] Multi-view video contains depth information of the scene and represents natural scenes more realistically. It has broad application prospects in 3D TV, free-viewpoint TV, immersive video conferencing, and virtual reality [1]. With its "three-dimensional" and "interactive" characteristics, multi-view video technology is attracting increasing attention from academia and industry and has become one of the research hotspots of recent years.

[0003] Compared with traditional single-view video, the amount of data to be processed by multi-view video multiplies as the number of cameras increases, which places a heavy burden on transmission and decoding. Therefore, how to efficiently compress and en...


Application Information

IPC(8): H04N7/26; H04N13/00; H04N19/105; H04N19/187; H04N19/577; H04N19/597
Inventors: 贾克斌, 邓智玭, 刘鹏宇
Owner: BEIJING UNIV OF TECH