Block matching parallax estimation-based middle view synthesizing method

A parallax estimation and intermediate view synthesis technology, applied in the field of image-based virtual viewpoint rendering, which can solve problems such as algorithmic difficulty, large data volume, and inapplicability to real scenes

Status: Inactive | Publication Date: 2011-11-23
BEIHANG UNIV

AI Technical Summary

Problems solved by technology

[0006] Among the many image-based rendering methods, those that use no geometric information at all can reconstruct a viewpoint quickly, but they offer a poor sense of realism and require a large amount of data; the virtual viewpoint views obtained by depth-map-based methods are better, but the dept...

Method used




Embodiment Construction

[0047] The present invention is a virtual viewpoint rendering method based on adaptive parallax estimation, belonging to the category of virtual viewpoint rendering in the field of multi-view digital image processing. Specifically, parallax estimation with an adaptively selected window is performed on two images captured synchronously by a horizontal camera group to obtain a bidirectional disparity map. Then, using the obtained disparity information and the grayscale information of the input views, and following the inverse-mapping principle, each coordinate position in the view to be synthesized is processed: its corresponding points are located in the left and right views respectively, and the virtual viewpoint view is finally obtained by brightness-weighted blending of their grayscale values, with the weights favoring the input view closer to the viewpoint to be synthesized.
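As a concrete illustration of this two-stage pipeline, the following Python sketch shows one plausible form it could take: SAD block matching restricted to a horizontal search to produce a disparity map, followed by inverse mapping with brightness-weighted blending. The function names (block_match_disparity, synthesize_middle_view), the SAD cost, the block size, and the disparity sign conventions are illustrative assumptions, not the patent's exact specification.

```python
import numpy as np

def block_match_disparity(target, reference, max_disp=64, block=8):
    """Per-block horizontal disparity of `target` against `reference`
    (a rectified grayscale pair), using a SAD matching cost and searching
    along the scanline only, as described in the text above."""
    h, w = target.shape
    disp = np.zeros((h, w), dtype=np.float32)
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            patch = target[y:y + block, x:x + block].astype(np.int32)
            best_d, best_cost = 0, np.inf
            for d in range(max_disp + 1):          # horizontal-only search
                xr = x - d
                if xr < 0:
                    break
                cand = reference[y:y + block, xr:xr + block].astype(np.int32)
                cost = np.abs(patch - cand).sum()  # SAD cost
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y:y + block, x:x + block] = best_d
    return disp

def synthesize_middle_view(left, right, disp_l, disp_r, alpha=0.5):
    """Render a virtual view at relative position `alpha` (0 = left camera,
    1 = right camera) by inverse mapping: for each target pixel, fetch the
    corresponding points in the left and right views and blend them with
    weights favoring the nearer input viewpoint."""
    h, w = left.shape
    xs = np.arange(w)
    out = np.zeros((h, w), dtype=np.float32)
    for y in range(h):
        # Assumed sign convention: a left-view point at x appears at x - d in
        # the right view; the input-view disparity maps are sampled at the
        # target coordinates as an approximation.
        xl = np.clip(np.round(xs + alpha * disp_l[y]).astype(int), 0, w - 1)
        xr = np.clip(np.round(xs - (1.0 - alpha) * disp_r[y]).astype(int), 0, w - 1)
        out[y] = (1.0 - alpha) * left[y, xl] + alpha * right[y, xr]
    return np.clip(out, 0, 255).astype(np.uint8)
```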

[0048] All "views" and "images" in this example refer to digital bitmaps. The abscissa is from left to right, and the ordinate is from top ...



Abstract

The invention provides a self-adapting parallax estimation-based virtual viewpoint rendering method, which performs parallax estimation on two images shot simultaneously by a horizontal camera set and, through interpolation and inverse mapping based on the solved disparity maps, obtains a virtual view at an arbitrary position between the input viewpoints. The method comprises the following steps: first, taking the left image and the right image in turn as the target image, performing self-adapting parallax estimation with a search window whose size is adjusted according to the texture similarity of the two views, so as to solve the two disparity maps that take the left and right images as targets; then, rendering the virtual view at the specified viewpoint according to the solved parallax to obtain the final result. In the parallax estimation process of the block matching parallax estimation-based middle view synthesizing method, only horizontal searching is carried out, which greatly reduces the amount of calculation; a method for self-adapting the search window is also provided, which enhances the accuracy of the parallax estimation. The block matching parallax estimation-based middle view synthesizing method provided by the invention can obtain an excellent synthesis result when the camera set is horizontally arranged and the foreground of the scene is relatively far from the cameras.
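The window-adaptation step summarized above, growing or shrinking the matching window according to how similar the local texture of the two views is, could be sketched as below. The variance-based texture measure, the candidate sizes, and the similarity threshold are assumptions made for illustration; the abstract does not fix these details.

```python
import numpy as np

def texture_measure(patch):
    """Simple texture indicator: variance of pixel intensities."""
    return float(np.var(patch.astype(np.float32)))

def choose_window(left, right, y, x, sizes=(4, 8, 16), sim_thresh=0.5, min_var=1.0):
    """Adaptively pick a matching-window size at (y, x).
    Heuristic: stop at the smallest window whose co-located patches in the
    two views have similar, non-trivial texture; otherwise keep growing so
    that weakly textured or ambiguous regions aggregate more support."""
    h, w = left.shape
    chosen = sizes[-1]                    # fall back to the largest window
    for size in sizes:
        if y + size > h or x + size > w:
            break
        t_left = texture_measure(left[y:y + size, x:x + size])
        t_right = texture_measure(right[y:y + size, x:x + size])
        denom = max(t_left, t_right, 1e-6)
        if abs(t_left - t_right) / denom < sim_thresh and t_left > min_var:
            chosen = size                 # textures agree and are informative
            break
    return chosen
```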

Description

technical field

[0001] The invention relates to a multi-view digital image processing method, in particular to an image-based virtual viewpoint rendering method.

Background technique

[0002] Video processing, transmission and display technologies have undergone several transitions: from black-and-white to color, from analog to digital, and from standard definition to high definition; the goal of the next generation of change is a 3D stereoscopic video display system. The two-dimensional flat video display systems in widespread use today cannot express the depth information of objects in natural scenes, so viewers lack a sense of three-dimensionality; what they watch is inconsistent with the natural scene and is distorted by the loss of third-dimensional information. Therefore, obtaining a free-viewpoint display structure that provides the viewpoint views required by the user becomes an important basis for the transformation from 2D t...

Claims


Application Information

IPC(8): G06T17/00; G06T7/00; H04N13/00
Inventor 祝世平, 于洋
Owner BEIHANG UNIV