
New viewpoint synthesizing method based on depth images

A depth-image synthesis technology in the field of depth-map-based rendering, which addresses the problems of degraded image quality, poor experimental results, and inability to satisfy the viewing requirements of the human eye.

Active Publication Date: 2017-05-31
ZHEJIANG UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0003] In order to overcome the problems affecting image quality, the poor experimental results, and the inability to satisfy the viewing requirements of the human eye in the existing new-viewpoint image synthesis process, the present invention proposes a new viewpoint synthesis method based on depth images that eliminates the factors degrading image quality, achieves good experimental results, and effectively satisfies the viewing requirements of the human eye.




Embodiment Construction

[0053] The present invention will be further described below.

[0054] A new viewpoint synthesis method based on depth images, the viewpoint synthesis method comprising the following steps:

[0055] (1) Perform three-dimensional transformation on the texture map and depth map at the left and right reference viewpoints. The transformation process is as follows:

[0056] 1.1) Project each pixel in the image storage coordinate system, combined with the corresponding depth information, into the world coordinate system:

[0057] Z_Wi = 1 / ((d_i / 255) * (1 / MinZ_i - 1 / MaxZ_i) + 1 / MaxZ_i)

[0058] where P_Wi = {(X_Wi, Y_Wi, Z_Wi)^T | i = l, r} represents the three-dimensional coordinates, in the world coordinate system, of the pixel at the current position of the reference viewpoint image, l and r denote left and right respectively, X_Wi and Y_Wi are the horizontal and vertical coordinates in the world coordinate system, Z_Wi is the depth, d_i is the gray value of the depth map at the current position, and MinZ_i and MaxZ ...
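As an illustration of this back-projection step, the sketch below recovers metric depth from the depth-map gray value d_i via the standard inverse-depth quantization between MinZ_i and MaxZ_i, then back-projects a pixel to 3-D. The intrinsic matrix K, and the assumption that the world frame coincides with the camera frame, are illustrative and not stated in this excerpt:

```python
import numpy as np

def gray_to_depth(d, min_z, max_z):
    """Recover metric depth Z from an 8-bit depth-map gray value d.

    Standard inverse-depth quantization: gray 255 maps to the nearest
    plane (min_z), gray 0 to the farthest plane (max_z).
    """
    return 1.0 / ((d / 255.0) * (1.0 / min_z - 1.0 / max_z) + 1.0 / max_z)

def backproject(u, v, d, K, min_z, max_z):
    """Back-project pixel (u, v) with depth gray value d into 3-D.

    K is a (hypothetical) camera intrinsic matrix; identity extrinsics
    are assumed, so camera coordinates double as world coordinates.
    """
    z = gray_to_depth(d, min_z, max_z)
    p = np.array([u, v, 1.0])           # homogeneous pixel coordinate
    return z * np.linalg.inv(K) @ p     # (X_W, Y_W, Z_W)

# Example intrinsics: focal length 500 px, principal point (320, 240).
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
P = backproject(320, 240, 255, K, min_z=1.0, max_z=10.0)
print(P)  # the principal-point pixel at gray 255 lands at (0, 0, 1)
```

Each pixel of the left and right reference views is lifted to world space this way before being re-projected into the new viewpoint.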



Abstract

The invention relates to a new viewpoint synthesis method based on depth images. The method comprises the following steps: performing three-dimensional transformation on the texture maps and depth maps at the left and right reference viewpoints; locating object edges in the depth maps of the two reference viewpoints, transforming the edge pixels to the new viewpoint, and erasing the corresponding depth pixels there; applying median filtering to the resulting depth maps, comparing the filtered images with the transformed depth maps, and marking the pixels that changed; back-projecting the marked pixels to the original reference viewpoints and assigning the pixel values from the original reference texture maps to the pixels at the same coordinates in the new viewpoint image; interpolating the occluded regions of the resulting new viewpoint image; and filling the remaining holes to obtain the final new viewpoint image. The method effectively eliminates holes and ghosting in the new viewpoint image, achieves good experimental results, and produces a new viewpoint image that satisfies the viewing requirements of the human eye.
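The median-filter-and-mark step of the pipeline above can be sketched as follows. The 3x3 window size and the replicated borders are assumptions, since this excerpt does not specify them; the marked pixels are the ones the method then repairs by back-projection:

```python
import numpy as np

def median3x3(depth):
    """3x3 median filter on a 2-D depth map (borders replicated)."""
    padded = np.pad(depth, 1, mode="edge")
    # Stack the nine shifted views of the padded map, then take the
    # per-pixel median across them.
    stack = [padded[r:r + depth.shape[0], c:c + depth.shape[1]]
             for r in range(3) for c in range(3)]
    return np.median(np.stack(stack), axis=0)

def mark_changed(depth):
    """Return the filtered depth map and a mask of the pixels the
    median filter changed: candidates for back-projection repair."""
    filtered = median3x3(depth)
    mask = filtered != depth
    return filtered, mask

# A flat depth map with one isolated crack pixel, as left by warping.
depth = np.full((5, 5), 100.0)
depth[2, 2] = 0.0
filtered, mask = mark_changed(depth)
print(filtered[2, 2])  # 100.0 -- the crack is closed by the median
print(mask.sum())      # 1 -- only the crack pixel is marked
```

Comparing the filtered map against the warped map isolates exactly the crack pixels, so the repair touches nothing else.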

Description

technical field

[0001] The invention relates to the fields of image processing, numerical analysis, three-dimensional reconstruction, computer science and the like, in particular to a depth-map-based rendering method for synthesizing new-viewpoint virtual images from a binocular camera.

Background technique

[0002] New-viewpoint virtual image synthesis for a binocular camera is a technology that reconstructs the image at a new viewpoint from existing viewpoint images and the camera's internal and external calibration parameters. The main approach combines the camera calibration parameters with the depth information of the existing viewpoint images to project and reproject the corresponding texture maps, constructing an image at the new viewpoint. Many problems arise in this process, such as cracks, holes, ghosting, occluded regions, and incomplete objects in the newly formed image. The existence of these problems affects ...
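A minimal one-scanline sketch of why forward warping produces the cracks and holes mentioned above: each source pixel is shifted by its rounded disparity, so some target pixels receive no source pixel at all. The disparity values and the -1 sentinel are illustrative, not from the patent:

```python
import numpy as np

def forward_warp_row(texture, disparity):
    """Warp one scanline of texture to a virtual view by shifting each
    pixel by its integer-rounded disparity. Target pixels that no
    source pixel maps to keep the sentinel value -1: these are the
    cracks and holes a synthesis pipeline must repair."""
    out = np.full_like(texture, -1)
    for u, (color, d) in enumerate(zip(texture, disparity)):
        target = u + int(round(d))
        if 0 <= target < out.shape[0]:
            out[target] = color
    return out

texture = np.array([10, 20, 30, 40, 50])
disparity = np.array([0.0, 0.0, 1.0, 1.0, 1.0])  # depth edge at pixel 2
warped = forward_warp_row(texture, disparity)
print(warped)  # [10 20 -1 30 40]: a crack opens at the depth edge
```

The disparity jump at the depth edge leaves target pixel 2 with no contributor, which is exactly the kind of artifact the invention's filtering and back-projection steps are designed to remove.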

Claims


Application Information

IPC(8): H04N13/00
CPC: H04N13/111
Inventor 冯远静黄良鹏李佳镜陈丰徐泽楠叶家盛陈稳舟李定邦汪泽南
Owner ZHEJIANG UNIV OF TECH