Virtual viewpoint hole filling method based on depth background modeling

A technology for virtual viewpoint rendering and background modeling, applied in the field of virtual viewpoint hole filling based on depth background modeling, which can solve problems such as unstable foreground contours, poor depth map quality, and unnatural filling results.

Active Publication Date: 2020-01-07
NINGBO UNIV

AI Technical Summary

Problems solved by technology

However, the above methods all construct the background from a probabilistic perspective: they assume by default that whatever appears with higher probability in the temporal domain is the background, which contradicts some scenarios in practical applications.
In addition, because of poor depth map quality, the edge contours of the foreground may be temporally unstable; a background constructed in this way can produce artifacts in those regions, resulting in unnatural filling results.
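To make the criticized assumption concrete, the following is a minimal sketch of such a probability-based (temporal-mode) background model; the function name and array conventions are illustrative and are not taken from the patent or the cited works.

```python
import numpy as np

def temporal_mode_background(frames):
    """Probability-based background model: for each pixel, keep the
    value that occurs most often across the frame sequence.
    frames: array of shape (T, H, W), e.g. a grayscale or depth video."""
    T, H, W = frames.shape
    background = np.empty((H, W), dtype=frames.dtype)
    for y in range(H):
        for x in range(W):
            values, counts = np.unique(frames[:, y, x], return_counts=True)
            background[y, x] = values[np.argmax(counts)]  # most frequent value wins
    return background
```

If a foreground object stays still for most of the sequence, this estimator labels it as background, which is exactly the failure mode described above.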


Embodiment Construction

[0052] The present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments.

[0053] As shown in Figure 1, a virtual viewpoint hole filling method based on depth background modeling includes the following steps:

[0054] Step 1: input the depth video and the texture video of the reference viewpoint;

[0055] Step 2: perform depth background modeling on the depth video of the reference viewpoint to obtain a depth background image;

[0056] Here, the depth background image can be extracted from the depth video of the reference viewpoint using an existing method, for example the background depth image extraction method in the paper "DIBR Hole Filling Method Based on Background Extraction and Partition Restoration" by Chen Yue et al., or using the following approach. In this embodiment, the method for obtaining the depth background image includes the following steps:

[0057] Step 2-1...
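Because the sub-steps of this embodiment are truncated in this record, the following is only a hedged sketch of one plausible depth background extraction consistent with the motivation above, preferring the temporally farthest surface over the most frequent one; it is not the patented procedure, and it assumes the common convention that larger depth values denote nearer objects.

```python
import numpy as np

def depth_background_sketch(depth_frames):
    """Hypothetical depth background extraction: per pixel, keep a low
    temporal percentile of the observed depth values, i.e. roughly the
    farthest surface seen over time (larger value = nearer object).
    depth_frames: array of shape (T, H, W)."""
    # A strict per-pixel minimum would be sensitive to depth-map noise,
    # so a low percentile serves as a more robust stand-in for "farthest".
    return np.percentile(depth_frames, 5, axis=0).astype(depth_frames.dtype)
```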

Abstract

The invention discloses a virtual viewpoint hole filling method based on depth background modeling. The method comprises the following steps: inputting the depth video and the texture video of a reference viewpoint; performing background modeling on the depth video and the texture video of the reference viewpoint, respectively, to obtain a depth background image and a texture background image; mapping the depth video, the depth background image, the texture video, and the texture background image of the reference viewpoint with a three-dimensional mapping method to obtain the depth video, depth background image, texture video, and texture background image of the virtual viewpoint; performing depth hole filling on the depth video and the depth background image of the virtual viewpoint; fusing the texture background image of the virtual viewpoint with the texture video of the virtual viewpoint to obtain a fused image for each frame of the virtual viewpoint texture video; and filling the remaining hole regions in each fused frame to obtain the first texture video of the virtual viewpoint. The method overcomes the temporal inconsistency of depth videos, and the filled virtual viewpoint has good perceptual quality.
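As one concrete illustration of the fusion step in the abstract, here is a minimal sketch, assuming boolean hole masks for the warped texture frame and the warped texture background; the function name and mask conventions are assumptions rather than details from the patent.

```python
import numpy as np

def fuse_with_background(warped_frame, frame_hole_mask,
                         warped_background, bg_hole_mask):
    """Sketch of texture fusion: copy background texture into the
    frame's hole pixels wherever the warped background is valid.
    Returns the fused frame and the mask of holes still to inpaint."""
    fused = warped_frame.copy()
    fill_from_bg = frame_hole_mask & ~bg_hole_mask   # holes the background covers
    fused[fill_from_bg] = warped_background[fill_from_bg]
    remaining = frame_hole_mask & bg_hole_mask       # still empty after fusion
    return fused, remaining
```

Only pixels that neither the warped frame nor the warped background can supply remain as holes for the final filling step.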

Description

Technical Field

[0001] The invention relates to the field of hole filling in virtual viewpoint rendering, and in particular to a virtual viewpoint hole filling method based on depth background modeling.

Background Technique

[0002] With the vigorous development of computer vision and 3D video technology, multi-view video (MVV) and free-viewpoint video (FVV) technologies have emerged, allowing viewers to freely choose the viewing position and viewing angle. However, current technology can neither capture viewpoint information continuously and synchronously at arbitrary positions and angles nor transmit such a large amount of data in real time, so a limited set of reference viewpoints must be used to synthesize virtual viewpoints.

[0003] The core of FVV is virtual viewpoint rendering. Depth-image-based rendering (DIBR), operating on data in the multi-view plus depth (MVD) format, needs only a few or even a single reference viewpoint to draw al...
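Since the description is truncated here, the sketch below only illustrates the general DIBR idea it references, reduced to the rectified parallel-camera case in which 3D warping becomes a horizontal disparity shift; the depth quantization convention and all parameters are assumptions, not values from the patent.

```python
import numpy as np

def dibr_warp(texture, depth, f, baseline, z_near, z_far):
    """Simplified DIBR warp for rectified parallel cameras: each pixel
    shifts horizontally by disparity = f * baseline / Z.
    depth: uint8 map, 255 = nearest (z_near), 0 = farthest (z_far).
    Returns the warped texture and a boolean hole (disocclusion) mask."""
    H, W = depth.shape
    # MPEG-style inverse-depth dequantization of the 8-bit depth map.
    z = 1.0 / (depth / 255.0 * (1.0 / z_near - 1.0 / z_far) + 1.0 / z_far)
    disparity = np.rint(f * baseline / z).astype(int)

    warped = np.zeros_like(texture)
    z_buffer = np.full((H, W), np.inf)  # keeps the nearest surface per target pixel
    for y in range(H):
        for x in range(W):
            xt = x + disparity[y, x]
            if 0 <= xt < W and z[y, x] < z_buffer[y, xt]:
                z_buffer[y, xt] = z[y, x]
                warped[y, xt] = texture[y, x]
    holes = np.isinf(z_buffer)          # never written: disoccluded regions
    return warped, holes
```

The resulting hole mask marks the disocclusions that background modeling and fusion in methods like the one described here are designed to fill.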

Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T17/00; G06T19/00
CPC: G06T17/00; G06T19/006; G06T2207/10016
Inventor: 陈芬, 李壮壮, 汤锐彬, 彭宗举, 蒋刚毅
Owner: NINGBO UNIV