
A Virtual Viewpoint Rendering Method Based on Depth Image

A virtual viewpoint rendering method based on depth images, applied in the field of 3D video. It addresses problems such as artifacts, holes in the virtual viewpoint image, and pixels for which no corresponding mapping point can be found, and achieves the effects of avoiding distortion, reducing holes, and improving rendering accuracy.

Active Publication Date: 2021-05-28
南通图加智能科技有限公司

AI Technical Summary

Problems solved by technology

Due to the rounding effect, some pixels in the virtual viewpoint image cannot find corresponding mapping points in the reference viewpoint. In the opposite case, a single position in the virtual viewpoint may correspond to multiple different locations in the reference viewpoint. These two situations cause holes and artifacts, respectively, in the virtual viewpoint image.
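The two failure modes above can be seen in a toy forward warp along a single scanline. Each reference pixel is shifted by a disparity derived from its depth and the target coordinate is rounded to an integer; the disparity values below are purely illustrative, not from the patent.

```python
# Toy forward warp along one scanline. Rounding the warped coordinate can
# leave virtual pixels with no source (holes) or send several reference
# pixels to the same target (a cause of artifacts).
width = 10
disparity = [3.0, 2.4, 1.9, 1.3, 0.8, 0.2, 0.0, 0.0, 0.0, 0.0]

hit_count = [0] * width
for x in range(width):
    x_virtual = round(x + disparity[x])  # the rounding step
    if 0 <= x_virtual < width:
        hit_count[x_virtual] += 1

holes = [x for x in range(width) if hit_count[x] == 0]
overlaps = [x for x in range(width) if hit_count[x] > 1]
print("holes:", holes)        # → holes: [0, 1, 2]
print("overlaps:", overlaps)  # → overlaps: [3, 4, 5]
```

Pixels 0–2 of the virtual scanline receive no value (holes), while pixels 3–5 each receive two candidates whose conflict must be resolved to avoid ghosting.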

Method used




Embodiment Construction

[0068] Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. Although exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited by the embodiments set forth herein. Rather, these embodiments are provided to enable a more thorough understanding of the present disclosure and to fully convey its scope to those skilled in the art.

[0069] An embodiment of the present invention provides a virtual viewpoint rendering method based on depth images. Referring to Figure 1, the method includes:

[0070] S1. Perform local preprocessing on the depth images corresponding to the left and right reference viewpoints;

[0071] S2. For the depth images corresponding to the left and right reference viewpoints and the locally preprocessed left and r...
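Step S1 calls for local preprocessing of the reference depth maps. The patent text available here does not spell out the operation, so the following is a minimal sketch of one common realization: median filtering applied only in a band around depth discontinuities, which smooths the edges that cause warping cracks while leaving flat regions untouched. The function name, threshold, and window size are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import median_filter, binary_dilation

def local_depth_preprocess(depth, edge_thresh=8.0, win=5):
    """Smooth a depth map only near depth discontinuities.

    Sketch of 'local preprocessing' (S1): global filtering would distort the
    whole depth map, so the median filter is restricted to a band around
    detected depth edges. Parameters are illustrative, not from the patent.
    """
    depth = depth.astype(np.float32)
    # Detect discontinuities via the depth gradient (axis 0, then axis 1).
    gy, gx = np.gradient(depth)
    edges = (np.abs(gx) > edge_thresh) | (np.abs(gy) > edge_thresh)
    # Widen the edge mask so the filter covers a band around each edge.
    band = binary_dilation(edges, iterations=2)
    smoothed = median_filter(depth, size=win)
    # Replace depth values only inside the band; elsewhere keep the original.
    return np.where(band, smoothed, depth)
```

Restricting the filter to the band is what lets this step "reduce holes while avoiding the distortion caused by filtering," since untouched regions keep their exact depth values.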



Abstract

The invention relates to a virtual viewpoint rendering method based on depth images. First, local preprocessing is performed on the depth images corresponding to the reference viewpoints to reduce holes while avoiding the distortion caused by filtering. Two virtual views are then obtained by 3D warping. Large cracks and holes remain in the images after the 3D transformation, so a median filter is first applied to remove small cracks in the virtual images. A bidirectional dilation method then expands the hole regions to eliminate pixels that may produce artifacts. Next, the left and right viewpoints are fused, which removes most of the semi-occluded hole regions. Finally, a depth-map-guided image inpainting algorithm fills the small number of remaining holes, ensuring depth consistency between the region to be filled and the target block. The invention reduces holes and artifacts in virtual viewpoint synthesis, improves rendering accuracy, and can render high-quality virtual viewpoint images.
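The post-warping stages of the abstract (crack removal, hole dilation, left/right fusion, residual holes handed to inpainting) can be sketched as follows. The function name, the fixed filter sizes, and the averaging rule for doubly-valid pixels are assumptions for illustration; the patent's actual bidirectional dilation and fusion rules may differ in detail.

```python
import numpy as np
from scipy.ndimage import median_filter, binary_dilation

def fuse_and_fill(left_virt, right_virt, left_mask, right_mask):
    """Sketch of the post-warping pipeline from the abstract.

    left_virt / right_virt: virtual views warped from the left and right
    reference viewpoints; *_mask is True where a pixel received a value.
    Parameters and the fusion rule are illustrative, not from the patent.
    """
    # 1. Median filtering removes thin cracks left by forward warping.
    left_virt = median_filter(left_virt, size=3)
    right_virt = median_filter(right_virt, size=3)

    # 2. Dilate each hole region so pixels on the hole boundary (which often
    #    carry mixed foreground/background values and cause artifacts) are
    #    discarded rather than trusted.
    left_holes = binary_dilation(~left_mask, iterations=1)
    right_holes = binary_dilation(~right_mask, iterations=1)

    # 3. Fuse: average where both views are valid, otherwise take whichever
    #    view has a value. This removes most semi-occluded hole regions.
    both = ~left_holes & ~right_holes
    fused = np.where(both, (left_virt + right_virt) / 2.0,
                     np.where(~left_holes, left_virt, right_virt))

    # 4. Holes present in both views remain; the patent fills them with a
    #    depth-guided inpainting step (not shown). Here they are zeroed.
    remaining = left_holes & right_holes
    fused = np.where(remaining, 0.0, fused)
    return fused, remaining
```

The `remaining` mask is exactly the "small number of holes that still exist" which the final depth-guided inpainting stage is responsible for.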

Description

Technical field

[0001] The invention relates to the technical field of three-dimensional video, and in particular to a virtual viewpoint rendering method based on a depth image.

Background technique

[0002] Depth-map-based virtual viewpoint rendering is an important development direction in image-based rendering (IBR) technology. Based on a reference image and its corresponding depth map, it uses geometric mapping to compute the coordinates of each pixel in three-dimensional space, and then re-projects those points onto a specified virtual image plane to obtain a new viewpoint.

[0003] This virtual viewpoint rendering process can be divided into two steps: first, each pixel in the reference view is back-projected into 3D space according to its depth value, giving the 3D spatial coordinates of the pixels in the image; second, the spatial points are re-projected onto the spe...
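The two-step process of [0003] can be written per pixel as a back-projection through the reference camera followed by a projection through the virtual camera. The camera convention below (intrinsics K, extrinsics with x_cam = R·x_world + t) is an assumption; the patent does not fix a notation.

```python
import numpy as np

def warp_pixel(u, v, depth, K_ref, R_ref, t_ref, K_virt, R_virt, t_virt):
    """Two-step 3D warping for one pixel, as outlined in [0003].

    Back-project pixel (u, v) with depth value `depth` into world space via
    the reference camera (K_ref, R_ref, t_ref), then re-project into the
    virtual camera (K_virt, R_virt, t_virt). Convention assumed:
    x_cam = R @ x_world + t.
    """
    # Step 1: back-projection. Pixel -> camera ray -> world point at depth.
    pix = np.array([u, v, 1.0])
    x_cam = depth * (np.linalg.inv(K_ref) @ pix)
    x_world = R_ref.T @ (x_cam - t_ref)

    # Step 2: re-projection into the virtual camera.
    x_virt = R_virt @ x_world + t_virt
    uv_h = K_virt @ x_virt
    return uv_h[:2] / uv_h[2]   # homogeneous -> pixel coordinates
```

Because the resulting coordinates are generally non-integer, the rounding needed to write into the virtual image grid is precisely what produces the holes and overlapping mappings discussed in the "Problems solved" section.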


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T3/00; G06T5/00; G06T7/13; G06T7/155; G06T7/50
Inventor: Zhu Shiping (祝世平), Xu Hao (徐豪), Yan Lina (闫利那)
Owner: 南通图加智能科技有限公司 (Nantong Tujia Intelligent Technology Co., Ltd.)