Omni-directional spherical light field rendering method

A sphere- and viewing-direction-based technique, applied in the field of spherical light field rendering, which can solve problems such as the inability to render real objects, achieving the effects of saving computing resources, enhanced realism, and a true sense of immersion.

Active Publication Date: 2020-01-24
PLEX VR DIGITAL TECH CO LTD

AI Technical Summary

Problems solved by technology

[0007] In order to overcome the disadvantage that existing rendering engines cannot render real objects, the present invention aims to provide a full-view spherical light field rendering method, which can not only realize real-time rendering but also use multi-viewpoint images to render images of real objects from various perspectives.


Embodiment Construction

[0028] The present invention will now be further described in conjunction with the accompanying drawings.

[0029] See Figure 1a to Figure 4, which show an embodiment of the present invention. This embodiment implements a light field rendering engine using a non-traditional method, which can not only achieve real-time rendering but also use multi-viewpoint images to render real images of an object from all viewing angles. The specific implementation steps are as follows: Figure 1a to Figure 1d show the data input format of this embodiment. The input data of this embodiment includes: a low-precision three-dimensional model of the object (30,000-100,000 faces), 200-300 reference cameras with extrinsic and intrinsic parameters, the same number of pictures, a file describing the relative positional relationship of the reference cameras, and a configuration file describing the properties of this data set. For specific data descriptio...
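The exact file layout is truncated above, but the listed inputs map onto a small set of in-memory structures. The following Python sketch is illustrative only: the field names and types are assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from typing import Dict, List

import numpy as np


@dataclass
class ReferenceCamera:
    """One of the 200-300 reference cameras distributed on the sphere."""
    K: np.ndarray      # 3x3 intrinsic matrix
    R: np.ndarray      # 3x3 rotation, world -> camera (extrinsic)
    t: np.ndarray      # 3-vector translation, world -> camera (extrinsic)
    image: np.ndarray  # HxWx3 picture captured by this camera


@dataclass
class LightFieldDataset:
    """Input data listed in paragraph [0029] (representation is hypothetical)."""
    vertices: np.ndarray              # Vx3 vertices of the low-precision model
    faces: np.ndarray                 # Fx3 triangle indices (30,000-100,000 faces)
    cameras: List[ReferenceCamera]    # reference cameras with intrinsic/extrinsic parameters
    neighbors: Dict[int, List[int]]   # relative-position file: adjacency of cameras on the sphere
    config: dict                      # data set properties (image size, sphere radius, ...)
```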


Abstract

The invention relates to a method for rendering a spherical light field in all viewing directions, which includes: a preliminary step, namely the initial input and loading of the related files; pre-calculating depth maps for the reference camera positions densely distributed over the sphere in a grid; moving the rendering camera, whose range of movement is the surface of the sphere, and calculating and identifying the reference cameras surrounding the rendering camera; back-projecting the pixels of the rendering camera and performing a depth test with the four surrounding reference cameras; and interpolating the reference cameras that pass the depth test, the result being the final rendered pixel value. With the invention, the rendering result can be viewed quickly and in real time; objects can be observed from any angle on the spherical surface, and a more realistic sense of immersion is obtained.
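The per-pixel procedure in this abstract (back-project, depth-test against the four surrounding reference cameras, interpolate the survivors) can be sketched as follows. This is a minimal illustration, assuming hypothetical camera helpers (backproject, project, in_bounds, sample_color, angle_to) and pre-computed reference depth maps; the angular weighting is an assumption, since the abstract only states that the passing cameras are interpolated.

```python
import numpy as np


def render_pixel(pixel, render_cam, surrounding_cams, depth_maps, eps=1e-2):
    """Shade one rendering-camera pixel following the steps in the abstract."""
    # Back-project the rendering-camera pixel to a 3D point (e.g. via the proxy mesh).
    point = render_cam.backproject(pixel)

    colors, weights = [], []
    for cam, depth_map in zip(surrounding_cams, depth_maps):  # the four reference cameras
        uv, depth = cam.project(point)       # project the point into this reference view
        if not cam.in_bounds(uv):
            continue
        # Depth test against the pre-computed depth map of this reference camera:
        # keep the camera only if the point is visible (not occluded) in its view.
        if abs(depth - depth_map[uv]) < eps:
            colors.append(cam.sample_color(uv))
            # Weight nearby reference cameras more strongly (one plausible choice).
            weights.append(1.0 / (1e-6 + cam.angle_to(render_cam)))

    if not colors:
        return np.zeros(3)                   # no reference camera passed the depth test
    w = np.asarray(weights) / np.sum(weights)
    return np.sum(np.asarray(colors) * w[:, None], axis=0)  # final rendered pixel value
```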

Description

Technical field

[0001] The invention relates to the technical field of computer graphics, and in particular to a full-view spherical light field rendering method that can realize real-time rendering of real objects using multi-viewpoint images.

Background technique

[0002] Currently, known 3D model rendering engines use a combination of models and texture maps to render images with rasterization rendering pipelines. Most characters and scenes in current games are rendered this way. However, with the development of virtual reality technology, such artificial rendering results no longer meet people's expectations because they lack realism. [0003] Current rendering engines mainly have the following two problems: [0004] First, creating rendering data takes a lot of human resources. Specifically, modelers need to spend a lot of time building a model similar to a real object and debugging the material and texture o...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T15/50; G06T3/40
CPC: G06T3/40; G06T15/50; G06T2215/16; G06T15/205; G06T7/50; G06T7/557; G06T15/506; G06T2207/10028
Inventor: 虞晶怡, 虞煌杰
Owner: PLEX VR DIGITAL TECH CO LTD