
A virtual-real fusion method with specular objects and transparent objects in the scene

A technology concerning specular and transparent objects, applied in 3D image processing, image enhancement, and instruments; it addresses problems such as the impact of caustics produced by specular and transparent objects on virtual-real fusion.

Active Publication Date: 2022-06-21
JILIN UNIV
Cites: 7 · Cited by: 0
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present invention is to address the deficiencies of existing methods by proposing an illumination-consistent virtual-real fusion method applicable to scenes containing specular objects and transparent objects, solving the problem of caustics produced by specular and transparent objects in the actual scene. The method adopted is as follows:

Method used


Examples


Detailed Description of the Embodiments

[0061] The core of the present invention is that inter-object reflection is considered in the initial light source estimation, yielding a more accurate estimate. The Ward BRDF model parameters of the specular objects and the refractive index and color attenuation coefficient of the transparent objects are estimated, while the position of the light source is simultaneously optimized. Using the estimated light source and model parameters for differential rendering produces a more realistic virtual-real fusion effect.
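As a rough illustration of the specular reflectance model referenced above, the isotropic Ward BRDF can be evaluated as follows. This is a minimal sketch only: the function and parameter names (`rho_d`, `rho_s`, `alpha`) are hypothetical, and the patent's actual parameter-fitting procedure is not shown.

```python
import numpy as np

def ward_brdf_isotropic(n, l, v, rho_d, rho_s, alpha):
    """Isotropic Ward BRDF (Ward 1992).
    n, l, v : unit normal, light, and view direction vectors
    rho_d   : diffuse reflectance
    rho_s   : specular reflectance
    alpha   : surface roughness (hypothetical parameter names)"""
    h = l + v
    h = h / np.linalg.norm(h)                    # half vector
    cos_i = max(np.dot(n, l), 1e-6)              # incident cosine
    cos_o = max(np.dot(n, v), 1e-6)              # outgoing cosine
    cos_h = min(max(np.dot(n, h), 1e-6), 1.0)
    tan2_delta = (1.0 - cos_h**2) / cos_h**2     # tan^2 of half-angle
    spec = rho_s * np.exp(-tan2_delta / alpha**2) / (
        4.0 * np.pi * alpha**2 * np.sqrt(cos_i * cos_o))
    return rho_d / np.pi + spec
```

For normal incidence (light, view, and normal aligned) the exponential term is 1 and the specular lobe reduces to `rho_s / (4 * pi * alpha**2)`, which is a quick sanity check when fitting.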

[0062] To make the purpose, technical scheme, and advantages of the present invention clearer, a further detailed description is given below in conjunction with the accompanying drawings and examples:

[0063] 1.1 Use an RGB-D camera to shoot scenes containing specular objects and transparent objects, obtaining depth images and color images from different viewpoints; perform 3D reconstruction of the scene to obtain the 3D model positions of the mirror ...
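Step 1.1 relies on back-projecting RGB-D depth images into 3D points for reconstruction. A minimal pinhole-model sketch is shown below; the intrinsics `fx`, `fy`, `cx`, `cy` are placeholders for the actual camera calibration, and a real pipeline would additionally fuse multiple viewpoints.

```python
import numpy as np

def backproject_depth(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) to a 3D point cloud
    using the pinhole camera model.
    depth          : (H, W) array of depth values
    fx, fy, cx, cy : camera intrinsics (placeholder values)"""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel grid
    z = depth
    x = (u - cx) * z / fx                           # X = (u-cx)*Z/fx
    y = (v - cy) * z / fy                           # Y = (v-cy)*Z/fy
    pts = np.stack([x, y, z], axis=-1)              # (H, W, 3)
    return pts[depth > 0]                           # drop invalid depth
```

The resulting point cloud (one point per valid depth pixel) is what a surface-reconstruction step would consume.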



Abstract

A virtual-real fusion method for scenes containing specular objects and transparent objects, belonging to the technical field of computer virtual reality. The present invention first uses an RGB-D camera to photograph a scene containing specular and transparent objects, identifies the positions of those objects, and performs 3D reconstruction of the scene. Inter-object reflection is considered in the initial light source estimation, and the material parameters of the specular and transparent objects are estimated. The estimated lighting results and model parameters are used for differential rendering to obtain the virtual-real fusion result. By estimating the BRDF model parameters of the specular objects and the refractive index and color attenuation coefficient of the transparent objects, the present invention achieves a more realistic virtual-real fusion effect and solves the problem of illumination-consistent virtual-real fusion for specular and transparent objects in the scene. In addition, when estimating the position of the light source, the present invention proceeds from optical principles and accounts for reflection between objects, obtaining a more accurate light source position.
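The differential rendering step mentioned in the abstract follows the classic difference-image compositing idea (Debevec, 1998): render the local scene model with and without the virtual objects, and add the difference to the photograph. A simplified sketch, with all array names assumed:

```python
import numpy as np

def differential_composite(photo, render_all, render_scene, virtual_mask):
    """Debevec-style differential rendering composite.
    photo        : captured image of the real scene, (H, W, 3) in [0, 1]
    render_all   : rendering of local scene model + virtual objects
    render_scene : rendering of the local scene model alone
    virtual_mask : (H, W) mask, 1 where a virtual object is visible
    Where virtual objects are directly visible, take the full
    rendering; elsewhere, add the light-transport difference they
    cause (shadows, caustics, inter-reflections) to the photo."""
    diff = render_all - render_scene
    composited = np.clip(photo + diff, 0.0, 1.0)
    return np.where(virtual_mask[..., None] > 0, render_all, composited)
```

Because the method estimates caustics from specular and transparent objects, those effects appear in `render_all` but not `render_scene`, so they transfer to the photograph through the difference term.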

Description

Technical Field

[0001] The invention belongs to the technical field of computer virtual reality, in particular to a method for estimating the position and intensity of light sources in a scene, the reflection coefficients of specular objects, and the refractive index and color attenuation coefficient of transparent objects.

Background Technique

[0002] Augmented reality technology combines generated virtual objects with the actual scene and presents the result to the user. To achieve a realistic virtual-real fusion effect, the virtual objects must exhibit the same lighting as the actual scene. Lighting consistency chiefly concerns the color and brightness changes that real light sources and real objects in the scene induce on the surface patches of virtual objects.

[0003] Existing methods for illumination-consistent virtual-real fusion fall into three main categories: methods with auxil...
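For the transparent-object quantities named in [0001], a ray crossing a transparent object is typically bent by Snell's law using the refractive index and dimmed per color channel by a Beer-Lambert term using the color attenuation coefficient. The sketch below illustrates both ideas under assumed names; it is not the patent's actual estimation procedure.

```python
import numpy as np

def refract(d, n, eta):
    """Snell refraction of unit direction d at unit normal n.
    eta = n_incident / n_transmitted (ratio of refractive indices).
    Returns the refracted direction, or None on total internal
    reflection."""
    cos_i = -np.dot(n, d)
    sin2_t = eta**2 * (1.0 - cos_i**2)
    if sin2_t > 1.0:
        return None                          # total internal reflection
    return eta * d + (eta * cos_i - np.sqrt(1.0 - sin2_t)) * n

def beer_lambert(color, sigma_rgb, path_len):
    """Per-channel Beer-Lambert attenuation over path_len inside the
    medium; sigma_rgb is the color attenuation coefficient (assumed
    per-channel)."""
    return np.asarray(color) * np.exp(-np.asarray(sigma_rgb) * path_len)
```

Estimating the refractive index and `sigma_rgb` amounts to choosing values for which rays traced this way reproduce the observed appearance behind the transparent object.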

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06T17/00; G06T15/50; G06T19/00; G06T5/50
CPC: G06T19/006; G06T15/50; G06T5/50; G06T17/00
Inventors: 赵岩, 张艾嘉, 王世刚, 王学军
Owner JILIN UNIV