A virtual and real occlusion processing method based on depth image data stream

A virtual-real occlusion processing technology, applied in image data processing, 3D image processing, image analysis and similar fields. It addresses problems such as time-consuming processing steps, the difficulty of handling the occlusion relationship between virtual objects and real scenes, and the lack of 3D information about the real scene, achieving improved accuracy and robustness.

Active Publication Date: 2020-10-13
QINGDAO RES INST OF BEIHANG UNIV +1

AI Technical Summary

Problems solved by technology

[0004] Some of the above methods are time-consuming in steps such as feature point extraction and the optimization of energy equations. Lacking three-dimensional information about the real scene, they have difficulty handling the occlusion relationship between virtual objects and the real scene, so the virtual-real fusion lacks realism.

Method used



Examples


Embodiment Construction

[0023] The embodiments of the present invention will be described in detail below in conjunction with the drawings.

[0024] As shown in Figure 1, the implementation process of the present invention is mainly divided into four steps: depth data preprocessing, construction of a three-dimensional scene point cloud model, three-dimensional space registration, and virtual-real fusion rendering.
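As a rough sketch, the four steps can be arranged into a per-frame loop like the following. All function names and bodies here are hypothetical placeholders used only to show the data flow, not the patent's actual algorithms:

```python
import numpy as np

# Hypothetical skeleton of the per-frame loop implied by the four steps.
# Every function body below is a trivial placeholder.

def preprocess(depth):                 # step 1: threshold + filter the depth map
    return np.clip(depth, 0.4, 4.0)

def register(depth, model, pose):      # step 3: ICP against the projected model
    return pose                        # placeholder: pose left unchanged

def fuse(model, depth, pose):          # step 2: merge the frame into the model
    model.append((depth, pose))

def render(color, model, pose):        # step 4: occlusion-aware compositing
    return color                       # placeholder: pass the color frame through

model, pose = [], np.eye(4)
for color, depth in [(np.zeros((2, 2, 3)), np.ones((2, 2)))]:
    depth = preprocess(depth)
    pose = register(depth, model, pose)
    fuse(model, depth, pose)
    out = render(color, model, pose)
```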

[0025] Step 1: Depth data preprocessing

[0026] The main steps are:

[0027] (11) For given depth data in the input RGBD (color + depth) data stream, set thresholds w_min and w_max according to the error range of the depth camera. Points whose depth value lies between w_min and w_max are regarded as credible, and only the depth data I within the threshold range is retained.
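A minimal sketch of this thresholding step, assuming the depth map is a NumPy array in metres and that invalid samples are marked with 0. The threshold values are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def filter_depth(depth, w_min=0.4, w_max=4.0):
    """Keep only depth values inside the camera's credible range.

    Values outside [w_min, w_max] are treated as unreliable and zeroed
    out (0 conventionally marks an invalid depth sample).
    """
    credible = (depth >= w_min) & (depth <= w_max)
    return np.where(credible, depth, 0.0)

# Example: a 2x3 depth map (metres) with one too-near and one too-far sample
depth = np.array([[0.1, 1.5, 2.0],
                  [5.0, 0.8, 3.9]])
print(filter_depth(depth))
```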

[0028] (12) Perform fast bilateral filtering on each pixel of the depth data, as follows:

[0029]

[0030] where p_j is a pixel in the neighborhood of pixel p_i, and s is the number of effective pixe...
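Since the formula itself is not reproduced in this excerpt, here is a sketch of a plain (brute-force) bilateral filter applied to a depth map: each output pixel is a weighted average over its neighborhood, with weights combining spatial distance and depth similarity. Note the patent describes a *fast* bilateral filter, which this does not implement, and the parameter values are illustrative assumptions:

```python
import numpy as np

def bilateral_filter_depth(depth, radius=2, sigma_s=2.0, sigma_r=0.05):
    """Brute-force bilateral filter for a depth map.

    Invalid pixels (depth 0) are skipped both as centers and as
    neighbors, so holes do not bleed into valid regions.
    """
    h, w = depth.shape
    out = np.zeros_like(depth)
    for i in range(h):
        for j in range(w):
            if depth[i, j] == 0:
                continue  # leave holes untouched
            num, den = 0.0, 0.0
            for di in range(-radius, radius + 1):
                for dj in range(-radius, radius + 1):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < h and 0 <= nj < w and depth[ni, nj] > 0:
                        # spatial weight (pixel distance) x range weight (depth similarity)
                        ws = np.exp(-(di * di + dj * dj) / (2 * sigma_s ** 2))
                        wr = np.exp(-((depth[ni, nj] - depth[i, j]) ** 2) / (2 * sigma_r ** 2))
                        num += ws * wr * depth[ni, nj]
                        den += ws * wr
            out[i, j] = num / den
    return out
```

On a constant depth map the filter is the identity, which is a quick sanity check of the weighting.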



Abstract

The invention relates to a virtual-real occlusion handling method based on a depth image data stream. The method comprises three parts: construction of a scene point cloud model, three-dimensional space registration, and virtual-real occlusion handling and rendering. First, filtering and other processing operations are performed on the depth data acquired by a depth camera, and the normal vector of each point is calculated. The camera pose is then computed with an iterative closest point algorithm, matching the point cloud carrying the normal vectors against the point cloud obtained by projecting the three-dimensional scene model through the previous frame's camera pose; the point cloud of the current frame is then fused into the three-dimensional scene point cloud model. Once the scene is reconstructed, feature points of the color images acquired by the depth camera are computed in real time and matched against template image feature points to perform three-dimensional space registration. Finally, the spatial position and occlusion relationships between the virtual object and the three-dimensional scene are resolved from the obtained camera pose and rendered in real time. The method runs in real time on current mainstream hardware, and a good virtual-real occlusion effect is obtained even when the resolution of the input data is low.
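The camera-pose step in the abstract relies on the iterative closest point algorithm. As a sketch of the core idea, here is a simplified point-to-point ICP using the Kabsch (SVD) solution for the rigid transform; the patent's variant carries normal vectors and is likely a point-to-plane formulation, which is not shown here:

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(src, dst, iters=20):
    """Simplified point-to-point ICP.

    Iteratively matches each (transformed) src point to its nearest dst
    point, then solves the best rigid transform for the matched pairs
    via the Kabsch algorithm. Returns R, t such that src @ R.T + t
    aligns src onto dst.
    """
    R, t = np.eye(3), np.zeros(3)
    tree = cKDTree(dst)
    for _ in range(iters):
        moved = src @ R.T + t
        _, idx = tree.query(moved)          # nearest-neighbor correspondences
        matched = dst[idx]
        # Kabsch: best rotation between the centered matched pairs
        mu_s, mu_d = moved.mean(0), matched.mean(0)
        H = (moved - mu_s).T @ (matched - mu_d)
        U, _, Vt = np.linalg.svd(H)
        S = np.diag([1, 1, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
        dR = Vt.T @ S @ U.T
        dt = mu_d - dR @ mu_s
        R, t = dR @ R, dR @ t + dt          # compose incremental update
    return R, t
```

For a small initial misalignment the nearest-neighbor correspondences are mostly correct, so the estimate converges to the true transform within a few iterations.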

Description

Technical field

[0001] The invention belongs to the field of computer vision and computer graphics image processing, and is specifically a virtual and real occlusion processing method based on a depth image data stream. Even when the input data resolution is low and the depth data contains holes and noise, the method can estimate the camera pose in real time and reconstruct a point cloud model of the 3D scene, process the occlusion relationship between virtual objects and the 3D scene in real time according to the camera pose, and fuse the occlusion processing result with the color image to achieve a virtual-real fusion effect. It is of great significance to research on 3D reconstruction systems and real-time augmented reality (AR) technology.

Background technique

[0002] Augmented reality is a technology that superimposes virtual objects onto the real environment to achieve a fusion effect of virtual and real. In recent years, it has become a research hotspot in t...

Claims


Application Information

Patent Type & Authority: Patents (China)
IPC(8): G06T19/00, G06T7/80, G06T5/00, G06T15/00, G06T5/50, G06K9/62
CPC: G06T5/002, G06T5/003, G06T5/50, G06T7/80, G06T15/005, G06T19/006, G06T2207/10028, G06T2207/20028, G06T2207/20221
Inventor: 齐越, 郭小志
Owner: QINGDAO RES INST OF BEIHANG UNIV