
Virtual reality occlusion handling method, based on virtual model pretreatment, in augmented reality system

A virtual-real occlusion technology based on a virtual model, applied in image data processing, 3D image processing, instruments, etc.; it addresses the problems of heavy computation and unsatisfactory results in existing approaches, and achieves a good virtual-real occlusion effect while meeting real-time requirements.

Status: Inactive | Publication Date: 2014-01-01
BEIJING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

This type of method requires a large amount of computation to recover depth information, and the real occluder must still be rendered realistically when the images are superimposed; when the real occluder has a complex shape and occupies a large area, the result is often unsatisfactory.



Examples


Embodiment Construction

[0020] The present invention is a virtual-real occlusion processing method, based on virtual model preprocessing, in an augmented reality system. It handles the virtual-real occlusion of augmented reality by registering the contour extracted from the real occluder in the scene in which the virtual model is rendered and redrawing it there faithfully.

[0021] As shown in the accompanying drawings, the overall steps of the method of the present invention are: use the depth camera KINECT to obtain a color image of the scene and a grayscale image representing depth information; convert the color image into a bitmap image that the augmented reality virtual-real occlusion system can identify and track, and register the virtual model in three dimensions; combining the 3D registration position of the virtual model with the virtual model's own depth, threshold the grayscale depth image and extract the outer contour of the real object; register a contour coordinate system in the rendering scene of the virtual model; convert the 2D contour vertex coordinates into 3D coordinates corresponding to the actual size and draw them in the contour coordinate system, using the redrawn contour as a 3D model to occlude the virtual model; finally, combine the color image with the processed virtual model and fill the real-object image into the contour's interior region, i.e., the occluded part of the virtual model, to obtain the virtual-real occlusion effect.
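As a concrete illustration of the thresholding and contour-extraction step above, the following is a minimal sketch in Python with OpenCV (4.x) and NumPy. It assumes the grayscale depth image is already aligned to the color image and that smaller gray values mean closer to the camera; the function name, the margin parameter and the depth encoding are illustrative assumptions, not the patent's actual implementation.

```python
import cv2
import numpy as np

def extract_occluder_contour(depth_gray, model_depth, margin=5):
    """Return the outer contour of the real object that lies in front of the
    registered virtual model.

    depth_gray  : 8-bit grayscale image encoding depth (assumed: smaller = closer)
    model_depth : gray value corresponding to the virtual model's registered depth
    margin      : tolerance subtracted from model_depth before thresholding
    """
    # Keep only pixels nearer to the camera than the virtual model.
    occluder_mask = (depth_gray < max(model_depth - margin, 1)).astype(np.uint8) * 255

    # Suppress sensor noise and small holes before extracting the contour.
    kernel = np.ones((5, 5), np.uint8)
    occluder_mask = cv2.morphologyEx(occluder_mask, cv2.MORPH_OPEN, kernel)
    occluder_mask = cv2.morphologyEx(occluder_mask, cv2.MORPH_CLOSE, kernel)

    # RETR_EXTERNAL keeps only outer (peripheral) contours, as the method requires.
    contours, _ = cv2.findContours(occluder_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    # Take the largest contour as the occluder outline.
    return max(contours, key=cv2.contourArea)
```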



Abstract

The invention relates to a virtual reality occlusion handling method, based on virtual model pretreatment (preprocessing), in an augmented reality system. The method comprises the steps of: using a depth camera KINECT to obtain a color image and a grayscale image representing depth information; converting the color image into a bitmap image that can be identified and tracked by the augmented reality occlusion system, and registering a virtual model in three dimensions; combining the three-dimensional registration position of the virtual model with the virtual model's own depth to threshold the grayscale image and extract the peripheral contour of a real object; registering a contour coordinate system, in three dimensions, in the rendering scene of the virtual model; converting the two-dimensional contour vertex coordinates into three-dimensional coordinates corresponding to the actual size, drawing them in the contour coordinate system, and using the redrawn contour as a three-dimensional model to occlude the virtual model; and combining the color image with the treated virtual model, filling the real-object image into the contour's interior region, namely the occluded part of the virtual model, so as to obtain the virtual-real occlusion effect. The method requires neither pre-modeling nor pixel-by-pixel comparison of the virtual model's depth information, so it is suitable for environments with unknown changes and can meet real-time requirements.
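To make the "convert two-dimensional contour vertices into three-dimensional coordinates corresponding to the actual size" step concrete, here is a short, hedged sketch of standard pinhole back-projection: each contour pixel is projected onto a plane at the occluder's depth in the camera frame, so the resulting 3D polygon can be drawn in the contour coordinate system and used to occlude the virtual model. The intrinsics fx, fy, cx, cy and plane_depth are placeholders for values the actual system would obtain from camera calibration and the depth image; this is not the patent's own code.

```python
import numpy as np

def contour_to_3d(contour_px, plane_depth, fx, fy, cx, cy):
    """Back-project 2D contour vertices (pixel coordinates) onto a plane at the
    occluder's depth, producing 3D points in the camera frame.

    contour_px  : (N, 1, 2) or (N, 2) array of pixel coordinates (u, v)
    plane_depth : depth of the real occluder along the optical axis (e.g. metres)
    fx, fy      : focal lengths in pixels; cx, cy : principal point
    """
    pts = np.asarray(contour_px, dtype=np.float64).reshape(-1, 2)
    # Pinhole model: X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy,  Z = plane_depth
    x = (pts[:, 0] - cx) * plane_depth / fx
    y = (pts[:, 1] - cy) * plane_depth / fy
    z = np.full(len(pts), plane_depth)
    return np.column_stack([x, y, z])  # 3D polygon drawn as the occluding model
```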

Description

Technical field:
[0001] The invention relates to a virtual-real occlusion processing method based on virtual model preprocessing in an augmented reality system. It combines contour extraction with three-dimensional registration and rendering of the virtual model, and is applied to an augmented reality system equipped with the depth camera KINECT. The invention requires neither pre-modeling nor pixel-by-pixel comparison of the virtual model's depth information, is suitable for environments with unknown changes, and can meet real-time requirements. The invention belongs to the fields of virtual reality, image processing and display technology.
Background technique:
[0002] The application of augmented reality technology in the field of space robot teleoperation requires faithful reproduction of the mutual positional relationship between virtual objects and real objects. The augmented reality system superimposes the image of the virtual object directly on the location of the marker in...
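As a rough illustration of the marker-based three-dimensional registration mentioned above (not the tracking library the patent actually uses), the sketch below recovers the camera-to-marker pose with OpenCV's solvePnP from the four detected corners of a square marker of known physical size; the marker size, names and calling convention are hypothetical.

```python
import cv2
import numpy as np

MARKER_SIZE = 0.08  # assumed marker edge length in metres (hypothetical value)

# 3D corner coordinates of the square marker in its own coordinate system (Z = 0 plane).
OBJECT_POINTS = np.array([[-MARKER_SIZE / 2,  MARKER_SIZE / 2, 0],
                          [ MARKER_SIZE / 2,  MARKER_SIZE / 2, 0],
                          [ MARKER_SIZE / 2, -MARKER_SIZE / 2, 0],
                          [-MARKER_SIZE / 2, -MARKER_SIZE / 2, 0]], dtype=np.float64)

def register_marker(image_corners, camera_matrix, dist_coeffs):
    """Estimate the marker pose in the camera frame from its four detected
    image corners (supplied by whatever marker tracker the AR system uses)."""
    ok, rvec, tvec = cv2.solvePnP(OBJECT_POINTS,
                                  np.asarray(image_corners, dtype=np.float64),
                                  camera_matrix, dist_coeffs)
    if not ok:
        return None
    rotation, _ = cv2.Rodrigues(rvec)   # 3x3 rotation of the marker in the camera frame
    return rotation, tvec               # pose at which the virtual model is rendered
```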

Claims


Application Information

IPC(8): G06T15/40; G06T7/00
Inventor: 宋荆洲 (Song Jingzhou), 杨琼 (Yang Qiong), 贾庆轩 (Jia Qingxuan), 孙汉旭 (Sun Hanxu)
Owner BEIJING UNIV OF POSTS & TELECOMM