Fast multi-level virtual-real occlusion method in an augmented reality environment

A virtual-real occlusion technology for augmented reality, applied in image data processing, 3D image processing, instruments, etc. It addresses problems such as long computation times, the inability to obtain depth maps, and the difficulty of processing live video streams in real time, achieving real-time performance and a good occlusion effect.

Inactive Publication Date: 2011-07-20
BEIJING UNIV OF POSTS & TELECOMM
Cites: 6 · Cited by: 108


Problems solved by technology

However, frame-by-frame dense stereo matching with a hierarchical dynamic programming algorithm suffers from long computation times, while faster algorithms often fail to produce high-quality depth maps. This makes real-time processing of live video streams difficult.

Method used



Embodiment Construction

[0026] The invention is a fast multi-level virtual-real occlusion processing method for augmented reality environments. It combines sparse feature point tracking, pose estimation, and three-dimensional reconstruction to handle the multi-level virtual-real occlusion of augmented reality. The method comprises the following steps:

[0027] 1) Use dual cameras to capture real-time dynamic stereoscopic video of the real scene;

[0028] 2) Extract key frames from the two video streams at regular intervals, compute a dense depth map for each, build a 3D model of the real object participating in occlusion, and simultaneously extract sparse feature points;

[0029] Here, a key frame is a pair of stereoscopic video images used for 3D modeling of the real scene, and the interval between key frames equals the computation cycle of dense stereo matching and modeling for the previous key-frame pair. The significance of this ...
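The dense depth computation for a key frame can be sketched with a minimal sum-of-squared-differences block matcher. This is a hypothetical stand-in for the patent's dense stereo step (the function name and parameters are illustrative; a real system would use a higher-quality matcher such as hierarchical dynamic programming or semi-global matching):

```python
import numpy as np

def block_match_disparity(left, right, block=3, max_disp=8):
    """Dense disparity map via SSD block matching on a rectified stereo pair.

    left, right: 2D grayscale arrays. Returns an integer disparity per pixel
    (0 in the unmatchable border). Illustrative only: O(h*w*max_disp) Python
    loops are far too slow for the real-time use the patent targets.
    """
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            best, best_d = None, 0
            # Search candidate disparities that stay inside the right image.
            for d in range(min(max_disp, x - half) + 1):
                cand = right[y - half:y + half + 1, x - d - half:x - d + half + 1]
                ssd = np.sum((patch.astype(np.float64) - cand) ** 2)
                if best is None or ssd < best:
                    best, best_d = ssd, d
            disp[y, x] = best_d
    return disp
```

On a synthetic pair where the left image is the right image shifted by 2 pixels, interior pixels recover disparity 2 exactly, since the matching patch is identical.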



Abstract

The invention relates to a fast multi-level virtual-real occlusion method in an augmented reality environment, comprising the following steps: acquiring video of a real scene with dual cameras; extracting a pair of key frames from the two video streams at set intervals to compute a dense depth map, building a three-dimensional model of the real object participating in occlusion, and extracting sparse feature points; tracking the sparse feature points through all intermediate frames of the two video streams and estimating the current camera pose from the positions of those feature points in the image; obtaining the three-dimensional information of the real object from the most recently built model; translating and rotating the real object's model according to the current camera pose; comparing depth relations between the adjusted real object and the registered three-dimensional virtual object; and running the three-dimensional tracking of intermediate frames and the three-dimensional reconstruction of key frames in parallel on separate threads. The invention requires no modeling in advance, is suitable for unknown and changing environments, and meets real-time requirements.
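The parallel structure described in the abstract (slow key-frame reconstruction on one thread, fast per-frame tracking on another, never blocking on modeling) can be sketched as follows. All function names here are hypothetical placeholders, not the patent's actual implementation:

```python
import threading
import queue
import time

def reconstruct_keyframe(frame_pair):
    """Placeholder for dense stereo matching + 3D modeling (slow)."""
    time.sleep(0.01)  # simulate the expensive reconstruction cycle
    return {"model_from": frame_pair}

def track_frame(frame, model):
    """Placeholder for sparse feature tracking + pose estimation (fast).
    Uses whatever model was most recently completed, possibly None early on."""
    return (frame, None if model is None else model["model_from"])

def run_pipeline(frames, keyframe_interval=4):
    latest_model = [None]          # shared slot updated by the modeling thread
    kf_queue = queue.Queue()
    results = []

    def reconstructor():
        while True:
            pair = kf_queue.get()
            if pair is None:       # shutdown sentinel
                break
            latest_model[0] = reconstruct_keyframe(pair)

    worker = threading.Thread(target=reconstructor)
    worker.start()
    for i, frame in enumerate(frames):
        if i % keyframe_interval == 0:
            kf_queue.put(frame)    # reconstruction proceeds in parallel
        # Tracking never waits for modeling; it reads the latest model.
        results.append(track_frame(frame, latest_model[0]))
    kf_queue.put(None)
    worker.join()
    return results
```

The key design point mirrors the claim: per-frame tracking keeps real-time pace because it only reads the most recently completed model, while the next model is built concurrently.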

Description

Technical field

[0001] The invention relates to a fast multi-level virtual-real occlusion method in an augmented reality environment. It combines 3D reconstruction based on binocular stereo matching with 3D pose estimation based on sparse feature point tracking, and is applied to augmented reality systems with dual cameras. The invention requires no modeling in advance, is applicable to unknown and changing environments, and meets real-time requirements. It belongs to the fields of virtual reality, image processing, and display technology.

Background technique

[0002] The application of augmented reality technology in fields such as teleoperated robots requires a degree of interaction between virtual objects and real objects, and the visual occlusion relationship is the most basic kind of interaction. In a simple augmented reality application that does not consider occlusion, the virtual object image is always drawn completely over the video image. Th...
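The occlusion relationship itself reduces to a per-pixel depth comparison between the real scene and the registered virtual object. A minimal numpy sketch (function and parameter names are illustrative, not from the patent):

```python
import numpy as np

def composite_with_occlusion(real_rgb, real_depth, virt_rgb, virt_depth, virt_mask):
    """Per-pixel virtual-real occlusion: draw the virtual object only where it
    is both rendered (virt_mask) and closer to the camera than the real scene.

    real_rgb, virt_rgb:   (H, W, 3) color images
    real_depth, virt_depth: (H, W) depth maps, smaller = closer
    virt_mask:            (H, W) bool, True where the virtual object is drawn
    """
    show_virtual = virt_mask & (virt_depth < real_depth)
    out = real_rgb.copy()
    out[show_virtual] = virt_rgb[show_virtual]  # real pixels win elsewhere
    return out
```

Pixels where the real object is nearer keep the camera image, so the real object correctly occludes the virtual one; elsewhere the virtual rendering occludes the video.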

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T15/40, G06T17/00, H04N13/00
Inventors: 贾庆轩, 高欣, 孙汉旭, 吴昕, 宋荆洲, 胡欢
Owner BEIJING UNIV OF POSTS & TELECOMM