A virtual-real fusion method of multiple video streams and 3D scenes

A virtual-real fusion technology for three-dimensional scenes, applied in the field of virtual reality. It solves the problem that a static virtual scene cannot reflect dynamic changes in the environment, and achieves the effects of enhancing the realism of the scene and the user experience, providing good scalability, and strengthening the temporal and spatial relationship between the video content and the scene.

Active Publication Date: 2017-05-31
Assignee: 北京大视景科技有限公司


Problems solved by technology

[0004] The purpose of the present invention is to solve the problem that a static virtual scene cannot reflect dynamic changes in the environment. To this end, a virtual-real fusion method for multiple video streams and 3D scenes is proposed. The method schedules multiple cameras in real time and fuses their video images with the 3D scene, so that the virtual scene reflects the real dynamics of the environment.




Embodiment Construction

[0026] The present invention will be described in further detail below in conjunction with the accompanying drawings and examples. The process flow of the proposed virtual-real fusion method for multiple video streams and three-dimensional scenes is shown in Figure 1, and the steps are as follows:

[0027] Step 1: Use one or more cameras to collect video images of the environment, as shown in Figure 2. Track the parameter information of each camera; tracking can be realized by means of sensor recording or offline image analysis. The camera parameters obtained by tracking must then be transformed into the coordinate system of the virtual environment to complete the fusion.
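For concreteness, below is a minimal Python/NumPy sketch of the coordinate transformation in Step 1. The registration matrix T_REAL_TO_VIRTUAL and the (R, t) pose representation are assumptions for illustration only; the patent does not prescribe a particular representation.

```python
import numpy as np

# Hypothetical 4x4 rigid transform registering the real-world tracking
# frame to the virtual scene's coordinate system (assumed known here,
# e.g. from an offline calibration step; not specified in the patent).
T_REAL_TO_VIRTUAL = np.eye(4)

def camera_pose_in_virtual(R_cam, t_cam):
    """Map a tracked camera pose (rotation R_cam, translation t_cam,
    both expressed in the real-world tracking frame) into the virtual
    environment's coordinate system."""
    T_cam = np.eye(4)
    T_cam[:3, :3] = R_cam
    T_cam[:3, 3] = t_cam
    return T_REAL_TO_VIRTUAL @ T_cam
```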

[0028] Step 2: Construct the viewing frustum of each camera in three-dimensional space according to its parameter information, as shown in Figure 3. In Figure 3, the hexahedron ABCD-EFGH represents the viewing frustum of a camera.
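A minimal sketch of how the frustum hexahedron of Step 2 could be built from a camera's pose and assumed pinhole parameters (vertical field of view, aspect ratio, near/far planes). The parameter names and the OpenGL-style convention are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def frustum_corners(pose, fov_y_deg, aspect, near, far):
    """Eight corners of a camera's viewing frustum (the hexahedron
    ABCD-EFGH of Figure 3) in world coordinates.

    pose: 4x4 camera-to-world matrix (e.g. from camera_pose_in_virtual);
    fov_y_deg / aspect / near / far: assumed pinhole parameters -- the
    patent derives the frustum from tracked camera parameters but does
    not fix a particular parameterization."""
    corners = []
    for depth in (near, far):
        h = depth * np.tan(np.radians(fov_y_deg) / 2.0)  # half-height at this depth
        w = h * aspect                                   # half-width at this depth
        for x, y in ((-w, h), (w, h), (w, -h), (-w, -h)):
            # Camera looks down -Z in this (OpenGL-style) convention.
            corners.append((pose @ np.array([x, y, -depth, 1.0]))[:3])
    return np.array(corners)  # rows 0-3: near plane, rows 4-7: far plane
```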



Abstract

The invention discloses a virtual-real fusion method for multiple video streams and a three-dimensional scene, and belongs to the technical field of virtual reality. The method comprises: collecting video images of the environment through cameras and tracking the cameras' acquisition parameters; computing each camera's viewing frustum in three-dimensional space from those parameters, determining which cameras are visible from the user's current viewpoint based on the frusta, and scheduling the video image of every visible camera; computing the association between the video image of each visible camera and the virtual objects in the three-dimensional scene, and fusing the video images with those virtual objects according to that association; and visualizing the fusion result in the virtual environment, providing interactive roaming and automatic patrol services to the user. By scheduling the multiple cameras in the scene in real time and fusing their video images with the three-dimensional scene, the method achieves the purpose of reflecting the real dynamics of the environment.
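As a rough illustration of the visible-camera computation described in the abstract, the sketch below uses a conservative axis-aligned bounding-box overlap test between frusta as a stand-in; the patent does not specify which intersection test it uses, so this is an assumed simplification.

```python
import numpy as np

def aabb(points):
    """Axis-aligned bounding box (min corner, max corner) of a point set."""
    pts = np.asarray(points)
    return pts.min(axis=0), pts.max(axis=0)

def frusta_may_overlap(corners_a, corners_b):
    """Conservative test: do the AABBs of two frusta intersect?"""
    (lo_a, hi_a), (lo_b, hi_b) = aabb(corners_a), aabb(corners_b)
    return bool(np.all(lo_a <= hi_b) and np.all(lo_b <= hi_a))

def visible_cameras(user_frustum_corners, camera_frusta_corners):
    """Indices of cameras whose frusta may intersect the user's view
    frustum; only these cameras' video streams would be scheduled
    and decoded for fusion."""
    return [i for i, c in enumerate(camera_frusta_corners)
            if frusta_may_overlap(user_frustum_corners, c)]
```

A real implementation would refine the candidates from this coarse test with an exact frustum-frustum intersection, but the conservative pass already bounds how many video streams need to be fetched at once.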

Description

technical field

[0001] The invention relates to a virtual-real fusion method for multiple video streams and three-dimensional scenes; more precisely, the content of multiple video images is fused with virtual objects in a three-dimensional scene. The invention belongs to the field of virtual reality technology.

Background technique

[0002] The modeling of virtual environments is widely used in applications such as simulation, scenic-spot display, and 3D maps. When the geometric information and appearance of the virtual environment are similar to, or accurately correspond to, the real environment, the realism of the entire virtual environment is improved to a certain extent. However, accurate modeling of a scene requires a great deal of manpower, and because the model textures are static pictures collected in advance, a virtual environment constructed in this way cannot reflect the dynamic changes of events and activities in the real environment. Therefore, reflecting the r...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T5/00, G06T17/00
Inventors: 周忠, 刘培富, 周颐, 吴威
Owner: 北京大视景科技有限公司