Automatic matching correction method of video texture projection in three-dimensional virtual-real fusion environment

A technology fusing video textures with virtual reality, applied in the field of virtual reality. It addresses the problem that existing correction methods offer little reference value in three-dimensional space, and achieves the effects of improving visual quality, overcoming restrictive scene conditions, and improving efficiency.

Active Publication Date: 2013-07-31
北京微视威信息科技有限公司

AI Technical Summary

Problems solved by technology

However, these methods are all applied to two-dimensional images and are severely limited when used in three-dimensional space.
Moreover, the projector-correction algorithms available at this stage are mostly trapezoidal (keystone) correction algorithms for the projection area under a "projector-screen" system, for example the multi-projector image correction method and device of application No. 201010500209.6, whose correction is confined to two-dimensional space. In that method, the independent image information of the non-overlapping areas collected by two cameras and the correction parameters corresponding to the independent images are obtained, and the correction process is performed according to the video data of the camera corresponding to...
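
For context, here is a minimal sketch of such a 2-D "projector-screen" keystone correction (an illustrative reconstruction, not code from the cited application): a camera observes the trapezoid the projector casts, a planar homography is fitted to its corners, and each frame is pre-warped with it. The function name and corner ordering are assumptions.

```python
# Illustrative 2-D keystone correction under a "projector-screen" system,
# the kind of prior-art method the patent describes as limited to 2-D.
import cv2
import numpy as np

def keystone_correct(frame, corners_observed, width, height):
    """Pre-warp a frame so its observed trapezoid projects as a rectangle.

    corners_observed: 4x2 corner pixels of the projection as seen by a
    camera, ordered top-left, top-right, bottom-right, bottom-left.
    """
    src = np.asarray(corners_observed, dtype=np.float32)
    dst = np.array([[0, 0], [width - 1, 0],
                    [width - 1, height - 1], [0, height - 1]],
                   dtype=np.float32)
    H = cv2.getPerspectiveTransform(src, dst)   # 3x3 planar homography
    return cv2.warpPerspective(frame, H, (width, height))
```

Because the homography is planar, such a correction cannot account for the depth variation of real building or terrain surfaces, which is exactly the gap the present three-dimensional method addresses.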




Embodiment Construction

[0057] The technical solutions in the embodiments of the present invention will be described clearly and completely below in conjunction with the accompanying drawings. It should be understood that the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those skilled in the art without creative effort fall within the protection scope of the present invention.

[0058] (1) Construct a virtual scene. Use pre-acquired remote sensing images to set the terrain texture, build models whose surfaces carry static texture images, and compose them into a virtual scene in virtual space. The spatial position of each model in the scene, and the relative positions, orientations, sizes, and other relations between models, should be kept as consistent with the real scene as possible.
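
A minimal sketch of what step (1) amounts to in code (all names, file paths, and coordinates below are hypothetical placeholders, not from the patent): each model carries a static surface texture and a pose chosen to match the surveyed real scene.

```python
# Hypothetical scene description for step (1): statically textured
# models placed so their pose and scale match the real scene.
from dataclasses import dataclass

@dataclass
class Model:
    mesh_path: str          # geometry with a static texture on its surface
    position: tuple         # (x, y, z) in scene coordinates, from survey data
    orientation_deg: float  # heading, matched to the real building/terrain
    scale: float

# Terrain textured with a pre-acquired remote sensing image.
terrain = Model("terrain.obj", (0.0, 0.0, 0.0), 0.0, 1.0)
building = Model("building_a.obj", (120.5, 64.2, 0.0), 32.0, 1.0)
scene = [terrain, building]
```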

[0059] (2) Acquire v...



Abstract

The invention relates to an automatic matching correction method for video texture projection in a three-dimensional virtual-real fusion environment, that is, a method for fusing real video images with a virtual scene. The method comprises the steps of constructing the virtual scene, acquiring video data, fusing video textures, and correcting the projector. A captured real video is fused into the virtual scene on complex surfaces such as terrain and buildings by means of texture projection, which improves the expression and presentation of dynamic scene information in the virtual-real environment and enhances the scene's sense of depth and layering. Dynamic video texture coverage of a large-scale virtual scene can be achieved by increasing the number of videos shot from different angles, realizing a dynamic, realistic virtual-real fusion of the virtual environment and the real scene. Color consistency processing is applied to the video frames in advance, which eliminates obvious color jumps and improves the visual effect. With an automatic correction algorithm, the virtual scene and the real video are fused more precisely.
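
The abstract does not specify the color-consistency algorithm, so the following is a hedged sketch of one common choice: matching each frame's per-channel mean and standard deviation to a reference frame so that successive video textures do not jump in color. The function name is an assumption.

```python
# One plausible form of "color consistency processing" on video frames:
# per-channel mean/std matching of each frame to a reference frame.
import numpy as np

def match_color_stats(frame, reference):
    """Shift/scale each channel of `frame` to the reference's statistics."""
    frame = frame.astype(np.float64)
    reference = reference.astype(np.float64)
    out = np.empty_like(frame)
    for c in range(frame.shape[2]):
        mu_f = frame[..., c].mean()
        sd_f = frame[..., c].std() + 1e-8   # guard against flat channels
        mu_r = reference[..., c].mean()
        sd_r = reference[..., c].std()
        out[..., c] = (frame[..., c] - mu_f) / sd_f * sd_r + mu_r
    return np.clip(out, 0, 255).astype(np.uint8)
```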

Description

Technical field

[0001] The present invention relates to virtual reality, and in particular to a method for fusing and correcting real video images and virtual scenes. It belongs to the technical fields of virtual reality, computer graphics, computer vision, and human-computer interaction.

Background technique

[0002] In virtual reality systems, the most common way to express the details of buildings or the ground surface is to use static pictures, usually realized by texture mapping. The disadvantage of this method is that the texture of a scene surface never changes once it is set; ignoring the changing elements on the surfaces of scene models reduces the realism of the virtual environment and cannot give people an immersive feeling. To eliminate the lack of realism caused by static pictures, an intuitive idea is to use video instead of pictures. At this stage there are also some systems that add video elements, but most of them use the fo...
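
Since the description turns on texture mapping, a short sketch of the standard projective-texturing computation underlying video texture projection may help (a textbook technique, not the patent's specific correction algorithm): each visible surface point is mapped through the virtual projector's view and projection matrices into coordinates of the current video frame. The matrix conventions and function name are assumptions.

```python
# Standard projective texture mapping: map a 3-D surface point into the
# [0,1]^2 coordinates of the video frame cast by a virtual projector.
import numpy as np

def projective_tex_coord(p_world, view, proj):
    """view, proj: 4x4 matrices of the virtual projector that stands in
    for the real camera which shot the video."""
    p = np.append(np.asarray(p_world, dtype=np.float64), 1.0)
    clip = proj @ view @ p          # transform into clip space
    ndc = clip[:3] / clip[3]        # perspective divide, NDC in [-1, 1]
    return (ndc[:2] + 1.0) / 2.0    # bias/scale into texture space
```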


Application Information

IPC(8): G06T7/00
Inventors: 高鑫光, 兰江, 李胜, 汪国平
Owner: 北京微视威信息科技有限公司