
A projection mapping method of video in 3D scene based on screen space

A screen-space technology for 3D scenes, applied in the field of texture mapping and projection

Active Publication Date: 2022-07-19
ZHEJIANG UNIV


Problems solved by technology

[0006] The present invention proposes a screen-space-based method for projection mapping of video in a three-dimensional scene, with the purpose of solving the occlusion-penetration problem caused by missing depth values in the projective texture mapping algorithm. By projecting the texture into the three-dimensional scene in screen space, the texture and the scene blend together more naturally.
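To illustrate the occlusion-penetration problem named above: naive projective texture mapping applies the texture to every surface inside the projector's frustum, even surfaces hidden behind closer geometry. The standard remedy (shadow-map-style depth comparison, shown here only as an illustration of the problem, not as the patented method; all names are hypothetical) also tests the point's depth against the nearest depth the projector recorded for that pixel:

```python
def project_with_depth_test(clip, depth_map, eps=1e-3):
    """Return True if a point (given in the projector's clip space as
    (x, y, z, w)) should receive the projected texture.

    depth_map is a 2D list holding, per projector pixel, the depth of the
    nearest surface seen from the projector. Without this comparison, any
    point inside the frustum would be textured -- the occlusion penetration
    the invention sets out to avoid."""
    x, y, z, w = clip
    nx, ny, nz = x / w, y / w, z / w          # perspective divide -> NDC
    if not all(-1.0 <= c <= 1.0 for c in (nx, ny, nz)):
        return False                           # outside the projector frustum
    h, wpx = len(depth_map), len(depth_map[0])
    u = min(int((nx * 0.5 + 0.5) * wpx), wpx - 1)
    v = min(int((ny * 0.5 + 0.5) * h), h - 1)
    stored = depth_map[v][u]                   # nearest depth the projector saw
    return nz <= stored + eps                  # farther points are occluded
```

A point at NDC depth 0.5 behind a recorded surface at depth 0.2 is rejected, whereas the naive inside-frustum test would have textured it.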



Embodiment Construction

[0035] The present invention will be described in detail below with reference to the embodiments and the accompanying drawings, but the present invention is not limited thereto.

[0036] The flow of the screen-space-based texture projection mapping algorithm in this embodiment is shown in Figure 1. It comprises three steps: preprocessing of the projection source, drawing the 3D scene from the current viewpoint, and drawing the texture from the projection source.
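The three-step flow described above can be sketched as a rendering loop. This is a hypothetical skeleton with placeholder bodies (the `"MVP"` and depth-map stubs stand in for real GPU work), not the patented implementation:

```python
def preprocess_projection_source(src):
    """Step 1 (stub): attach the source's combined MVP matrix and its
    depth map; in a real renderer these come from the projector's
    parameters and a depth-only render pass."""
    src["mvp"] = "MVP"          # placeholder for the real 4x4 matrix
    src["depth_map"] = {}       # placeholder for a rendered depth map
    return src

def render_frame(scene, camera, sources, log):
    # Step 1: per-source preprocessing.
    for src in sources:
        preprocess_projection_source(src)
        log.append("preprocess")
    # Step 2: draw the 3D scene from the current viewpoint.
    log.append("draw_scene")
    # Step 3: project each source's texture in screen space.
    for src in sources:
        log.append("draw_texture")
    return log
```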

[0037] (1) Preprocessing process

[0038] For each projection source in the scene, parameter information such as its position and orientation in the 3D scene must first be obtained. From the acquired intrinsic and extrinsic parameters of the projection source, the model matrix M, the view matrix V and the projection matrix P of the projection source can be calculated, and the MVP matrix T of the projection can be obtained b...
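The composition of M, V and P into one combined matrix is standard graphics practice and can be sketched as follows. This is a minimal illustration assuming an OpenGL-style row-major convention; the actual matrices of the invention depend on the projector's measured parameters:

```python
import math

def mat4_mul(a, b):
    """Row-major 4x4 matrix product a @ b."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def perspective(fov_y_deg, aspect, near, far):
    """OpenGL-style perspective projection matrix P from the projector's
    vertical field of view, aspect ratio, and near/far planes."""
    f = 1.0 / math.tan(math.radians(fov_y_deg) / 2.0)
    return [
        [f / aspect, 0.0, 0.0,                           0.0],
        [0.0,        f,   0.0,                           0.0],
        [0.0,        0.0, (far + near) / (near - far),   2.0 * far * near / (near - far)],
        [0.0,        0.0, -1.0,                          0.0],
    ]

identity = [[float(i == j) for j in range(4)] for i in range(4)]

M = identity                         # model matrix (placeholder pose)
V = identity                         # view matrix (projector at origin, -z forward)
P = perspective(60.0, 16 / 9, 0.1, 100.0)
T = mat4_mul(P, mat4_mul(V, M))      # combined MVP matrix of the projection
```

With identity M and V the combined matrix T reduces to P; with a real projector pose, V and M carry its extrinsic parameters while P carries the intrinsics.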



Abstract

This invention discloses a screen-space-based video projection mapping method in a three-dimensional scene. The texture is drawn from the projection source, and each pixel of the projection image is converted to the screen space of the viewpoint for the projection calculation. This alleviates the occlusion-penetration problem caused by missing depth values in traditional projective texture mapping; the rendering quality is significantly enhanced, the number of draw batches is reduced, and rendering efficiency is higher. The algorithm of the invention is clear, its results are robust, and its runtime efficiency is high, making it well suited to real-time rendering systems and to integration with large-scale three-dimensional video surveillance systems.
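The "convert each pixel to the screen space of the viewpoint" step in the abstract rests on the standard viewport transform from normalized device coordinates to pixel coordinates. A minimal sketch (assuming NDC in [-1, 1] and a top-left pixel origin, as in most screen spaces; the patent's exact convention is not stated here):

```python
def ndc_to_screen(ndc_x, ndc_y, width, height):
    """Viewport transform: map NDC in [-1, 1]^2 to pixel coordinates,
    flipping y so the origin sits at the top-left of the screen."""
    sx = (ndc_x * 0.5 + 0.5) * width
    sy = (1.0 - (ndc_y * 0.5 + 0.5)) * height
    return sx, sy
```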

Description

Technical Field

[0001] The invention relates to texture mapping and projection in computer graphics, and in particular to a screen-space-based projection mapping and fusion method for video in a three-dimensional scene.

Background Technique

[0002] In recent years, the common approach to 3D video surveillance has been to collect video data from surveillance cameras at various locations in a city and display it in the 3D scene in the form of annotations. When monitoring personnel want to call up the surveillance for a certain place, they need only open the camera in the corresponding area to observe the current real-time video. However, this method simply juxtaposes the two-dimensional surveillance video with the three-dimensional scene, that is, watching the surveillance video inside the 3D scene. Compared with a video wall, it only adds the 3D position information of the camera, and does not fully exert ...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T15/04
CPC: G06T15/04
Inventors: 郑文庭, 李融, 鲍虎军
Owner: ZHEJIANG UNIV