
Method for converting 3D model into three-dimensional double-viewpoint view

A dual-viewpoint and 3D-model technology, applied in the field of 3D display, achieving strong applicability and realistic, scientifically sound effects

Status: Inactive; Publication Date: 2017-07-28
HANGZHOU DIANZI UNIV

AI Technical Summary

Problems solved by technology

Therefore, the present invention starts from the principle of stereo vision and focuses on how to use OpenGL to extract multi-viewpoint images from computer virtual 3D models and convert them into stereoscopic dual-viewpoint parallax images, thereby solving the 3D display problem.



Examples


Embodiment Construction

[0057] The present invention will be described in detail below in combination with specific embodiments.

[0058] As shown in Figures 1-3, a method for converting a 3D model into a stereoscopic dual-viewpoint view specifically includes the following steps:

[0059] Step 1: Select a convergent observation model

[0060] Observation models mainly include convergent observation models and parallel observation models. The present invention selects the convergent observation model.

[0061] Figure 1 is a schematic diagram of convergent projection, and Figure 2 shows the truncated viewing pyramid (frustum) used for convergent projection. Here, Top, Bottom, Left, and Right are the distances from the upper, lower, left, and right edges of the front (near) clipping plane of the frustum shared by the left and right eyes to its center, Near is the distance from the front clipping plane to the viewpoint, and Far is the distance from the rear clipping plane to the viewpoint.
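
The frustum parameters above translate directly into an asymmetric (shifted) OpenGL frustum for each eye. The following is a minimal sketch, not taken from the patent text, assuming the common off-axis formulation and the GLM library; every identifier and parameter name (EyeMatrices, makeEyeMatrices, eyeOffset, convergence, fovY, aspect) is an illustrative assumption.

```cpp
// Sketch only (not the patent's code): per-eye view and projection matrices
// for a convergent observation model, using the standard off-axis
// "shifted frustum" formulation with GLM.
#include <cmath>
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

struct EyeMatrices {
    glm::mat4 proj;  // asymmetric frustum projection for this eye
    glm::mat4 view;  // eye translated horizontally, viewing axes kept parallel
};

EyeMatrices makeEyeMatrices(float eyeOffset,    // -separation/2 (left) or +separation/2 (right)
                            float convergence,  // distance to the zero-parallax plane
                            float fovY, float aspect,
                            float zNear, float zFar)
{
    // Shared frustum extents on the near clipping plane (Top/Bottom/Left/Right in Figure 2).
    float top    = zNear * std::tan(fovY * 0.5f);
    float bottom = -top;
    float halfW  = top * aspect;

    // Frustum shift: sliding the left/right bounds makes both frusta
    // intersect exactly on the plane at distance 'convergence'.
    float shift = eyeOffset * zNear / convergence;

    EyeMatrices eye;
    eye.proj = glm::frustum(-halfW - shift, halfW - shift, bottom, top, zNear, zFar);
    eye.view = glm::translate(glm::mat4(1.0f), glm::vec3(-eyeOffset, 0.0f, 0.0f));
    return eye;
}
```

Calling makeEyeMatrices with eyeOffset equal to minus and plus half the eye separation yields left- and right-eye matrices whose frusta share the near-plane extents of Figure 2 but converge on the zero-parallax plane at the chosen convergence distance.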

[0062] Step 2: Calcu...



Abstract

The invention discloses a method for converting a 3D model into a three-dimensional double-viewpoint view. The method comprises the steps of: 1) selecting a 3D model to be converted, calculating the frustum shift and the left- and right-eye frustum parameters of the 3D model according to input parameters, and establishing the projection matrix and view matrix of the left eye and the projection matrix and view matrix of the right eye; 2) obtaining the left- and right-eye MVPs from the projection matrices, view matrices and model matrices of the left and right eyes and passing the left- and right-eye MVPs into a shader; 3) premultiplying each vertex coordinate of the 3D model by the left-eye MVP and passing the result into the shader, then premultiplying the vertex coordinate by the right-eye MVP and passing that result into the shader, thereby obtaining new vertex coordinates; once every vertex of the 3D model has been converted, the left- and right-eye images of the 3D model are obtained; and 4) mapping and splicing the obtained left- and right-eye images onto a screen, thereby obtaining a double-viewpoint disparity map of the 3D model. The method operates quickly, approximates the viewing habits of the human eyes, and satisfies binocular convergence simulation.
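
As a hedged illustration of steps 2) to 4) of the abstract, the per-eye MVP can be assembled on the CPU, uploaded to the shader, and each eye rendered into one half of the screen. The sketch below is not the patent's implementation; it assumes an OpenGL 3+ context, the GLM library and a loader such as GLAD, and the names drawModel and uMvpLocation are hypothetical application-side identifiers.

```cpp
// Sketch of abstract steps 2)-4): build MVP = Projection * View * Model per
// eye, upload it to the vertex shader, and splice the two eye images side by
// side via viewports. Assumes an OpenGL 3+ context with a loader and GLM.
#include <glad/glad.h>          // any GL loader works; GLAD is an assumption
#include <glm/glm.hpp>
#include <glm/gtc/type_ptr.hpp>

extern void drawModel();        // issues the model's draw call (hypothetical)

void renderStereoFrame(GLuint program, GLint uMvpLocation,
                       const glm::mat4& modelMat,
                       const glm::mat4& projL, const glm::mat4& viewL,
                       const glm::mat4& projR, const glm::mat4& viewR,
                       int winW, int winH)
{
    glUseProgram(program);

    // Step 2: left- and right-eye MVPs.
    glm::mat4 mvpL = projL * viewL * modelMat;
    glm::mat4 mvpR = projR * viewR * modelMat;

    // Step 3: the vertex shader premultiplies each vertex by the uploaded MVP,
    // e.g.  gl_Position = uMVP * vec4(aPosition, 1.0);
    // Step 4: map each eye image to one half of the screen (side-by-side splice).
    glViewport(0, 0, winW / 2, winH);                              // left half
    glUniformMatrix4fv(uMvpLocation, 1, GL_FALSE, glm::value_ptr(mvpL));
    drawModel();

    glViewport(winW / 2, 0, winW / 2, winH);                       // right half
    glUniformMatrix4fv(uMvpLocation, 1, GL_FALSE, glm::value_ptr(mvpR));
    drawModel();
}
```

The two glViewport calls realize the side-by-side splicing; other layouts (for example top-bottom) would only change the viewport arithmetic.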

Description

Technical Field

[0001] The invention relates to 3D display technology, and in particular to a method for converting an OpenGL-based 3D model into a stereoscopic dual-viewpoint view.

Background Technique

[0002] Humans live in a three-dimensional world and use stereo vision to perceive it. With the rapid development of computer technology, computers describe the real world in more and more ways: from sound to images to video, the world that computers can represent is becoming more and more complex. At present, most display devices are still only capable of 2D display, which discards depth information. In the era of digitalization and modernization, 2D display can no longer meet human needs, so 3D models have entered people's life, study and work as a new media form and will soon be accepted by the general public. It has been more and more widely us...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): H04N13/02; H04N13/04
CPC: H04N13/275
Inventors: 麻辉文, 颜成钢, 张新, 李亚菲, 李宁, 陈泽伦
Owner: HANGZHOU DIANZI UNIV