Method and system for generating reality-fused three-dimensional dynamic image

A three-dimensional dynamic image generation technology, applied in the field of image processing, achieving diverse forms of expression, ease of updating and interaction, and high reproducibility

Inactive Publication Date: 2017-09-05
张照亮

AI Technical Summary

Problems solved by technology

[0011] The purpose of the present invention is to provide a reality-fused 3D dynamic image generation method and system that solves the problems of existing 3D ground paintings and realizes their artistic effect, with a wide range of applications, diverse forms of expression, and ease of updating and interaction.




Embodiment Construction

[0043] The following examples are used to illustrate the present invention, but are not intended to limit the scope of the present invention.

[0044] Referring to Figure 1, and further to Figure 2, a reality-fused 3D dynamic image generation method comprises the following steps:

[0045] S1: Obtain the image 2 of the first real scene 7 in the first three-dimensional space 1 through the first camera 3; denote the three-dimensional space matrix of the image 2 of the first real scene 7 in the first three-dimensional space 1 as Q1; convert the three-dimensional space matrix Q1 into a two-dimensional plane matrix R1 of the first camera 3 through perspective projection;

[0046] S2: Convert the obtained two-dimensional plane matrix R1 of the first camera 3 into a corresponding two-dimensional plane matrix R2 in the second three-dimensional space 4;

[0047] S3: Convert the two-dimensional plane matrix R2 in the second three-dimensional space 4 into a two-dimensional plane matrix R3 of the second camera;
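The transformation chain S1–S3 can be sketched numerically. The following is a minimal illustration, not the patented implementation: the intrinsic matrix `K1`, the sample points `Q1`, and the use of a plane-to-plane homography to relate the two camera image planes are all assumptions made for demonstration (a homography is a standard way to map planar content between two camera views).

```python
import numpy as np

def perspective_project(points_3d, K):
    """Project Nx3 camera-frame 3D points onto the 2D image plane
    (pinhole model): apply the intrinsics, then divide by depth."""
    pts = np.asarray(points_3d, dtype=float)
    uvw = (K @ pts.T).T              # homogeneous image coordinates
    return uvw[:, :2] / uvw[:, 2:3]  # perspective division

def apply_homography(pts_2d, H):
    """Map 2D points from one camera's image plane to another's
    via a 3x3 homography H (assumed known, e.g. from calibration)."""
    pts_h = np.hstack([pts_2d, np.ones((len(pts_2d), 1))])
    mapped = (H @ pts_h.T).T
    return mapped[:, :2] / mapped[:, 2:3]

# Hypothetical intrinsic matrix for the first camera (assumed values).
K1 = np.array([[800.0,   0.0, 320.0],
               [  0.0, 800.0, 240.0],
               [  0.0,   0.0,   1.0]])

# Q1: sample 3D points of the first real scene, in the first camera's frame.
Q1 = np.array([[ 0.5, 0.2, 2.0],
               [-0.3, 0.1, 3.0]])

R1 = perspective_project(Q1, K1)          # step S1: Q1 -> R1
H12 = np.eye(3)                            # placeholder homography (assumed)
R2 = apply_homography(R1, H12)             # step S2: R1 -> R2
```

Step S3 would apply the second camera's own plane mapping in the same way to obtain R3.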



Abstract

The invention discloses a method and a system for generating a reality-fused three-dimensional dynamic image. The method comprises the following steps: a first real-scene image in a first three-dimensional space is acquired by a first camera, the three-dimensional space matrix of the first real-scene image is denoted Q1, and Q1 is converted into a two-dimensional plane matrix R1 of the first camera through perspective projection; the obtained matrix R1 is converted into a corresponding two-dimensional plane matrix R2 in a second three-dimensional space; R2 is converted into a two-dimensional plane matrix R3 of a second camera; and, with the viewpoint of the first camera serving as the human-eye viewpoint in real space, the first real-scene image corresponding to R3 is projected into the second real scene corresponding to the viewpoint of the second camera. Given a specified human viewpoint and a specified viewpoint of the surrounding environment, the actually seen scene is fused with a planar pattern generated in three-dimensional software, forming a naked-eye stereoscopic effect.

Description

technical field

[0001] The invention relates to a method and a system for generating a three-dimensional dynamic image fused with reality, and belongs to the technical field of image processing.

Background technique

[0002] 3D street painting displays paintings on the ground to achieve a three-dimensional artistic effect, or directly uses the ground as a carrier for painting creation. Using the outdoor ground as a medium and the principle of plane perspective, it creates a virtual three-dimensional effect that gives visitors an immersive feeling. The scenery in a 3D ground painting is three-dimensional, delicate, and lifelike, and can often achieve the artistic effect of making the fake pass for real. Simulating the effect of three-dimensional space on a two-dimensional plane has always been a focus of human visual art; especially since the Renaissance, solving this problem has become one of the important standards of artistic progress and art h...
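The plane-perspective principle behind 3D street painting described above can be illustrated with a small sketch (all coordinates and the viewer position are assumed for illustration only): each point of a virtual upright object is carried along the line of sight from a fixed eye point down onto the ground plane, which is where it must be painted so that it appears to stand upright from that viewpoint.

```python
import numpy as np

def anamorphic_ground_point(eye, p_virtual):
    """Intersect the ray from the eye through a virtual 3D point with the
    ground plane z = 0. The result is where the point must be painted on
    the ground to appear at p_virtual when seen from the eye position."""
    eye = np.asarray(eye, dtype=float)
    p = np.asarray(p_virtual, dtype=float)
    d = p - eye                 # direction of the sight line
    t = -eye[2] / d[2]          # ray parameter where z reaches 0
    return eye + t * d

# Assumed viewpoint: eye 1.6 m above the ground, 4 m in front of the artwork.
eye = [0.0, -4.0, 1.6]
# Virtual point: the top of a 1 m "pillar" standing at the origin.
top = [0.0, 0.0, 1.0]
ground_pt = anamorphic_ground_point(eye, top)
```

Note how the painted point lands well behind the pillar's base: this stretching along the ground is exactly the anamorphic distortion visible in 3D street paintings when viewed from anywhere other than the intended viewpoint.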

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T13/20, G06T13/40, G06T17/00, G06T19/00
CPC: G06T13/20, G06T13/40, G06T17/00, G06T19/006
Inventor: 张照亮
Owner: 张照亮