
Fused reality interaction system and method

An interactive system and method in the field of virtual reality, addressing problems such as limited user activity and field of view, and the inability to directly view real targets

Inactive Publication Date: 2021-09-17
Z2D VISION TECH (NANJING) CO LTD

AI Technical Summary

Problems solved by technology

During the experience, the user's activity and field of view are limited, and real-world targets cannot be viewed directly.



Examples


Embodiment 1

[0021] Figure 1 is a schematic structural diagram of the fused reality interaction system provided by Embodiment 1 of the present invention. As shown in Figure 1, the fused reality interaction system includes: a naked-eye 3D display device 110, an interaction device 120 and a processing device 130.

[0022] The naked-eye 3D display device 110 is configured to acquire and display a fused reality image.

[0023] The interaction device 120 is configured to collect the physical image of the physical object and/or the interaction input data of the physical object.

[0024] The processing device 130 is configured to respectively acquire the virtual image, the physical image and/or the interaction input data, and to map the adjusted virtual image and physical image into a fusion coordinate system according to the interaction input data, so as to form the fused reality image provided to the naked-eye 3D display device.

[0025] Wherein, the physical object may be the user ...
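The three-component structure of Embodiment 1 can be sketched as a minimal pipeline. This is an illustrative rendition only: the class names, the toy image data, and the gesture-style input offset are all assumptions, since the patent text does not specify data formats.

```python
# Hypothetical sketch of the three components in Embodiment 1
# (names and data formats are illustrative, not taken from the patent).

class InteractionDevice:
    """Collects the physical image and/or interaction input of the physical object."""
    def capture(self):
        # Stand-in data: a tiny 'physical image' plus a gesture-style offset.
        return {"physical_image": [[1, 1], [1, 1]], "input": {"dx": 2, "dy": 0}}

class ProcessingDevice:
    """Maps the adjusted virtual and physical images into a fusion coordinate system."""
    def fuse(self, virtual_image, capture):
        dx = capture["input"]["dx"]
        # Toy 'mapping': record the interaction-driven adjustment and stack both layers.
        return {"virtual_offset": dx,
                "layers": [virtual_image, capture["physical_image"]]}

class NakedEye3DDisplay:
    """Acquires and displays the fused reality image (here: simply returns it)."""
    def show(self, fused):
        return fused

interaction = InteractionDevice()
processor = ProcessingDevice()
display = NakedEye3DDisplay()
fused = processor.fuse([[0, 0], [0, 0]], interaction.capture())
result = display.show(fused)
```

The point of the sketch is the data flow: the interaction device feeds the processing device, which alone produces the fused image consumed by the display.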

Embodiment 2

[0041] Figure 2 is a schematic structural diagram of the fused reality interaction system in Embodiment 2 of the present invention; this embodiment refines the foregoing embodiments. As shown in Figure 2, the fused reality interaction system includes: a naked-eye 3D display device 110, an interaction device 120 and a processing device 130.

[0042] Optionally, the processing device 130 includes:

[0043] The virtual image conversion unit 131 is configured to acquire a virtual image, and convert the virtual image from a virtual coordinate system to a fusion coordinate system according to a virtual fusion transformation relationship;

[0044] The entity image conversion unit 132 is configured to acquire the entity image, and convert the entity image from the entity coordinate system to the fusion coordinate system according to the entity fusion transformation relationship;

[0045] The image fusion unit 133 is configured to superimpose the con...
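The two conversion units above can be pictured as coordinate-frame changes. A minimal sketch follows, assuming each "fusion transformation relationship" is a 4x4 homogeneous matrix; that representation and the concrete matrix values are assumptions for illustration, not specified in the patent.

```python
# Minimal sketch of units 131-133, assuming each transformation relationship
# is a 4x4 homogeneous matrix (an assumption; the patent does not specify).

def mat_vec(m, v):
    """Multiply a 4x4 matrix by a homogeneous point [x, y, z, 1]."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

# Virtual -> fusion: e.g. scale the virtual scene by 0.5 (illustrative values).
virtual_to_fusion = [[0.5, 0, 0, 0],
                     [0, 0.5, 0, 0],
                     [0, 0, 0.5, 0],
                     [0, 0, 0, 1]]
# Entity -> fusion: e.g. translate the captured entity by (1, 2, 0).
entity_to_fusion = [[1, 0, 0, 1],
                    [0, 1, 0, 2],
                    [0, 0, 1, 0],
                    [0, 0, 0, 1]]

virtual_point = [4.0, 4.0, 2.0, 1.0]  # a point in the virtual coordinate system
entity_point = [0.0, 0.0, 0.0, 1.0]   # a point in the entity coordinate system

v_fused = mat_vec(virtual_to_fusion, virtual_point)  # -> [2.0, 2.0, 1.0, 1.0]
e_fused = mat_vec(entity_to_fusion, entity_point)    # -> [1.0, 2.0, 0.0, 1.0]

# Image fusion unit: once both live in the fusion frame they can be superimposed.
fused_scene = [v_fused, e_fused]
```

The design point is that virtual and entity content become comparable only after both are expressed in the shared fusion coordinate system; the image fusion unit then operates in that single frame.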

Embodiment 3

[0059] Figure 3A is a flow chart of the fused reality interaction method provided by Embodiment 3 of the present invention. This embodiment is applicable to interaction scenarios in virtual reality, augmented reality or fused reality systems. The method can be executed by the fused reality interaction system of the embodiments of the present invention, and the system can be realized by software and/or hardware. As shown in Figure 3A, the method specifically includes the following steps:

[0060] Step 310: respectively acquire the virtual image, physical image and interaction input data, and map the adjusted virtual image and physical image into a fusion coordinate system according to the interaction input data, so as to form the fused reality image.

[0061] Entity images can be obtained through the interaction device, virtual images can be obtained from the model image data of any software platform or software engine, and the fused reality image is processed by processi...
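Step 310 can be illustrated as a single function over toy 2D grids. This is an assumption-laden sketch: it treats the interaction input as a 2D offset that adjusts where the virtual image lands in the fusion frame, and lets non-zero physical pixels take priority; the patent leaves both choices unspecified.

```python
# Toy rendition of Step 310, assuming the interaction input is a (dy, dx)
# offset that adjusts the virtual image's placement (illustrative only).

def step_310(virtual, physical, offset):
    """Map virtual and physical 2D grids into one fused grid.

    Non-zero physical pixels win; virtual pixels are shifted by `offset` first.
    """
    h, w = len(physical), len(physical[0])
    dy, dx = offset
    fused = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vy, vx = y - dy, x - dx  # virtual pixel that maps onto (y, x)
            v = virtual[vy][vx] if 0 <= vy < h and 0 <= vx < w else 0
            fused[y][x] = physical[y][x] if physical[y][x] else v
    return fused

virtual = [[5, 5], [0, 0]]
physical = [[0, 9], [0, 0]]
fused = step_310(virtual, physical, (1, 0))  # -> [[0, 9], [5, 5]]
```

Here the interaction offset (1, 0) pushes the virtual row down one cell, and the physical pixel 9 overlays whatever virtual content shares its position.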


Abstract

The embodiment of the invention discloses a fused reality interaction system and method. The system comprises: a naked-eye 3D display device, used to acquire and display a fused reality image; an interaction device, used to collect an entity image of an entity object and/or interaction input data of the entity object; and a processing device, used to respectively acquire a virtual image, an entity image and/or interaction input data, map the adjusted virtual image and entity image into a fusion coordinate system according to the interaction input data to form the fused reality image, and provide the fused reality image to the naked-eye 3D display device. The system solves the problems that a user's activity and field of view are limited by having to wear equipment during the experience, and that real targets cannot be viewed directly.

Description

Technical Field

[0001] Embodiments of the present invention relate to virtual reality technology, and in particular to a fused reality interaction system and method.

Background

[0002] With the iterative development of science and technology, virtual reality technology has penetrated more and more into people's lives. People are now increasingly eager to build an information loop capable of interactive feedback between real-world users and the virtual world, so as to give users a more realistic experience.

[0003] In existing virtual reality, augmented reality and fused reality display interaction systems, users need wearable devices to experience the virtual world, or to interact with a world in which reality and fantasy are interwoven.

[0004] However, during the user's experience the wearable device must be in contact with the human body, so the user's activity and field of view are limited, and real-world objects cannot be viewed dir...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T19/00, G06F3/01
CPC: G06T19/006, G06F3/011
Inventor: 汪洋
Owner: Z2D VISION TECH (NANJING) CO LTD