A mixed reality-based scene roaming experience method and experience system

A mixed reality technology combining virtual and real scenes, applied in the field of scene roaming experience methods and systems. It addresses the problems that two-dimensional renderings have no three-dimensional effect, that users cannot understand the decoration pattern, and that users cannot personally experience the real effect of a design scene, while also avoiding motion sickness.

Active Publication Date: 2022-06-10
HANGZHOU QUNHE INFORMATION TECHNOLOGIES CO LTD

AI Technical Summary

Problems solved by technology

[0002] When a design scene (such as an architectural interior space design scheme) is previewed, it is generally presented to the user as a two-dimensional rendering. Although the rendering is clear and exquisite, it has no three-dimensional effect and the user cannot personally experience the real effect of the design scene; it therefore cannot meet users' needs.
[0003] With the development of VR (Virtual Reality) technology, people use VR head-mounted display devices (including VR glasses, VR goggles, VR helmets, etc.) to view the presented design scene and can see its three-dimensional effect. However, the following problem arises: when the refresh rate, flicker, gyroscope, and similar factors of the VR glasses introduce high latency, the user suffers visual motion sickness.
Although existing technology shows users the future decoration effect by means of augmented reality, it only displays the constructed 3D model of the decoration pattern. When users walk around the room, they cannot view the decoration of whatever area they move to, so they cannot get an immersive experience.

Method used




Embodiment Construction

[0029] The mixed reality-based scene roaming experience system provided by an embodiment of the present invention includes: a server, and a service

[0030] The mobile terminal can be a mobile phone, a notebook computer, a tablet computer, etc., equipped with environment perception and motion tracking technology and capable of

[0037] Specifically, Unity3d can be used as the rendering engine on the mobile terminal and as the automatic scene construction application on the server.

[0043] (a) determine any number of virtual markers in the virtual scene data.

[0046] The identification picture can be a picture with complex texture information, and the shape of the picture is set according to the virtual marker point,

[0049] (d) Calculate the virtual center of gravity of all virtual markers and the real center of gravity of all real markers.
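
Purely as an illustration of step (d), the following Python sketch computes the two centers of gravity and derives a translation that moves the virtual markers onto the real markers. The function names, the translation-only alignment, and the sample coordinates are assumptions for the sketch, not details taken from the patent.

import numpy as np

def centroid(points):
    # Center of gravity = arithmetic mean of the 3D point positions.
    return np.mean(np.asarray(points, dtype=float), axis=0)

def align_virtual_to_real(virtual_markers, real_markers):
    # Virtual and real centers of gravity, as in step (d).
    virtual_cog = centroid(virtual_markers)
    real_cog = centroid(real_markers)
    # Translation-only alignment: shift every virtual marker so the two
    # centers of gravity coincide (rotation and scale handling is omitted).
    offset = real_cog - virtual_cog
    aligned = np.asarray(virtual_markers, dtype=float) + offset
    return offset, aligned

# Hypothetical marker positions in meters.
virtual = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
real = [(2.0, 0.5, 0.1), (3.0, 0.5, 0.1), (2.0, 1.5, 0.1)]
offset, aligned = align_virtual_to_real(virtual, real)
print(offset)  # translation from the virtual frame to the real frame

Matching the centers of gravity gives a single translation for the whole marker set, which is less sensitive to noise in any one marker than aligning a single pair of points.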

[0059] Compared with the number of faces of the 3D model, the rendering material has a greater impact on the visual effect of the rendering. Therefore, the ...



Abstract

The invention discloses a mixed reality-based scene roaming experience method that uses a mobile terminal with environment perception and motion tracking technology and an AR display device with a semi-transparent display screen. The method includes: the mobile terminal matches the received virtual scene data against the real scene data collected in real time; the mobile terminal updates a first viewpoint and a second viewpoint according to the acquired real-time motion data, renders the received virtual scene data in real time from the first viewpoint and the second viewpoint into a first virtual scene and a second virtual scene, and displays them on a binocular split screen; the display result is projected onto the semi-transparent display screen of the AR display device; the first real scene and the second real scene received by the user's left eye and right eye respectively interact, through the semi-transparent display screen, with the first virtual scene and the second virtual scene, thereby realizing the scene roaming experience. Also disclosed is a mixed reality-based scene roaming experience system, which can bring users an immersive scene experience.
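
To illustrate the viewpoint update and binocular split-screen rendering described above, here is a minimal Python sketch. The interpupillary distance of 0.064 m, the placeholder render_view function standing in for the actual rendering engine, and the 3x3 rotation matrix used as the head orientation are all assumptions for the sketch and are not specified by the patent.

import numpy as np

IPD = 0.064  # assumed interpupillary distance in meters

def eye_viewpoints(head_position, head_rotation):
    # Derive the first (left-eye) and second (right-eye) viewpoints from the
    # tracked head pose: offset by half the IPD along the head's local x axis.
    right_axis = head_rotation[:, 0]
    left_eye = head_position - right_axis * (IPD / 2.0)
    right_eye = head_position + right_axis * (IPD / 2.0)
    return left_eye, right_eye

def render_view(scene, viewpoint, size=(240, 320)):
    # Placeholder renderer: a real system would rasterize the virtual scene
    # from the given viewpoint; here it returns a blank image of that size.
    return np.zeros(size)

def render_split_screen(scene, head_position, head_rotation):
    # Render the virtual scene once per eye and place the two images side by
    # side, mirroring the binocular split-screen display in the abstract.
    left_eye, right_eye = eye_viewpoints(head_position, head_rotation)
    left_image = render_view(scene, left_eye)
    right_image = render_view(scene, right_eye)
    return np.hstack([left_image, right_image])

frame = render_split_screen(scene=None,
                            head_position=np.array([0.0, 1.6, 0.0]),
                            head_rotation=np.eye(3))
print(frame.shape)  # (240, 640): left and right half-frames side by side

Each new motion sample from the mobile terminal would update head_position and head_rotation, and the resulting split-screen frame would be projected onto the semi-transparent screen of the AR display device so the virtual scenes overlay the real scenes seen by each eye.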

Description

A mixed reality-based scene roaming experience method and experience system

Technical field

The invention belongs to the field of image processing, and specifically relates to a mixed reality-based scene roaming experience method and experience system.

Background technique

[0002] When a design scene (such as an architectural interior space design scheme) is previewed, it is generally presented to the user as a two-dimensional rendering. Although the rendering is clear and exquisite, it has no three-dimensional effect, and the user cannot personally experience the design scene; it therefore cannot meet users' needs. With the development of VR (Virtual Reality) technology, people use VR head-mounted display devices (including VR glasses, VR goggles, VR helmets, etc.) to view the presented design scene and can see its three-dimensional effect. Howeve...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T19/00, G06F3/01
CPC: G06F3/013, G06T19/006
Inventors: 朱欣定, 郑家祥, 唐睿
Owner: HANGZHOU QUNHE INFORMATION TECHNOLOGIES CO LTD