Multi-user virtual reality interaction environment

A virtual reality and environment technology, applied in the field of multi-user interaction with virtual objects. It addresses the problems of prior systems that provide users no way to interact in a shared physical space, give multiple users no way to simultaneously share a virtual space, and are unable to enable multi-user interaction.

Status: Inactive
Publication Date: 2015-07-09
GREK ANDREJ

Benefits of technology

[0086]Thus, according to embodiments of the present invention it is possible to provide a multi-user interaction with detailed virtual objects of a shared virtual space that is at the same time superimposed for all users identically on a shared physical space, and thereby create a multi-user virtual reality interaction environment experience with simultaneous virtual and physical collaboration and communication of users.
[0087]Further, according to embodiments of the present invention it is possible to provide multi-user interaction with detailed virtual objects of a shared virtual space superimposed on a shared physical space, using portable interactive devices whose movement is tracked in the shared physical space. These devices enable navigation within the shared virtual space and interaction with detailed virtual objects, act as virtual space view controlling devices and virtual cursor pointing devices, and are capable of registering input signals while at the same time displaying views of the shared virtual space.
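As a rough illustration of how a shared virtual space can be superimposed identically on a shared physical space for all users, the sketch below applies one common physical-to-virtual registration transform to every tracked device pose. The function names, the yaw-plus-translation form of the transform, and the coordinates are illustrative assumptions of this sketch, not the patent's prescribed implementation.

```python
import math

def make_registration(yaw_rad, tx, ty, tz):
    """Shared physical->virtual mapping: a yaw rotation about the vertical
    axis followed by a translation, returned as a function of (x, y, z)."""
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    def apply(p):
        x, y, z = p
        return (c * x - s * z + tx, y + ty, s * x + c * z + tz)
    return apply

# One registration for the whole multi-user group (an assumption of this
# sketch): because every user's tracked device pose passes through the SAME
# transform, the virtual space is superimposed identically for everyone.
phys_to_virtual = make_registration(math.pi / 2, 1.0, 0.0, 0.0)

# Tracked positions of two users' portable interactive devices (meters).
device_a = (2.0, 1.5, 0.0)
device_b = (0.0, 1.5, 2.0)

# Both devices map through the identical transform, so a virtual object at
# a fixed virtual coordinate occupies the same physical spot for all users.
print(phys_to_virtual(device_a))
print(phys_to_virtual(device_b))
```

The essential property is that the registration is shared state of the group, not per-user state; per-user transforms are exactly what the patent identifies as breaking the shared superimposition.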

Problems solved by technology

Various examples exist that provide users with such virtual space and virtual object interaction capabilities, and while they utilize a wide range of configurations of display and input devices, these examples are not capable of enabling multi-user interaction with detailed virtual objects of a shared virtual space superimposed for all users identically on a shared physical space.
Computer entertainment software applications which enable multi-user access to virtual spaces are largely focused on providing online connectivity to geographically separated users, and in such scenarios understandably do not provide users with ways to interact in a shared physical space.
Specialized engineering and design software applications that display views of virtual spaces, on the other hand, provide no way for multiple users to simultaneously share virtual spaces and collectively interact with the virtual objects of those spaces.
Therefore, despite being able to move to perform input signals, users are tied to a static display device and cannot turn around and effectively utilize the shared physical space for interaction.
Such software applications, which are using static display devices for displaying views of virtual spaces, are not capable of creating a shared virtual space superimposed on a utilizable shared physical space.
This requirement limits usability of such interaction mechanisms in some multi-user scenarios, where physical face-to-face user communication is necessary, rendering software applications using the interaction mechanisms inappropriate.
Furthermore, while using these software applications users are also attached to the static motion-registering unit to perform motion-based input signals and are unable to fully utilize the physical space for interaction.
The limitation of the previously mentioned software applications, due to which users cannot communicate face-to-face in physical space, applies as well.
Software applications for interaction with virtual spaces using head-mounted display devices for displaying views of virtual spaces are also more difficult to implement than those utilizing conventional display devices.
User interaction with such objects, however, requires either special software functions for zooming in on the details of the virtual objects or positioning users' heads in anatomically very difficult, if not completely impossible, positions.
Movement of a head in space requires the whole body to adjust and follow the movement, which can be very difficult.
Viewing small parts of detailed virtual objects may require users to position their heads into positions that are out of their reach, and therefore may not be possible at all.
The ability to interact with small parts of detailed virtual objects is therefore highly limited using such software applications.
Even if these software applications generate a shared virtual space superimposed on a shared physical space, once a user of a multi-user group performs such input signals there is no possibility of the applications maintaining a shared virtual space that remains superimposed identically on the shared physical space for all users.
Additionally, users who interact with virtual spaces superimposed on physical spaces using these software applications in conjunction with head-mounted display devices that are tracked in physical space, and who also use software functions for zooming in on details of virtual objects, cannot be at the same time present in a shared physical space.
The problem that prevents users from being able to be in a shared physical space is that their shared virtual space is superimposed for each user differently.
Due to head-mounted display devices restricting view of users into physical space, users would not be able to know true positions of other users in their shared space and would involuntarily collide with each other.
Although these software applications do allow each user to view a shared virtual space that is superimposed on the shared physical space differently from other concurrent users, this method has most of the previously mentioned limitations.
Using head-mounted display devices tracked in physical space for interacting with detailed virtual objects of virtual spaces is impractical in general.
Users cannot view small parts of detailed virtual objects or achieve certain viewing angles on virtual objects, as that would involve them positioning their heads in awkward or impossible positions.
Furthermore, software applications for interaction with virtual spaces that utilize augmented reality and display views of virtual spaces overlaid on views of physical space compromise the image quality of one of the views.
When the combined views are displayed using head-mounted display devices that block the user's view of the surrounding physical space, both views are image streams, and the stream containing the view of physical space is of lowered quality because it is captured by a physical camera, which introduces image noise.
Therefore, the resulting image quality of head-mounted display devices that are used with such software applications that utilize augmented reality, is always lower than the image quality of display devices displaying only views of virtual spaces.
In spite of the mentioned capabilities, these software applications are not capable of generating a shared virtual space that is at the same time superimposed for all users identically on a shared physical space.
These software applications are nonetheless limited by their dependence on reference objects or images, which are used as markers by the handheld devices tracking their surrounding physical space.
Multiple users would not be able to share the same physical space, as they would block handheld devices of each of the users from tracking the reference objects located on the perimeter of their surrounding physical space.
Software applications utilizing handheld device based tracking, are therefore unable to create a shared virtual space superimposed for all users identically on a shared physical space.
While displaying views of virtual spaces superimposed on the physical space on the special-purpose handheld device, and allowing multiple users to be present and communicate physically in the same space, pose no problem for these software applications, they rely solely on motion capture cameras with a narrow field of view for tracking, making the system unfeasible and unsuitable for use in regular indoor environments.
It is therefore not possible, while using these software applications, to create a shared virtual space that is at the same time superimposed for multiple users identically on a shared physical space.
These software applications are missing a mechanism of precisely identifying points in virtual space, rendering interaction with details of detailed virtual objects extremely difficult or completely impossible.
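One common way to precisely identify points in virtual space with a tracked handheld device (the mechanism these applications lack) is to cast a cursor ray from the device's tracked pose and intersect it with the geometry of a virtual object. The minimal sketch below intersects such a ray with a single plane; the function name, tolerance, and the poses used are illustrative assumptions, not part of the patent.

```python
def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Return the point where the cursor ray hits the plane, or None if
    the ray is parallel to the plane or the hit lies behind the device."""
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < 1e-9:
        return None  # ray parallel to the plane
    diff = [p - o for p, o in zip(plane_point, origin)]
    t = sum(d * n for d, n in zip(diff, plane_normal)) / denom
    if t < 0:
        return None  # intersection behind the device
    return tuple(o + t * d for o, d in zip(origin, direction))

# Device held 1.2 m above a virtual tabletop (the plane y = 0), pointing
# straight down: the ray picks out one exact point on the virtual object.
hit = ray_plane_intersection((0.5, 1.2, 0.3), (0.0, -1.0, 0.0),
                             (0.0, 0.0, 0.0), (0.0, 1.0, 0.0))
print(hit)
```

In a full system the same ray test would run against each virtual object's surfaces, with the nearest positive hit taken as the virtual cursor position.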
Moreover, these software applications use handheld devices that are positioned by users' hands in physical space, which makes it extremely difficult to interact with virtual spaces using display devices of sizes and weights comparable to desktop display devices.
The sizes and weights of the display devices used by handheld devices are therefore limited to those of mobile devices.
Finally, most software applications for interaction with virtual spaces using mobile handheld devices such as tablets rely on the computing devices included in the handhelds for processing power, and therefore have limited processing capabilities compared to stationary computing devices.
Displaying virtual spaces composed of many detailed virtual objects containing a vast number of geometric features is therefore impossible using these software applications.




Embodiment Construction

[0116]Embodiments of the present invention describe methods, systems, devices and non-transitory computer readable storage mediums storing one or more programs for enabling multi-user interaction with detailed virtual objects of a shared virtual space at the same time superimposed for all users identically on a shared physical space. Embodiments of the invention will be described with reference to the accompanying drawing figures wherein like numbers represent like elements throughout. Before embodiments of the invention are described in detail, it should be understood that the invention is not limited in its application to the details of the examples set forth in the following description or illustrated in the figures.

[0117]It will be obvious, however, to one skilled in the art, that the present invention may be practiced without some or all of these specific details. In other instances, well-known methods, operations, techniques, procedures, systems, storage mediums, circuitry, ne...



Abstract

Systems, devices, methods and computer-readable storage mediums storing instructions are provided that enable multi-user interaction with detailed virtual objects of a shared virtual space superimposed for all users identically on a shared physical space. Experiences that can be achieved by users of a multi-user group during such interaction are collectively called a multi-user virtual reality interaction environment experience. During the enabled interaction, users can collaborate and communicate simultaneously within the shared virtual space and the shared physical space. This results in both spaces being interrelated and intuitive to navigate and creates a virtual reality experience combined with physical reality, which is shared by all users. Interactions with the shared virtual space are performed using manually positioned and operated portable interactive devices with trackers, movement of which is tracked in the shared physical space and is used to control individual views of the detailed virtual objects of the shared virtual space.

Description

BACKGROUND[0001]1. Field of the Invention[0002]The present invention relates generally to multi-user interaction with virtual objects of a virtual 3D environment, rather called virtual space, and more particularly to multi-user interaction with detailed virtual objects of a shared virtual space that is at the same time superimposed for all users identically on a shared physical space, creating a multi-user virtual reality interaction environment experience with simultaneous virtual and physical collaboration and communication of users.[0003]The present invention further relates to multi-user interaction with detailed virtual objects of a shared virtual space superimposed on a shared physical space, using portable interactive devices movement of which is tracked in the shared physical space, enabling navigation within the shared virtual space and interaction with detailed virtual objects, acting as virtual space view controlling devices and virtual cursor pointing devices, capable of...


Application Information

Patent Type & Authority: Applications (United States)
IPC (8): G06T19/00
CPC: G06T19/006; G06T2219/2016; G06T2219/024; G06T19/003; G06F1/1694; G06F3/0346; G06F3/04815
Inventor: GREK, ANDREJ
Owner: GREK ANDREJ