
A spatial positioning method for a virtual reality system based on depth perception

A technology combining depth perception and virtual reality, applied in the field of virtual reality. It addresses problems such as the virtual camera position failing to match the position the user perceives and the resulting inability to interact correctly, thereby improving the virtual-real interactive experience and relieving motion sickness.

Active Publication Date: 2021-09-21
SOUTH CHINA UNIV OF TECH

AI Technical Summary

Problems solved by technology

However, due to key issues such as rendering technology and computer image display, as well as differences in binocular parameters between users, the depth of a stereoscopic image as actually perceived by the user can differ from the depth set by the VR system, which renders the scene through a fixed binocular virtual camera. When such a difference exists, the position of the virtual camera is not equal to the position, relative to the virtual scene, that the user perceives through binocular vision.
Even if the user's pose in physical space is correctly tracked and fed back to the VR system in real time, the user cannot correctly perceive his or her own position in the virtual space, cannot correct the positional relationship between himself or herself and the virtual target, and therefore cannot interact correctly.

Method used



Examples


Embodiment

[0061] A spatial positioning method for virtual reality systems based on depth perception, as shown in Figure 2, includes the following steps:

[0062] Step S1: Model the depth perception of the human visual system in virtual space based on the visual fixation point. The specific implementation steps are as follows:

[0063] S1.1. Real-time tracking of gaze point:

[0064] At time t, the human eyes view the binocular parallax images I_tl and I_tr displayed on the left and right displays of the VR headset and fixate on a target point in the virtual space. The binocular line-of-sight directions are obtained by a gaze-tracking algorithm based on pupil-corneal-reflection technology, and the closest point of the two sight lines in space is then solved using spatial analytic geometry; this closest point is the visual fixation point. The coordinates of the 3D gaze point in the spatial coordinate system are denoted with the superscript V, which represents ...
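The fixation-point step above can be sketched in code. The following is a minimal pure-Python illustration (function names and sample eye positions are hypothetical, not from the patent): given the two eye positions and gaze direction vectors recovered by the tracker, it solves for the closest points of the two sight lines and takes their midpoint as the 3D gaze point.

```python
def closest_point_between_rays(p1, d1, p2, d2):
    """Midpoint of the shortest segment between two 3D rays.

    p1, p2: ray origins (left/right eye positions);
    d1, d2: sight-line direction vectors.
    Since real sight lines are skew and rarely intersect exactly,
    the midpoint of closest approach serves as the gaze fixation point.
    """
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    w0 = [a - b for a, b in zip(p1, p2)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b
    if abs(denom) < 1e-12:            # near-parallel sight lines
        t, s = 0.0, e / c
    else:                             # standard closed-form solution
        t = (b * e - c * d) / denom
        s = (a * e - b * d) / denom
    q1 = [p + t * v for p, v in zip(p1, d1)]   # closest point on ray 1
    q2 = [p + s * v for p, v in zip(p2, d2)]   # closest point on ray 2
    return [(u + v) / 2 for u, v in zip(q1, q2)]

# Example: eyes 6 cm apart, both sight lines converging on (0, 0, 1)
gaze = closest_point_between_rays(
    [-0.03, 0, 0], [0.03, 0, 1],
    [ 0.03, 0, 0], [-0.03, 0, 1])
print(gaze)   # approximately (0, 0, 1)
```

With noisy gaze directions the two rays no longer meet, but the midpoint remains a stable least-squares estimate of the fixation point.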



Abstract

The invention discloses a spatial positioning method for a virtual reality system based on depth perception. The method first tracks the gaze point of the human visual system through the eye-tracking system integrated in the VR head-mounted device and, based on the gaze point, builds a depth perception model for the moment the user fixates on a target object in the virtual environment. Second, an intrinsic matrix for relative visual perception is established from the gaze point, from which the 2D projection position of the 3D gaze point on the image is calculated; on this basis, the position of the target point the user is fixating is obtained. The depth perception difference between the value set by the VR system and the value perceived by the visual system is then quantified. Finally, the position of the virtual camera is compensated by this depth perception difference to obtain the position in virtual space as perceived by the visual system. The invention takes into account the depth perception difference of the visual system in a virtual environment and directly locates the user's own perceived position in the virtual scene, enabling the user to interact more accurately with virtual objects and improving the user's interactive experience.

Description

Technical Field

[0001] The invention relates to the technical field of virtual reality, in particular to a spatial positioning method for a virtual reality system based on depth perception.

Background

[0002] Spatial positioning technology in a virtual reality system is a key technology for connecting physical space with virtual space and realizing interaction between humans and the virtual scene.

[0003] Positioning in virtual reality refers to determining the user's own position in the virtual space. Existing spatial positioning methods locate the user's absolute spatial pose in physical space, feed it back to the VR content, and realize positioning by mapping it to the spatial position of the VR virtual camera. They are mainly divided into two categories: Outside-in and Inside-out spatial positioning. The Outside-in method locates the position of the head-mounted display (HMD) through external devices placed in the physical space, and synchroni...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F3/01
CPC: G06F3/013; G06F2203/012
Inventors: Qin Huabiao, Liu Hongmei
Owner: SOUTH CHINA UNIV OF TECH