Virtual reality system space positioning method based on depth perception

A depth-perception-based virtual reality technology, applied in the field of virtual reality, that addresses problems such as the inability to interact correctly, the inability to correct the positional relationship between the user and the virtual target, and the inability to position the virtual camera.

Active Publication Date: 2020-04-14
SOUTH CHINA UNIV OF TECH


Problems solved by technology

However, because of key factors such as rendering technology and computer image display, as well as differences in the binocular parameters of individual users, the depth a user actually perceives in a stereoscopic image can differ from the depth set by the VR system for its fixed binocular virtual cameras. When such a difference exists, the position of the virtual camera no longer equals the position of the person relative to the virtual scene as perceived through binocular vision.
Consequently, even if the user's pose in physical space is correctly tracked and fed back to the VR system in real time, the user cannot correctly perceive their own position in the virtual space, cannot correct the positional relationship between themselves and the virtual target, and cannot interact correctly.




Embodiment

[0061] A spatial positioning method for a virtual reality system based on depth perception, as shown in Figure 2, comprises the following steps:

[0062] Step S1: Model the depth perception of the human visual system in the virtual space based on the visual fixation point. The specific implementation steps are as follows:

[0063] S1.1. Real-time tracking of gaze point:

[0064] At time t, when the human eyes gaze at a target point in the virtual space through the binocular parallax images I_tl and I_tr displayed on the left and right displays of the VR helmet, the two lines of sight are obtained by a gaze tracking algorithm based on pupil-corneal reflection technology; the closest point between the two sight lines in space, solved using spatial analytic geometry, is then taken as the point of fixation. The coordinates of the 3D gaze point in the space coordinate system carry the superscript V, which represents ...
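The closest point between two 3D sight lines has a standard closed-form solution. The following is a minimal sketch of that geometric step only (variable names are illustrative, not from the patent): it takes each eye's 3D origin and gaze direction and returns the midpoint of the shortest segment connecting the two lines as the estimated fixation point.

```python
import numpy as np

def estimate_gaze_point(o_l, d_l, o_r, d_r):
    """Estimate the 3D fixation point as the midpoint of the shortest
    segment between the left and right lines of sight.

    o_l, o_r: 3D origins of the left/right sight lines (eye centers).
    d_l, d_r: 3D direction vectors of the sight lines.
    """
    d_l = d_l / np.linalg.norm(d_l)
    d_r = d_r / np.linalg.norm(d_r)
    w0 = o_l - o_r
    b = d_l @ d_r                  # cosine of the angle between the lines
    d = d_l @ w0
    e = d_r @ w0
    denom = 1.0 - b * b            # ~0 when the sight lines are parallel
    if denom < 1e-9:
        raise ValueError("sight lines are (nearly) parallel")
    s = (b * e - d) / denom        # parameter along the left line
    t = (e - b * d) / denom        # parameter along the right line
    p_l = o_l + s * d_l            # closest point on the left line
    p_r = o_r + t * d_r            # closest point on the right line
    return 0.5 * (p_l + p_r)       # estimated 3D gaze point

# Example: eyes 64 mm apart, both converging on a point 0.5 m ahead.
left_eye  = np.array([-0.032, 0.0, 0.0])
right_eye = np.array([ 0.032, 0.0, 0.0])
target    = np.array([ 0.0,   0.0, 0.5])
gaze = estimate_gaze_point(left_eye, target - left_eye,
                           right_eye, target - right_eye)
print(gaze)  # ~ [0, 0, 0.5]
```

Taking the midpoint of the common perpendicular is the usual choice because, with noisy gaze directions, the two sight lines are skew and rarely intersect exactly.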



Abstract

The invention discloses a virtual reality system space positioning method based on depth perception. The method comprises the following steps: first, the fixation point perceived by the human visual system is tracked through a gaze tracking system integrated into the VR head-mounted device, and a depth perception model for a user fixating on a target object in the virtual environment is constructed based on the fixation point; second, an intrinsic (internal reference) matrix relative to visual perception is established from the gaze point so as to calculate the 2D projection positions of the 3D fixation points on the image and, from these, the position of the target point gazed at by the human eyes; then the difference between the depth set by the VR system and the depth formed by the visual system is quantified; finally, the position of the virtual camera is compensated according to this depth perception difference to obtain the position of the virtual camera in the virtual space as perceived by the visual system. Because the method accounts for the depth perception difference of the visual system in the virtual environment and directly locates the user's perceived position in the virtual scene, the user can interact with virtual objects more accurately, improving the interaction experience.
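To make the final compensation step concrete, here is a minimal sketch of the idea under simplified geometry (a single fixation target, depth measured along the camera's viewing axis; all names and the sign convention are assumptions, not the patent's exact formulation): the depth the VR system set for the target and the depth the user actually perceives, taken from the tracked 3D gaze point, are compared, and the camera position is shifted by their difference.

```python
import numpy as np

def compensate_camera_position(cam_pos, view_dir, target_pos, gaze_point):
    """Shift the virtual camera along its viewing direction by the
    quantified depth-perception difference (illustrative sketch).

    cam_pos:    position the VR system assigned to the virtual camera.
    view_dir:   camera viewing direction.
    target_pos: position of the virtual target the user fixates on.
    gaze_point: tracked 3D fixation point perceived by the user.
    """
    view_dir = view_dir / np.linalg.norm(view_dir)
    z_set = (target_pos - cam_pos) @ view_dir        # depth set by the VR system
    z_perceived = (gaze_point - cam_pos) @ view_dir  # depth the user perceives
    # If the user perceives the target nearer than it was set, then relative
    # to the target they effectively stand closer to it by the difference.
    delta_z = z_set - z_perceived
    return cam_pos + delta_z * view_dir              # perceived camera position

# Example: target set 2.0 m away, but the gaze point indicates the user
# perceives it at 1.8 m -> the perceived camera position is 0.2 m closer.
cam = np.array([0.0, 1.6, 0.0])
fwd = np.array([0.0, 0.0, 1.0])
print(compensate_camera_position(cam, fwd,
                                 np.array([0.0, 1.6, 2.0]),    # set target
                                 np.array([0.0, 1.6, 1.8])))   # perceived
```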

Description

Technical field

[0001] The invention relates to the technical field of virtual reality, and in particular to a space positioning method for a virtual reality system based on depth perception.

Background technique

[0002] Spatial positioning technology in a virtual reality system is a key technology for connecting physical space with virtual space and realizing interaction between the human and the virtual scene.

[0003] Positioning in virtual reality refers to determining the user's own position in the virtual space. Existing spatial positioning methods locate the user's absolute spatial pose in the physical space, feed it back to the VR content, and realize positioning by mapping it to the spatial position of the VR virtual camera. They fall into two main categories: Outside-in and Inside-out spatial positioning. The Outside-in method locates the position of the head-mounted display (HMD) through an external device placed in the physical space, and synchroni...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01
CPC: G06F3/013; G06F2203/012
Inventor: 秦华标 (Qin Huabiao), 刘红梅 (Liu Hongmei)
Owner: SOUTH CHINA UNIV OF TECH