
A movement control method for virtual reality

A virtual reality movement control technology, applied in the field of virtual reality, which can solve problems such as the mismatch between visual perception and bodily perception, unsynchronized perceptual information, and user dizziness.

Active Publication Date: 2019-05-17
HANGZHOU SHAOZI NETWORK TECH CO LTD

AI Technical Summary

Problems solved by technology

Although this type of movement does not limit the user's range of movement or the scenes available in the virtual environment, the user visually perceives the movement while the vestibular organs, which sense bodily motion, perceive no corresponding movement. Visual perception and bodily perception therefore differ and the perceived information is not synchronized, which causes the user to experience strong dizziness.
[0009] In summary, the existing movement control methods all suffer from various defects to a greater or lesser degree, especially dizziness, which prevents users from fully engaging in movement within the virtual environment.

Examples


Embodiment 1

[0048] A movement control method for virtual reality, the method comprising the following steps:

[0049] S1: In the virtual reality environment, several target coordinate points are set, with only a single target coordinate point visually visible at a time;

[0050] S2: An angular velocity sensor is used to track the rotation angle of the user's head, so that the visual orientation in the virtual reality environment is controlled by turning the head (see the sketch after this embodiment);

[0051] S3: Movement is controlled according to the direction cone angle obtained between the visual orientation and a given target coordinate point.

[0052] Controlling movement here refers to accelerating forward, moving at a constant speed, and decelerating to a stop. A target coordinate point can be set as a specific point, a line segment, or a bounded surface.
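As a minimal illustration of step S2, the sketch below integrates the gyroscope's yaw and pitch angular velocities into a head orientation and converts it into the unit vector of the visual orientation. The function name, the y-up frame convention, and the use of plain yaw/pitch angles are illustrative assumptions, not details taken from the patent.

```python
import numpy as np


def update_view_direction(yaw, pitch, yaw_rate, pitch_rate, dt):
    """Illustrative step S2: integrate head angular velocities (rad/s) over one
    frame to update yaw/pitch, then return the unit vector n of the visual
    orientation in the virtual environment (y-up, right-handed frame)."""
    yaw += yaw_rate * dt
    pitch += pitch_rate * dt
    n = np.array([
        np.cos(pitch) * np.sin(yaw),   # x
        np.sin(pitch),                 # y (up)
        np.cos(pitch) * np.cos(yaw),   # z
    ])
    return yaw, pitch, n
```

In practice a headset SDK usually exposes the head orientation directly (for example as a quaternion), in which case n is simply the rotated forward axis rather than the result of integrating raw angular velocities.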

Embodiment 2

[0054] As a specific or preferred version of Embodiment 1, the direction cone angle in step S3 is obtained by monitoring the user's visual orientation frame by frame on the displayed screen. A direction cone is set on the line from the user's current coordinate point to the target coordinate point, i.e. along the direction toward the target; the apex of the cone is the user's current coordinate point and its base faces the target coordinate point.

[0055] As shown in figure 3, when the visual orientation deviates from the target coordinate point, let m be the unit vector from the user's current coordinate point toward the target coordinate point and n the unit vector of the visual orientation; the direction cone angle is defined by m and n. If m·n > 0.6, the user is still regarded as moving toward the target point and therefore continues to walk at the current...
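The check in paragraph [0055] amounts to comparing the dot product of two unit vectors against the 0.6 threshold. The sketch below is a minimal version of that test; the function and argument names are illustrative.

```python
import numpy as np

CONE_DOT_THRESHOLD = 0.6  # m·n threshold from paragraph [0055]


def inside_direction_cone(position, target, view_dir, threshold=CONE_DOT_THRESHOLD):
    """Return True while the visual orientation stays inside the direction cone,
    i.e. while the user should keep moving toward the target coordinate point."""
    m = np.asarray(target, dtype=float) - np.asarray(position, dtype=float)
    m = m / np.linalg.norm(m)            # unit vector toward the target
    n = np.asarray(view_dir, dtype=float)
    n = n / np.linalg.norm(n)            # unit vector of the visual orientation
    return float(np.dot(m, n)) > threshold
```

Since m and n are unit vectors, m·n > 0.6 corresponds to a cone half-angle of roughly 53°, so moderate gaze deviations do not interrupt the movement.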

Embodiment 3

[0058] As a further embodiment based on Embodiment 1 or Embodiment 2, step S3 of Embodiment 1 includes the following steps:

[0059] A: It is determined that the user will move to a certain target coordinate point;

[0060] B: The user starts to move and accelerates according to a preset first acceleration curve, in which the acceleration increases with time, until the speed reaches a preset maximum speed value; the user then moves at that constant maximum speed;

[0061] C: When the distance between the user's current location and the target coordinate point is less than a preset distance value, the user decelerates according to a preset first deceleration curve, in which the deceleration increases with time, so that the speed drops to zero just as the user reaches the target coordinate point (see the sketch after this embodiment).

[0062] To illustrate the change in moving speed more clearly, the following description refers to the attached figure 2. For example, when the user moves from the current coordinate point to the ta...
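The speed profile of steps A–C can be sketched as below. The patent only specifies that the acceleration increases with time up to a preset maximum speed and that the speed falls to zero on reaching the target; the particular curves, constants, and names used here (a linearly growing acceleration, a quadratic ease-out on the remaining distance) are illustrative assumptions.

```python
V_MAX = 3.0             # preset maximum speed value (m/s), illustrative
A0, JERK = 1.0, 0.5     # starting acceleration and its growth rate, illustrative
DECEL_DISTANCE = 2.0    # preset distance value at which deceleration begins (m)


def next_speed(v, elapsed, dt, distance_to_target):
    """One frame of the movement speed for steps A-C.

    Step B: the acceleration A0 + JERK * elapsed grows with time until the
    speed reaches V_MAX, after which the speed is held at V_MAX.
    Step C: once the remaining distance drops below DECEL_DISTANCE, the speed
    is capped by a quadratic ease-out of the remaining distance, so it falls
    to zero as the target coordinate point is reached.
    """
    if distance_to_target <= 0.0:
        return 0.0
    if distance_to_target < DECEL_DISTANCE:
        return min(v, V_MAX * (distance_to_target / DECEL_DISTANCE) ** 2)
    a = A0 + JERK * elapsed            # acceleration increases with time
    return min(v + a * dt, V_MAX)
```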


Abstract

The invention discloses a movement control method for virtual reality. The method includes the steps of: S1, setting a plurality of target coordinate points in a virtual reality environment, wherein only one coordinate point is visually visible at a time; S2, using an angular velocity sensor to track the rotation angle of the user's head, and controlling the visual direction in the virtual reality environment by rotating the head; S3, controlling movement according to the direction cone angle obtained between the visual direction and a target coordinate point. The movement control method has the advantages that movement is controlled through vision alone, so the user is unlikely to feel dizzy; the moving range is large, the degree of freedom of movement is extremely high, and the user can stop at any time during movement to observe the surrounding environment; no other auxiliary equipment or instruments are needed, only an existing head-mounted display with a gyroscope; the method is not limited by the size of the real space in which the user is located; and the movement is close to the way humans move in reality, providing an extremely high sense of immersion and realism.

Description

Technical field

[0001] The invention belongs to the technical field of virtual reality, and in particular relates to a movement control method for virtual reality.

Background technique

[0002] Virtual reality technology is a computer simulation system that can create and let users experience a virtual world. It uses a computer to generate a simulated environment: a system simulation of interactive three-dimensional dynamic vision and entity behavior that combines multi-source information to immerse the user in that environment. Virtual reality (VR) technology mainly covers the simulated environment, perception, natural skills, and sensing equipment. The simulated environment is a real-time, dynamic, three-dimensional realistic image generated by a computer; perception means that an ideal VR system should provide all human senses: in addition to the visual perception generated by computer graphics technology, there are also hearing, touch, force and other perceptions such as se...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F3/01, G06T19/00
CPC: G06F3/011, G06F3/012, G06T19/00
Inventor: 胡旭超
Owner: HANGZHOU SHAOZI NETWORK TECH CO LTD