
Eye movement interaction method, head-mounted device and computer readable medium

An eye movement interaction method and related technology, applied to computer components, computing, and user/computer interaction input/output. It addresses problems such as gaze-signal jitter, unreliable eye tracker signals, and the limited scope of current eye movement interaction, and achieves high interaction efficiency.

Active Publication Date: 2020-09-29
TSINGHUA UNIV
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

However, the scenarios and functions in which an eye tracker can be used are very limited, mainly for the following reasons: 1. The tracking accuracy of eye trackers is not very high; there is generally an error of 0.5-2°, and the user's eye movement is not perfectly stable during interaction, so the signal obtained by the eye tracker is unreliable and is often accompanied by irregular jumps and considerable noise. 2. The user's eyes remain open throughout the use of VR and AR, so determining whether the user intends to interact at a given moment is a significant challenge.
[0004] At present, eye movement interaction in VR and AR is very limited: 1. Eye movement is usually used as an implicit input, for example to determine which area the user is attending to so that the area can be adapted to make input with hands or controllers easier. 2. The eyes can be used to point at or track objects in the interface in order to select menus or objects, but this kind of interaction cannot be adopted in practical scenarios because natural eye movement in VR and AR cannot be distinguished from intentional target selection, which leads to false triggering. 3. Dwelling the gaze or drawing a specific trajectory with the eyes is not easily falsely triggered by natural eye movement, but the input efficiency of these interactions is low.
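To make the noise and false-triggering trade-off above concrete, the following is a minimal sketch (not part of the patent) of a dwell-based gaze selection loop with simple exponential smoothing; the angular radius, dwell time, and smoothing factor are illustrative assumptions, not values from the invention.

```python
import math
from dataclasses import dataclass

# Illustrative constants (assumptions): typical tracker error is 0.5-2 degrees,
# so the dwell radius must exceed the noise, and the dwell time must be long
# enough to distinguish an intentional fixation from natural eye movement.
DWELL_RADIUS_DEG = 2.5   # how far the smoothed gaze may drift and still count as dwelling
DWELL_TIME_S = 0.8       # how long the gaze must stay inside that radius to trigger
SMOOTHING_ALPHA = 0.3    # exponential smoothing factor for the noisy gaze signal


@dataclass
class DwellDetector:
    """Detects an intentional dwell in a noisy stream of gaze angles (degrees)."""
    anchor: tuple = None       # gaze position the current dwell is measured against
    smoothed: tuple = None     # exponentially smoothed gaze position
    dwell_elapsed: float = 0.0

    def update(self, gaze_deg: tuple, dt: float) -> bool:
        """Feed one gaze sample; returns True when a dwell selection fires."""
        # Smooth the raw sample to suppress jitter and tracker noise.
        if self.smoothed is None:
            self.smoothed = gaze_deg
        else:
            self.smoothed = tuple(
                SMOOTHING_ALPHA * g + (1 - SMOOTHING_ALPHA) * s
                for g, s in zip(gaze_deg, self.smoothed)
            )

        if self.anchor is None:
            self.anchor = self.smoothed

        # Angular distance between the smoothed gaze and the dwell anchor.
        dist = math.hypot(self.smoothed[0] - self.anchor[0],
                          self.smoothed[1] - self.anchor[1])

        if dist > DWELL_RADIUS_DEG:
            # Gaze moved away: restart the dwell from the new position.
            self.anchor = self.smoothed
            self.dwell_elapsed = 0.0
            return False

        self.dwell_elapsed += dt
        if self.dwell_elapsed >= DWELL_TIME_S:
            self.dwell_elapsed = 0.0   # fire once, then rearm
            return True
        return False
```

The trade-off described above is visible in the two thresholds: a larger radius and a longer dwell time reduce false triggers during natural eye movement, but directly lower input efficiency.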

Method used


Image

  • Eye movement interaction method, head-mounted device and computer readable medium

Examples


Embodiment Construction

[0038] Specific embodiments of the present invention will be described in detail below with reference to the accompanying drawings.

[0039] Before the detailed description, explanations of relevant terms are given.

[0040] Gaze tracking: cameras in the head-mounted display track the user's gaze. Gaze tracking can serve as a new input axis; for example, it can be used to target enemy aircraft in an aerial combat game. FOVE, for instance, is an HMD launched on Kickstarter that may introduce gaze tracking and a foveated rendering SDK. While eye tracking is not a requirement for foveated rendering, it can significantly improve rendering by shifting the high-detail region to follow the user's gaze direction. Additionally, new users often have difficulty suppressing the natural tendency to look around; this is a problem because HMD optics work best when the user looks directly through the center of the lens, so users are better off turning their head to look around. Gaze tracking is t...
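As a rough illustration of the foveated-rendering idea mentioned above, the sketch below maps a gaze direction to a high-detail region in normalized screen coordinates. The field-of-view value, region size, and the simple symmetric projection are assumptions for illustration only, not values taken from FOVE or from the patent.

```python
import math

def foveation_center(gaze_dir, fov_deg=100.0):
    """Project a gaze direction (x, y, z) in head space onto normalized
    screen coordinates in [0, 1] x [0, 1], assuming a simple symmetric field of view."""
    x, y, z = gaze_dir
    half_fov = math.radians(fov_deg) / 2.0
    # Angles of the gaze ray relative to the view axis (z forward).
    yaw = math.atan2(x, z)
    pitch = math.atan2(y, z)
    # Map [-half_fov, +half_fov] to [0, 1] on each axis and clamp.
    u = 0.5 + yaw / (2.0 * half_fov)
    v = 0.5 + pitch / (2.0 * half_fov)
    return (min(max(u, 0.0), 1.0), min(max(v, 0.0), 1.0))

def high_detail_rect(gaze_dir, region_size=0.3):
    """Return (u_min, v_min, u_max, v_max) of the region rendered at full resolution."""
    u, v = foveation_center(gaze_dir)
    half = region_size / 2.0
    return (max(u - half, 0.0), max(v - half, 0.0),
            min(u + half, 1.0), min(v + half, 1.0))

# Example: user looking slightly to the right of center.
print(high_detail_rect((0.2, 0.0, 1.0)))
```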



Abstract

The present invention provides an eye movement interaction method for a virtual reality and/or augmented reality head-mounted device, a head-mounted device, and a computer readable medium. The eye movement interaction method comprises: displaying a virtual reality scene on a screen without displaying a menu; tracking the sight line of the user; judging, according to the tracked sight line, whether the user performs a menu-triggering event through eye movement; and, when it is determined that the user performs the menu-triggering event, displaying the menu. The eye movement interaction method can also comprise: while the menu is displayed, judging according to the tracked sight line whether the user performs a menu-selection event; and, when it is determined that the user performs the menu-selection event, executing the command corresponding to the selected menu item and hiding the menu. The menu in this eye movement interaction method is invisible at ordinary times and does not occupy the user's view; it appears only when the user wants to call it out. The effectiveness, accuracy and robustness of the menu-triggering and menu-selection techniques have been verified through experiments.
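The flow described in the abstract (hidden menu, gaze tracking, trigger detection, selection, command execution, hiding) can be pictured as a small state machine. The sketch below is an illustrative outline only; detect_trigger_event and detect_selection are hypothetical callbacks standing in for the patent's specific trigger and selection techniques.

```python
from enum import Enum, auto

class MenuState(Enum):
    HIDDEN = auto()   # scene shown without a menu (the default)
    SHOWN = auto()    # menu displayed, waiting for a gaze selection

class EyeMenuController:
    """Outline of the interaction loop described in the abstract.

    detect_trigger_event(gaze) -> bool      : True when the gaze performs a menu-triggering event
    detect_selection(gaze) -> item or None  : the menu item chosen by gaze, if any
    Both callbacks are hypothetical placeholders, not the patent's actual techniques.
    """

    def __init__(self, detect_trigger_event, detect_selection, execute_command):
        self.detect_trigger_event = detect_trigger_event
        self.detect_selection = detect_selection
        self.execute_command = execute_command
        self.state = MenuState.HIDDEN

    def on_gaze_sample(self, gaze):
        if self.state is MenuState.HIDDEN:
            # Menu is invisible and does not occupy the user's view.
            if self.detect_trigger_event(gaze):
                self.show_menu()
                self.state = MenuState.SHOWN
        else:  # MenuState.SHOWN
            item = self.detect_selection(gaze)
            if item is not None:
                self.execute_command(item)   # run the command bound to the selected item
                self.hide_menu()
                self.state = MenuState.HIDDEN

    def show_menu(self):
        pass  # render the menu in the VR/AR scene

    def hide_menu(self):
        pass  # remove the menu so it no longer occupies the view
```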

Description

Technical field [0001] The present invention relates generally to virtual reality and/or augmented reality head-mounted devices, and more particularly to eye-tracking-based interaction methods and head-mounted devices. Background technique [0002] With the development of virtual reality (VR) and augmented reality (AR) technologies, head-mounted devices that can render virtual objects have entered everyday life; they are used by a wide range of user groups and applied in various fields such as games, education, medical care and specialized training. Examples of such devices are the virtual reality helmet shown in Figure 9 and the augmented reality helmet shown in Figure 10. [0003] Eye trackers can be deployed in existing head-mounted displays to track the user's eye movement and gaze position in VR and AR. However, the scenarios and functions in which eye trackers can be used are very limited, mainly because the tracking accuracy of eye trackers is not very high...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01, G06F3/0482, G06F3/0484
CPC: G06F3/013, G06F3/0482, G06F3/04847
Inventor: 易鑫, 史元春, 鲁逸沁, 王运涛
Owner: TSINGHUA UNIV