Accelerated gaze-supported manual cursor control

A gaze-supported manual cursor control technology, applied in the field of ubiquitous user interfaces, addresses the difficulty of reliably referring to small and closely positioned targets and of appropriately combining multimodal inputs, thereby improving user interaction with virtual environments.

Publication Date: 2019-11-28 (Inactive)
MICROSOFT TECH LICENSING LLC
Cites: 6 · Cited by: 21

AI Technical Summary

Benefits of technology

[0003]In some embodiments, a method for improving user interaction with a virtual environment includes measuring a first position of a user's gaze relative to the virtual environment, receiving a system engagement input, presenting a guidance cursor at the first position, receiving a target engagement input and decoupling the guidance cursor from the user's gaze, receiving a movement input, and translating the guidance cursor based on the movement input.
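
The sequence of inputs in [0003] behaves like a small state machine: the cursor follows the gaze until a target engagement input decouples it, after which only manual movement inputs translate it. The following Python sketch illustrates that flow under assumed names (GuidanceCursor, Vec2, and the handler methods are hypothetical, not taken from the disclosure):

```python
from dataclasses import dataclass
from enum import Enum, auto


class CursorState(Enum):
    IDLE = auto()          # no guidance cursor shown
    GAZE_COUPLED = auto()  # cursor follows the measured gaze position
    DECOUPLED = auto()     # cursor moves only on manual movement input


@dataclass
class Vec2:
    x: float
    y: float


class GuidanceCursor:
    """Illustrative sketch of the interaction flow described in [0003]."""

    def __init__(self) -> None:
        self.state = CursorState.IDLE
        self.position = Vec2(0.0, 0.0)

    def on_system_engagement(self, gaze: Vec2) -> None:
        # Present the guidance cursor at the first measured gaze position.
        self.state = CursorState.GAZE_COUPLED
        self.position = gaze

    def on_gaze_sample(self, gaze: Vec2) -> None:
        # While coupled, the cursor tracks the user's gaze.
        if self.state is CursorState.GAZE_COUPLED:
            self.position = gaze

    def on_target_engagement(self) -> None:
        # Decouple the cursor from gaze; it stays where it was last placed.
        self.state = CursorState.DECOUPLED

    def on_movement_input(self, delta: Vec2) -> None:
        # Translate the decoupled cursor based on the manual movement input.
        if self.state is CursorState.DECOUPLED:
            self.position = Vec2(self.position.x + delta.x,
                                 self.position.y + delta.y)
```

In practice, on_gaze_sample would be driven by the eye tracker at its native rate, while the engagement and movement handlers would be wired to whatever modality supplies them (voice, hand gesture, or controller input).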

Problems solved by technology

Several problems arise with this approach, though, because eye tracking and additional commands are asynchronous (i.e., the eye gaze usually precedes manual inputs and may have moved on to new targets by the time the manual input is recognized).
In addition, due to technological constraints of the tracking system as well as physiological constraints of the human visual system, the computed gaze signal may be jittery and show offsets relative to the actual eye gaze.
This exacerbates the problem of reliably referring to small and closely positioned targets.
Thus, an overall question arises of how such multimodal inputs can be appropriately combined.
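
One common mitigation for the jitter described above, offered here purely as an illustration rather than as the method of this application, is to low-pass filter the gaze signal, trading a little latency for stability. A minimal exponential-smoothing sketch in Python (the alpha value is an arbitrary example):

```python
def smooth_gaze(samples, alpha=0.2):
    """Exponentially smooth a stream of (x, y) gaze samples.

    alpha near 0 smooths heavily (less jitter, more lag);
    alpha near 1 passes samples through almost unchanged.
    """
    smoothed = None
    for x, y in samples:
        if smoothed is None:
            smoothed = (x, y)  # seed the filter with the first sample
        else:
            smoothed = (alpha * x + (1 - alpha) * smoothed[0],
                        alpha * y + (1 - alpha) * smoothed[1])
        yield smoothed
```

Note that smoothing alone does not resolve the asynchrony problem: a filtered gaze point can still have moved on before a manual command is recognized, which is the motivation for decoupling the cursor from gaze once a target is engaged.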


Embodiment Construction

[0026]This disclosure generally relates to devices, systems, and methods for visual user interaction with virtual environments. More specifically, the present disclosure relates to improving interaction with virtual elements using gaze-informed manual cursor control. In some embodiments, visual information may be provided to a user by a near-eye display. A near-eye display may be any display that is positioned near a user's eye, either to supplement a user's view of their surroundings, such as augmented or mixed reality devices, or to replace the user's view of their surroundings, such as virtual reality devices. In some embodiments, an augmented reality or mixed reality device may be a head-mounted display (HMD) that presents visual information to a user overlaid on the user's view of their surroundings. For example, the visual information from the HMD may be combined with ambient or environment light to overlay visual information, such as text or images, on a user's surroundings.
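
Because an optical see-through HMD can only add emitted light to the ambient light passing through its optics, the combination is approximately additive; the toy model below (assumed for illustration, using linear RGB values in [0, 1]) shows why bright overlays are easy and dark overlays are hard on such displays:

```python
def perceived_color(ambient_rgb, display_rgb):
    """Approximate the color seen through an additive see-through display.

    The display adds light to the scene; it cannot subtract it, so the
    perceived color is the (clamped) sum of ambient and emitted light.
    """
    return tuple(min(1.0, a + d) for a, d in zip(ambient_rgb, display_rgb))


# Example: white text (1, 1, 1) stays visible over a dim scene, while
# "black" overlay light (0, 0, 0) simply leaves the scene unchanged.
print(perceived_color((0.2, 0.2, 0.2), (1.0, 1.0, 1.0)))  # -> (1.0, 1.0, 1.0)
print(perceived_color((0.2, 0.2, 0.2), (0.0, 0.0, 0.0)))  # -> (0.2, 0.2, 0.2)
```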

[...]


Abstract

A method for improving user interaction with a virtual environment includes measuring a first position of a user's gaze relative to the virtual environment, receiving a system engagement input, presenting a guidance cursor at the first position, receiving a target engagement input and decoupling the guidance cursor from the user's gaze, receiving a movement input, and translating the guidance cursor based on the movement input.

Description

BACKGROUND

Background and Relevant Art

[0001] With emerging ubiquitous user interfaces (UI), such as smart devices and innovative head-mounted display technology, usage of such UIs becomes more common among non-specialists. Interaction with the UIs may be improved by making the interaction more intuitive and subtle. A well-established input paradigm is point-and-click or, in more general terms, point-and-command. In emerging natural UIs, a command could, for instance, be triggered by different voice commands, hand gestures, or touch input.

[0002] An effortless and subtle way to indicate a user's context is to take advantage of gaze tracking data to infer a user's current reference frame. Several problems arise with this approach, though, because eye tracking and additional commands are asynchronous (i.e., the eye gaze usually precedes manual inputs and may have moved on to new targets by the time the manual input is recognized). In addition, due to technological constraints of the tracking system as well as physiological constraints of the human visual system, the computed gaze signal may be jittery and show offsets relative to the actual eye gaze.


Application Information

Patent Type & Authority: Application (United States)
IPC(8): G06F3/01; G06F3/0481; G06F3/0484
CPC: G06F3/04842; G06F3/013; G06F3/04812; G06F3/017; G06F2203/0381
Inventors: STELLMACH, SOPHIE; MEEKHOF, CASEY LEON; TICHENOR, JAMES R.; LINDSAY, DAVID BRUCE
Owner: MICROSOFT TECH LICENSING LLC