
Eye movement data-based user help information automatic triggering apparatus and method

A technology for automatically triggering user help information. It is applied to input/output arrangements for data processing, program control devices, and electric digital data processing. It addresses problems such as help information that is difficult to integrate automatically, low display efficiency, and a degraded user operating experience, and achieves the effect of improving the user experience and the efficiency and effectiveness of help information display.

Inactive Publication Date: 2017-02-22
NAVY MEDICINE RES INST OF PLA

AI Technical Summary

Problems solved by technology

[0003] The two main information display driving modes described above have the following problems: (1) in the task-driven mode, it is difficult to integrate user help information automatically into the execution of a specific task during information system design and development, so the task can hardly drive the triggered display of user help information; (2) in the user-driven mode, when the user performs a complex human-computer interaction task and the user's mental and cognitive loads are high, actively driving the display of help information is inefficient and interferes with the main task, thereby degrading the user's operating experience.



Examples


Embodiment 1

[0039] Taking the information display of office software installed on a desktop computer terminal as an example, the method for automatically triggering user help information based on eye movement data is described. The steps are as follows:

[0040] Step 1. Start the eye tracking device, the display terminal of the operated system, the operated system itself, and the related supporting equipment, and connect all parts reliably via cables. The user then begins performing the operating task on the operated system.

[0041] Step 2. The eye-tracking device collects eye-movement trajectories and stores the generated eye-movement data:

[0042] Step 2.1. The eye movement tracking device starts to collect the user's eye movement trajectory, and generates and stores the eye movement data. The generated eye movement data form a data sequence {(x1, y1), (x2, y2), ..., (xi, yi), ...}, where xi is the abscissa of each point in the user's eye movement track and yi is the ordinate of ea...
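A minimal Python sketch of how such an eye movement data sequence might be represented and how a real-time movement distance between consecutive gaze points could be computed. The coordinate values, the Euclidean metric, and all names below are illustrative assumptions, not details taken from the patent.

import math

# Hypothetical gaze samples in the form described in step 2.1: a sequence of
# (x, y) screen coordinates recorded by the eye tracking device.
gaze_points = [(512, 384), (515, 386), (516, 385), (702, 120), (705, 118)]

def movement_distance(prev_point, curr_point):
    """Euclidean distance between two consecutive gaze points (an assumed metric)."""
    dx = curr_point[0] - prev_point[0]
    dy = curr_point[1] - prev_point[1]
    return math.hypot(dx, dy)

# Real-time movement distance of each new sample relative to the previous one.
distances = [movement_distance(gaze_points[i - 1], gaze_points[i])
             for i in range(1, len(gaze_points))]
print(distances)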



Abstract

The invention relates to the technical field of human factors engineering, in particular to an eye movement data-based user help information automatic triggering apparatus. The apparatus comprises an eye movement tracking apparatus, an automatic triggering module arranged in an operation system and connected with the eye movement tracking apparatus, and a display terminal connected with the automatic triggering module. The invention furthermore provides an eye movement data-based user help information automatic triggering method. The method comprises the steps that the eye movement tracking apparatus acquires an eye movement track of a user and generates an eye movement data sequence; the automatic triggering module calculates a real-time movement distance of the user's eyes from the eye movement data sequence and compares it with a preset threshold distance; and when the real-time movement distance is smaller than or equal to the preset threshold distance for multiple continuous times, the display terminal presents user help information. According to the apparatus and the method, the user's cognitive state is assessed by repeatedly comparing the real-time eye movement distance with the threshold distance, so that the adaptability of human-machine interaction is remarkably improved and the efficiency and effect of user help information display are improved.
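A hedged Python sketch of the triggering rule described in the abstract: the automatic triggering module counts how many consecutive samples have a real-time eye movement distance at or below the preset threshold distance and, once that count reaches the required number of continuous times, the terminal presents the help information. The threshold value, the required count, and the class and function names are illustrative assumptions rather than values or identifiers specified by the patent.

THRESHOLD_DISTANCE = 15.0    # preset threshold distance (assumed value, in pixels)
REQUIRED_CONSECUTIVE = 10    # "multiple continuous times" (assumed count)

class HelpAutoTrigger:
    """Sketch of the automatic triggering module's counting rule (names assumed)."""

    def __init__(self, threshold=THRESHOLD_DISTANCE, required=REQUIRED_CONSECUTIVE):
        self.threshold = threshold
        self.required = required
        self.consecutive = 0

    def update(self, distance):
        """Feed one real-time movement distance; return True when help should be shown."""
        if distance <= self.threshold:
            self.consecutive += 1
        else:
            self.consecutive = 0   # streak broken: restart the consecutive count
        return self.consecutive >= self.required

# Example usage with a stream of real-time distances (e.g. from the earlier sketch):
trigger = HelpAutoTrigger(threshold=15.0, required=5)
for d in [3.2, 40.0, 2.8, 1.0, 2.2, 3.3, 0.9]:
    if trigger.update(d):
        print("Display user help information on the terminal")
        break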

Description

Technical field

[0001] The present invention relates to the technical field of human factors engineering, and in particular to a device and method for automatically triggering user help information based on eye movement data.

Background technique

[0002] At present, the driving of system information display during human-computer interaction mainly follows two modes: the task-driven mode and the user-driven mode. In the task-driven mode, the task process the user is expected to complete drives the display sequence and duration of the information for each step. For example, in a personal e-mail system, after the user completes the subtask of a specific operation step while sending an e-mail, the system, driven by the task, displays the interactive interface and information corresponding to the next subtask. In the user-driven mode, the user, based on a comprehensive perception and judgment of the ...


Application Information

IPC(8): G06F3/01; G06F3/0484; G06F3/0487; G06F9/44
CPC: G06F3/013; G06F3/0484; G06F3/0487
Inventor: 王川, 张建, 李晓军, 燕锐, 秦晋, 于雷
Owner: NAVY MEDICINE RES INST OF PLA