
Eye-movement tracking method and system based on recalling and annotation

An eye-tracking and mouse-annotation technology, applied in the fields of visual cognition, social computing, and human-computer interaction. It addresses the problems of complex operation, high software and hardware costs, and difficulty of widespread adoption found in existing eye-tracking systems, and achieves high acquisition accuracy, low economic cost, and a high degree of user comfort.

Active Publication Date: 2016-11-23
ZHEJIANG UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0004] However, this type of method mainly has the following defects. First, the eye-tracking system has a complex structure and high hardware and software costs, and is therefore difficult to popularize. Second, operating the eye-tracking system is complicated: the user must work under the guidance of professional technicians, must complete a data calibration procedure before data collection, and must minimize head movement during data collection. Third, the data collection efficiency of the eye-tracking system is low: one system can collect eye-movement data from only one user at a time, and because the procedure is complicated, it is impossible to collect a large amount of valid eye-movement data in a short period of time.




Embodiment Construction

[0025] A method and system for eye-movement tracking based on recalling and annotation according to the present invention will be clearly and completely described below in conjunction with the accompanying drawings. Apparently, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the protection scope of the present invention.

[0026] See Figure 1, a basic flow diagram of the eye-movement tracking method based on recalling and annotation provided by an embodiment of the present invention; the method mainly includes the following steps:

[0027] (1) Define the ways in which users recall and annotate, and set corresponding tasks. There are three ways of recalling and annotating. In Method 1, the user observes the image stimulus source; the image stimulus source is then hidden, and the user recalls the position of the ...
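As an illustration of step (1), one way to represent a recall-and-annotation task is a small configuration object. The Python sketch below is a hypothetical representation; the class and field names (RecallTask, observe_seconds, and so on) are assumptions for illustration and are not terms defined in the patent.

```python
# Hypothetical sketch of a step (1) task definition; names are illustrative only.
from dataclasses import dataclass

@dataclass
class RecallTask:
    mode: int                # 1, 2 or 3 -- which of the three recall-and-annotation ways to use
    image_path: str          # the image stimulus source to present
    observe_seconds: float   # how long the stimulus stays visible before being hidden (Method 1)
    instruction: str         # task text published to the user in step (2)

# Method 1 as described above: observe the stimulus, hide it, then recall and
# annotate the remembered fixation positions on the now-blank display.
task = RecallTask(mode=1,
                  image_path="stimulus_01.png",
                  observe_seconds=5.0,
                  instruction="Observe the image; afterwards, mark where you looked and in what order.")
```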



Abstract

The invention provides an eye-movement tracking method based on recalling and annotation. The eye-movement tracking method comprises the following steps: (1) defining a recalling and annotation manner for the user and setting a corresponding task; (2) presenting a task instruction on a display screen and publishing the task to the user; (3) presenting an image stimulating source on the display screen; (4) allowing the user to observe the image stimulating source on the display screen according to the task requirements; (5) allowing the user to recall the fixation points generated while executing the task and to annotate the coordinate positions and sequence of those fixation points on the display screen; if in training mode, skipping to step (6); if in normal mode, skipping to step (8); (6) not presenting, in the normal mode, any image stimulating source that has already been presented to the user in the training mode; (7) evaluating the user's training performance; and (8) when the user enters the normal testing mode, recording and storing the fixation point positions annotated by the user and their sequence data. The invention further provides a system utilizing the method.
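To make the step sequence easier to follow, here is a minimal Python sketch of the session loop implied by steps (1) through (8). The class and method names (RecallAnnotationSession, present_stimulus, collect_annotation, and so on) are illustrative assumptions; display and input handling are left as stubs because the abstract does not prescribe a concrete implementation.

```python
# Illustrative sketch of the recall-and-annotation flow; not the patented system itself.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Stimulus:
    image_path: str
    instruction: str                 # task instruction shown on the display, step (2)

@dataclass
class Annotation:
    fixations: List[Tuple[int, int]] = field(default_factory=list)  # annotated (x, y) screen positions
    order: List[int] = field(default_factory=list)                   # recalled viewing order

class RecallAnnotationSession:
    def __init__(self, stimuli: List[Stimulus], training: bool):
        # Per step (6), the stimuli given to a normal-mode session should not
        # include any stimulus already shown to this user in training mode.
        self.stimuli = stimuli
        self.training = training     # training mode vs. normal (test) mode
        self.records: List[Annotation] = []

    def run(self) -> List[Annotation]:
        for stim in self.stimuli:
            self.show_instruction(stim.instruction)       # step (2): publish the task
            self.present_stimulus(stim)                   # steps (3)-(4): user observes the stimulus
            annotation = self.collect_annotation(stim)    # step (5): recall and annotate fixations
            if self.training:
                self.evaluate_training(stim, annotation)  # step (7): assess training performance
            else:
                self.records.append(annotation)           # step (8): record and store annotations
        return self.records

    # Stubs standing in for the display screen and annotation input devices.
    def show_instruction(self, text: str) -> None: ...
    def present_stimulus(self, stim: Stimulus) -> None: ...
    def collect_annotation(self, stim: Stimulus) -> Annotation: return Annotation()
    def evaluate_training(self, stim: Stimulus, annotation: Annotation) -> None: ...
```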

Description

Technical Field

[0001] The present invention relates to the fields of human-computer interaction, visual cognition, social computing, etc., and in particular to an eye-movement tracking method and system based on recalling and annotation.

Background Technology

[0002] At present, many scientific research and commercial applications, such as psychology, advertising design, and ergonomics, need to study the user's eye movement behavior when observing things, such as the positions of fixation points and their movement trajectories, for further analysis of the spatio-temporal distribution of visual attention.

[0003] Existing methods mainly use optical eye-movement tracking to obtain eye-movement data such as fixation points. The main process is to capture human eye images with a camera device, extract eye-image features with image processing methods, and then establish a mapping relationship between the eye-image features and the eye movement behavior, from which the fixation points are finally calculated...
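For context on the conventional optical pipeline summarized in paragraph [0003] (the approach whose cost and complexity the invention seeks to avoid), the mapping from extracted eye-image features to a gaze point is commonly realized as a regression calibrated on known screen targets. The sketch below, assuming NumPy and a second-order polynomial model over a 2-D pupil-glint feature vector, illustrates that background technique only; it is not the patented recall-and-annotation method.

```python
# Background illustration only: polynomial regression mapping eye-image
# features (e.g. pupil-glint vectors) to screen gaze coordinates.
import numpy as np

def design_matrix(feats: np.ndarray) -> np.ndarray:
    """Second-order polynomial terms of 2-D feature vectors, shape (n, 2) -> (n, 6)."""
    fx, fy = feats[:, 0], feats[:, 1]
    return np.column_stack([np.ones_like(fx), fx, fy, fx * fy, fx**2, fy**2])

def calibrate(feats: np.ndarray, screen_pts: np.ndarray) -> np.ndarray:
    """Least-squares fit of the feature -> (x, y) screen-coordinate mapping."""
    coeffs, *_ = np.linalg.lstsq(design_matrix(feats), screen_pts, rcond=None)
    return coeffs                      # shape (6, 2)

def estimate_gaze(feats: np.ndarray, coeffs: np.ndarray) -> np.ndarray:
    """Predict fixation points for new feature vectors after calibration."""
    return design_matrix(feats) @ coeffs
```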


Application Information

IPC(8): G06F3/01; G06K9/00
CPC: G06F3/013; G06V40/19
Inventor: 程时伟
Owner: ZHEJIANG UNIV OF TECH