
Eye movement tracking calibration method based on head-mounted eye movement module

A head-mounted eye-tracking technology, applied to mechanical mode conversion, character and pattern recognition, acquisition/recognition of eyes, etc. It addresses the problems of cumbersome calibration procedures and high demands on the user, achieving a fast and accurate calibration process with low user requirements, which is conducive to adoption and promotion.

Active Publication Date: 2020-02-07
QINGDAO RES INST OF BEIHANG UNIV

AI Technical Summary

Problems solved by technology

[0003] Existing mainstream eye-tracking modules require the user to stare at multiple calibration points for per-user calibration. This method demands that the user concentrate on fixating points on the screen; the process is cumbersome and places high demands on the user.


[Figures 1–3: Eye movement tracking calibration method based on head-mounted eye movement module]


Embodiment Construction

[0022] The present application will be further described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0023] As shown in Figure 1, the user wears an eye movement module to watch a video displayed on the screen. The eye movement module includes two cameras: an eye camera that captures the user's eyes and a world camera that captures the display screen.

[0024] With reference to Figures 1 to 3, the eye-tracking calibration method based on the head-mounted eye movement module comprises the following steps:


[0026] (1) The user wears the above-mentioned eye movement module;

[0027] (2) A calibration video containing a moving object, 50 to 300 frames in length, is played on the screen. The user watches the calibration video while the two cameras capture the user's eyes and...
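Step (3) of the method (see the Abstract) locates the moving object through "a video saliency detection method", which the text does not specify further. As an illustrative stand-in only, a minimal saliency cue for a moving calibration target is frame differencing; the function names below are hypothetical, not from the patent:

```python
import numpy as np

def frame_difference_saliency(prev_frame, frame):
    """Crude saliency stand-in: the absolute difference between consecutive
    grayscale frames highlights the moving calibration object."""
    diff = np.abs(frame.astype(float) - prev_frame.astype(float))
    peak = diff.max()
    return diff / peak if peak > 0 else diff

def moving_object_position(prev_frame, frame):
    """Return (x, y) pixel coordinates of the strongest motion response,
    analogous to reading the peak of a per-frame saliency map."""
    sal = frame_difference_saliency(prev_frame, frame)
    y, x = np.unravel_index(np.argmax(sal), sal.shape)
    return int(x), int(y)
```

A real implementation would use a learned video saliency model and offset the peak by the video window's position on screen, as step (3) describes.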


Abstract

Disclosed is an eye movement tracking calibration method based on a head-mounted eye movement module. A head-mounted module with two cameras is used, and the module is calibrated automatically while the user watches a calibration video played on a screen, reducing the demands on the user and the complexity of the calibration process while improving calibration efficiency. The method comprises the following steps: (1) the user wears an eye movement module on the head, the module comprising two cameras, namely an eye camera for shooting the user's eyes and a world camera for shooting the display screen; (2) a calibration video containing a moving object is played on the screen, and the user watches it; (3) a saliency map set corresponding to the calibration video is obtained with a video saliency detection method, and the position of the moving object on the screen is calculated from the saliency map set and the position of the video window; (4) user gaze vectors corresponding to the frames of eye images are calculated with an uncalibrated eye movement tracking algorithm; (5) the mapping relationship between the gaze vectors and the screen is calculated from the calibration video frames together with their corresponding gaze vectors and screen images, so that the screen gaze point corresponding to any gaze vector can be computed, thereby realizing automatic calibration.
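Step (5) fits a mapping from gaze vectors to screen points from the automatically collected (gaze vector, object position) pairs. The patent does not publish the model form; the sketch below assumes a simple affine map fitted by least squares, purely for illustration:

```python
import numpy as np

def fit_gaze_mapping(gaze_vectors, screen_points):
    """Least-squares fit of an affine map from 3-D gaze vectors to 2-D
    screen points, standing in for the patent's step (5). The affine
    model is an assumption; the patent does not specify one.

    gaze_vectors: (N, 3) array, screen_points: (N, 2) array.
    Returns M with shape (4, 2) so that [g, 1] @ M ~= screen point."""
    G = np.hstack([gaze_vectors, np.ones((len(gaze_vectors), 1))])
    M, *_ = np.linalg.lstsq(G, screen_points, rcond=None)
    return M

def gaze_to_screen(M, gaze_vector):
    """Map a single gaze vector to a screen gaze point with the fitted map."""
    return np.append(gaze_vector, 1.0) @ M
```

With such a map fitted once from the calibration video, every subsequent gaze vector from the uncalibrated tracker can be converted to a screen gaze point without asking the user to fixate calibration dots.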

Description

Technical field

[0001] The invention relates to a head-mounted eye movement module and an automatic calibration method for eye-tracking algorithms based on the module, belonging to the fields of computer vision and computer graphics.

Background

[0002] With the rapid development of eye-tracking technology, eye tracking is attracting increasing attention in the field of computer vision. At present, most head-mounted eye-tracking modules must be calibrated for each user before use, for two reasons. First, most head-mounted eye movement modules can only estimate the user's gaze vector, whereas applications actually need the gaze point; computing the gaze point requires the mapping relationship from the gaze vector to the screen, and that mapping is obtained through the calibration process. Second, current mainstream head-mounted eye movement modules can only estimate the optical ax...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01, G06K9/00, G06T7/246
CPC: G06F3/013, G06T7/246, G06T2207/10016, G06V40/193, Y02D10/00
Inventor: 陆峰, 蒋雨薇, 于洋
Owner: QINGDAO RES INST OF BEIHANG UNIV