Eye tracking method and system driven by multi-model fusion for mobile devices

A mobile-device eye-tracking technology, applied to biological neural network models, mechanical mode conversion, eye acquisition/recognition, etc. It addresses problems such as weak CPU processing power, low camera resolution, and otherwise limited hardware, and achieves the effects of improved eye-tracking accuracy, reduced computation, and avoidance of manual parameter adjustment.

Active Publication Date: 2022-06-21
ZHEJIANG UNIV OF TECH

AI Technical Summary

Problems solved by technology

However, current mobile devices present the following problems: (1) limited hardware, such as weak CPU processing power, low camera resolution, and small memory capacity; (2) complex usage environments with large variations in lighting.


Examples


Embodiment Construction

[0024] The multi-model fusion-driven eye tracking method for mobile devices of the present invention is described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present invention, not all of them, and they are not intended to limit the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.

[0025] Referring to Figure 1, the multi-model fusion-driven eye tracking method for mobile devices proposed by an embodiment of the present invention includes the following steps:

[0026] (1) Eye movement feature analysis based on an appearance model;

[0027] First, a data set is prepared: the synthesized human eye images are subjected to preprocessing operations such as scaling, grayscale conversion, and filtering, and are then converted into a ...
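The paragraph above is truncated in the source, but the preprocessing operations it names (scaling, grayscale conversion, filtering) can be illustrated with a minimal sketch. The helper below is hypothetical, not the patent's implementation; it assumes OpenCV, and the target size and Gaussian kernel are chosen only for illustration.

```python
import cv2
import numpy as np

def preprocess_eye_image(img_bgr: np.ndarray, size=(64, 32)) -> np.ndarray:
    """Hypothetical sketch: scale, grayscale, and filter a human eye image.

    The target size and filter kernel are illustrative assumptions; the patent
    text only names the operation types (scaling, grayscale, filtering).
    """
    # Scale the eye patch to a fixed input resolution.
    resized = cv2.resize(img_bgr, size, interpolation=cv2.INTER_AREA)
    # Convert to a single-channel grayscale image.
    gray = cv2.cvtColor(resized, cv2.COLOR_BGR2GRAY)
    # Smooth with a Gaussian filter to suppress sensor noise.
    filtered = cv2.GaussianBlur(gray, (3, 3), 0)
    # Normalize to [0, 1] before feeding an appearance model.
    return filtered.astype(np.float32) / 255.0

if __name__ == "__main__":
    # Dummy input standing in for a synthesized human eye image.
    dummy = np.random.randint(0, 256, (120, 200, 3), dtype=np.uint8)
    print(preprocess_eye_image(dummy).shape)  # (32, 64)
```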



Abstract

A multi-model fusion-driven eye tracking method for mobile devices includes the following steps: (1) eye movement feature analysis based on an appearance model; (2) eye movement data mapping based on a feature model; (3) eye movement gaze calculation based on multi-model fusion. The present invention also provides a multi-model fusion-driven eye tracking system for mobile devices, comprising the following modules, connected in sequence and passing data to one another: an eye movement feature analysis module based on an appearance model; an eye movement data mapping module based on a feature model; and an eye movement gaze calculation module based on multi-model fusion. The invention extends current eye tracking methods on mobile devices and improves eye tracking accuracy, calculation speed, and stability in complex interactive environments.
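For orientation only, the sketch below wires three placeholder modules in the sequence the abstract describes (appearance-model analysis, feature-model mapping, fusion-based gaze calculation). The module internals and the fixed weighted-average fusion rule are assumptions made for illustration; the abstract does not disclose them.

```python
import numpy as np

class AppearanceModule:
    """Placeholder for appearance-model feature analysis (e.g. a CNN)."""
    def estimate_gaze(self, eye_image: np.ndarray) -> np.ndarray:
        # Stand-in: return a 2D gaze estimate in normalized screen coordinates.
        return np.array([0.52, 0.40])

class FeatureModule:
    """Placeholder for the feature-model mapping from image features to gaze."""
    def estimate_gaze(self, eye_features: np.ndarray) -> np.ndarray:
        return np.array([0.48, 0.44])

class FusionModule:
    """Assumed fusion rule: a fixed weighted average of the two estimates."""
    def __init__(self, appearance_weight: float = 0.5):
        self.w = appearance_weight

    def fuse(self, g_app: np.ndarray, g_feat: np.ndarray) -> np.ndarray:
        return self.w * g_app + (1.0 - self.w) * g_feat

# Modules connected in sequence, feeding data forward as in the abstract.
eye_image = np.zeros((32, 64), dtype=np.float32)
eye_features = np.array([0.1, 0.2, 0.3, 0.4])

g_app = AppearanceModule().estimate_gaze(eye_image)
g_feat = FeatureModule().estimate_gaze(eye_features)
print(FusionModule(0.5).fuse(g_app, g_feat))  # fused gaze point estimate
```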

Description

Technical field

[0001] The present invention relates to an eye tracking method and system.

Background technique

[0002] Commonly used eye tracking methods mainly fall into two categories: appearance-model-based and feature-model-based. An appearance-model-based eye tracking method takes the appearance image of the human eye as input and constructs a convolutional neural network as the appearance model to extract non-interpretable hidden features from the human eye image. A feature-model-based eye tracking method takes clear, interpretable human eye image features as input: the human eye image is preprocessed, image features are extracted, and a mapping equation between the image features and the eye gaze point is then established and used for gaze calculation. The advantages and disadvantages of the two types of methods are as follows: the advantage of the appearance-model-based eye tracking method is that it adopts a neural network model, which is less affected by ambient lighting...
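The "mapping equation" mentioned above is not specified in this excerpt. A common choice in feature-model gaze estimation is a second-order polynomial fitted to calibration data by least squares; the sketch below follows that assumption, with a hypothetical 2D feature vector (for example, a pupil-center offset) standing in for the extracted image features.

```python
import numpy as np

def poly_terms(f: np.ndarray) -> np.ndarray:
    """Second-order polynomial terms of a 2D feature vector (fx, fy)."""
    fx, fy = f
    return np.array([1.0, fx, fy, fx * fy, fx * fx, fy * fy])

def fit_mapping(features: np.ndarray, gaze_points: np.ndarray) -> np.ndarray:
    """Fit coefficients C so that poly_terms(f) @ C approximates the gaze point.

    features:    (N, 2) calibration feature vectors
    gaze_points: (N, 2) known on-screen gaze targets
    Returns C with shape (6, 2), one column per screen coordinate.
    """
    A = np.stack([poly_terms(f) for f in features])      # (N, 6) design matrix
    C, *_ = np.linalg.lstsq(A, gaze_points, rcond=None)  # least-squares fit
    return C

def predict_gaze(C: np.ndarray, f: np.ndarray) -> np.ndarray:
    return poly_terms(f) @ C

# Toy calibration: 9 synthetic feature/target pairs generated from a known C.
rng = np.random.default_rng(0)
feats = rng.uniform(-1, 1, (9, 2))
true_C = rng.normal(size=(6, 2))
targets = np.stack([poly_terms(f) @ true_C for f in feats])

C = fit_mapping(feats, targets)
print(predict_gaze(C, feats[0]), targets[0])  # the two should match closely
```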


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06F3/01; G06V10/80; G06V40/16; G06V40/18; G06V10/774; G06V10/82; G06K9/62; G06N3/04
CPC: G06F3/013; G06V40/165; G06V40/19; G06N3/045; G06F18/25; G06F18/214
Inventor: 程时伟, 张章伟
Owner: ZHEJIANG UNIV OF TECH