
Target tracking method and system based on eye movement tracking and storage medium

A technology combining eye movement tracking and target detection, applied in the field of vision; it addresses problems such as untargeted detection, high power consumption, and the inability to identify the fixation point, thereby improving comfort of use and expanding the market.

Active Publication Date: 2021-08-13
NORTHWEST UNIV(CN)

AI Technical Summary

Problems solved by technology

[0002] Eye tracking can reflect the positional relationship between eye movements and fixation changes, but it cannot identify what the fixation point is; subsequent manual identification and judgment are needed to analyze an athlete's physical condition and understand the athlete's psychological changes in order to formulate a more scientific training method.
[0003] Applying target detection in a VR/AR environment can accomplish object recognition and localization, but current target detection algorithms locate and classify all objects of interest in the image, which consumes high power, is untargeted, and generates a large amount of irrelevant information.



Examples


Embodiment 1

[0057] The steps of the target tracking method of this embodiment are as follows:

[0058] Step 1: capture, in real time, the human-eye-region video stream I_t with the infrared camera module and the foreground video stream G_t (the scene visible to the human eye) with the wide-angle camera module;

[0059] In this embodiment, α = 1280/640 = 2, ω = 720/360 = 2, pr° = 5°, Dis = 1000 pixels, and β is set to 0.3. Figure 1(a1) is one frame of the human-eye image from I_t; at the same time, the foreground video stream G_t visible to the human eye is collected through the wide-angle camera, and Figure 2(b1) is the foreground image of G_t from the same frame sequence as (a1);

[0060] Step 2: using the method disclosed in the literature "Wang Peng, Chen Yuanyuan, Shao Minglei, et al. Smart Home Controller Based on Eye Tracking [J]. Journal of Electrical Machinery and Control, 2020, v.24, No.187(05): 155-164", perform pupil center detection in each frame of the human-eye-region video stream I_t to obtain...
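The cited paper's exact pupil-detection procedure is not reproduced in this excerpt. As a rough illustration of the idea only, a common baseline for infrared eye images is to threshold the dark pupil region and take its centroid; the `pupil_center` helper below is a hypothetical sketch under that assumption, not the method of the cited literature:

```python
def pupil_center(frame, dark_threshold=60):
    """Estimate the pupil centre of a grayscale eye image as the
    centroid of its darkest pixels.

    NOTE: illustrative baseline only -- NOT the method of the cited
    paper.  `frame` is a list of rows of 0-255 intensities; the pupil
    is assumed to be the dark region under infrared illumination.
    """
    xs = ys = n = 0
    for y, row in enumerate(frame):
        for x, px in enumerate(row):
            if px < dark_threshold:   # candidate pupil pixel
                xs += x
                ys += y
                n += 1
    if n == 0:
        return None                   # no dark region found
    return (xs / n, ys / n)           # (x, y) centroid in pixels
```

In practice a morphological clean-up or ellipse fit would follow, but the centroid already gives a usable per-frame pupil coordinate for the tracking pipeline.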

Embodiment 2

[0065] This embodiment differs from Embodiment 1 in the specific steps of the pupil center detection method in Step 2:

[0066] Taking the human-eye-region video stream I_t as input, the pyramidal LK optical flow method (Bouguet J Y. Pyramidal Implementation of the Lucas Kanade Feature Tracker. OpenCV Documents, 1999) is used to estimate the motion state of the eyes, and the frames of I_t that are in the blink state or the nystagmus state are eliminated; the output is the video stream I_t' with the blink-state and nystagmus-state frames removed. In nystagmus the flow vector is very small, and in this embodiment adjacent frames are judged to be in nystagmus when their optical-flow magnitude is less than 100 pixels; in blinking the flow vector is particularly large, and in this embodiment adjacent frames are judged to be blinking when their optical-flow magnitude is greater than 6000 pixels;

[0067] Step 2.1.2, remove the video str...
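The blink/nystagmus filtering described in paragraph [0066] can be sketched as a threshold test on the flow magnitude of each adjacent frame pair. In the sketch below, `flow_magnitude` is a hypothetical callback standing in for the pyramidal LK estimate (e.g. OpenCV's `calcOpticalFlowPyrLK` summed over tracked points); the 100- and 6000-pixel thresholds are the values given in this embodiment:

```python
def filter_unusable_frames(frames, flow_magnitude,
                           nystagmus_max=100, blink_min=6000):
    """Drop frames judged to be in a nystagmus or blink state.

    `flow_magnitude(prev, cur)` is a hypothetical callback returning
    the summed optical-flow displacement (in pixels) between two
    adjacent frames.  Per [0066]: flow below `nystagmus_max` marks
    nystagmus, flow above `blink_min` marks a blink; both kinds of
    frame are removed from the stream.
    """
    kept = []
    for prev, cur in zip(frames, frames[1:]):
        mag = flow_magnitude(prev, cur)
        if mag < nystagmus_max:      # nystagmus-state frame: drop
            continue
        if mag > blink_min:          # blink-state frame: drop
            continue
        kept.append(cur)
    return kept
```

The surviving frames form the stream I_t' on which pupil detection then runs, so the detector never wastes work on frames where the pupil is occluded or unstable.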

Embodiment 3

[0069] This embodiment differs from Embodiment 2 in that Step 3 is specifically:

[0070] The foreground video stream G_t is processed with a perceptual hashing algorithm to generate a "fingerprint" string for each frame. The "fingerprint" strings of adjacent frames are compared to judge their similarity: if the similarity exceeds 98%, the target detection result of the previous frame is used directly; if the similarity does not exceed 98%, target detection is carried out. The target detection method in this embodiment adopts YOLOv4; for details, refer to the method disclosed in "Bochkovskiy A, Wang C Y, Liao H. YOLOv4: Optimal Speed and Accuracy of Object Detection [J]. 2020".
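The frame-skipping logic of Step 3 can be sketched in a few lines. The excerpt does not specify which perceptual hash is used, so the sketch below substitutes a simple average hash (aHash) for the fingerprint and a bit-wise match ratio for the similarity; `detector` stands in for the YOLOv4 call:

```python
def average_hash(frame, size=8):
    """Compute a simple average-hash 'fingerprint' bit string for a
    grayscale frame (list of rows of 0-255 intensities).

    NOTE: illustrative aHash sketch, not necessarily the perceptual
    hash used in the patent; downsampling is crude nearest-neighbour.
    """
    h, w = len(frame), len(frame[0])
    small = [[frame[r * h // size][c * w // size] for c in range(size)]
             for r in range(size)]
    mean = sum(sum(row) for row in small) / (size * size)
    # Each bit: 1 if the sampled pixel is brighter than the mean.
    return ''.join('1' if px > mean else '0'
                   for row in small for px in row)

def similarity(hash_a, hash_b):
    """Fraction of matching bits between two fingerprints."""
    return sum(a == b for a, b in zip(hash_a, hash_b)) / len(hash_a)

def detect_with_reuse(frames, detector, threshold=0.98):
    """Run `detector` (e.g. a YOLOv4 wrapper) only when adjacent
    frames differ enough; otherwise reuse the previous detections."""
    results, prev_hash, prev_result = [], None, None
    for frame in frames:
        fp = average_hash(frame)
        if prev_hash is not None and similarity(fp, prev_hash) > threshold:
            result = prev_result      # frames nearly identical: reuse
        else:
            result = detector(frame)  # frames differ: re-detect
        results.append(result)
        prev_hash, prev_result = fp, result
    return results
```

Because fingerprint comparison is orders of magnitude cheaper than a detector forward pass, this is where the scheme saves power on near-static gaze scenes.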

[0071] Real-time performance of the schemes of Embodiments 1-3 is analyzed by frame rate (motion generally appears continuous to the human eye once the frame rate reaches 15 frames/second, so this is considered basically real-time; the higher the frame rate, ...



Abstract

The invention provides a target detection method and system based on eye movement tracking, and a storage medium. The disclosed scheme comprises the following steps: acquiring eye movement information and target detection information, and determining the central visual area of the human eyes in a foreground image according to the eye movement information; then judging and displaying the target detection information at the central visual area, and outputting the predicted position and predicted category of the object at the central visual area. By combining eye movement tracking technology with target detection technology, the method and system detect targets in the human gaze area in a targeted manner, making it convenient to obtain the user's points of interest.

Description

Technical field

[0001] The invention belongs to the field of visual technology, and in particular relates to a target tracking method based on eye movement tracking.

Background technique

[0002] Eye tracking can reflect the positional relationship between eye movements and fixation changes, but it cannot clarify what the fixation point is; subsequent manual identification and judgment are needed to analyze an athlete's physical condition and understand the athlete's psychological changes in order to formulate a more scientific training method.

[0003] Applying target detection in a VR/AR environment can accomplish object recognition and localization, but current target detection algorithms locate and classify all objects of interest in the image, which consumes high power, is untargeted, and generates a large amount of irrelevant information.

Contents of the invention

[0004] Aiming at the deficiencies of the prior art, the present invention provides an object detecti...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/62
CPC: G06V40/19; G06V40/193; G06V2201/07; G06F18/2414
Inventor: 彭进业, 邓乐玲, 赵万青, 李斌, 彭先霖, 胡琦瑶, 张晓丹, 王珺
Owner NORTHWEST UNIV(CN)