Joint sight line direction calculation method of left and right eye images of human eyes

A gaze-direction calculation technology applied in the fields of computer vision and image processing. It addresses the problem of noise in monocular images and achieves more accurate gaze-direction prediction.

Active Publication Date: 2018-01-05
BEIHANG UNIV


Problems solved by technology

[0006] The technology of the present invention overcomes the deficiencies of the prior art and provides a method for calculating the gaze direction from left and right eye images of the human eye. A neural network is used to extract the information factors contained in the images, and the network model is adjusted to finally predict the gaze direction of both eyes. By combining the image information of both eyes, the method solves the problem of large noise in the input monocular image that affects appearance-based gaze tracking, thereby achieving high-precision 3D gaze direction prediction.
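The claimed benefit of combining both eyes can be illustrated with a small simulation (a hedged sketch; the noise model, magnitudes, and fusion rule here are illustrative assumptions, not taken from the patent): independent per-eye noise partially cancels when the two estimates are fused.

```python
import numpy as np

rng = np.random.default_rng(42)

def normalize(v):
    """Scale vectors to unit length along the last axis."""
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

def angle_deg(a, b):
    """Angular error in degrees between unit vectors (a standard gaze metric)."""
    cos = np.clip(np.sum(a * b, axis=-1), -1.0, 1.0)
    return np.degrees(np.arccos(cos))

true_gaze = normalize(np.array([0.1, -0.1, 1.0]))

# Simulate noisy monocular estimates: true direction plus independent noise.
trials = 10_000
left = normalize(true_gaze + rng.normal(0, 0.1, size=(trials, 3)))
right = normalize(true_gaze + rng.normal(0, 0.1, size=(trials, 3)))
fused = normalize(left + right)  # naive binocular fusion by averaging

mono_err = angle_deg(left, true_gaze).mean()
fused_err = angle_deg(fused, true_gaze).mean()
```

With independent noise, the averaged estimate has a lower mean angular error than either monocular estimate, which is the intuition behind jointly using both eyes rather than a single noisy eye image.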

Method used




Embodiment Construction

[0034] The specific implementation of the present invention will be described in detail below in conjunction with the accompanying drawings.

[0035] The present invention provides a gaze calculation method that combines the left and right eye images of the human eye, taking human-eye image features as input to predict the gaze direction. The method places no additional requirements on the system and uses only the human eye images captured by a single camera as input. At the same time, by combining the image information of both eyes, the present invention can eliminate error cases caused by relatively large monocular noise, thereby achieving better robustness than other similar methods.

[0036] First, for the acquisition of human eye images, the present invention includes the following procedure. Using a single camera, capture an image that contains the region of the user's face. Use existin...
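The two-channel idea described above can be sketched as follows (a minimal NumPy stand-in with random, untrained weights; the image size, feature dimension, layer shapes, and angle convention are assumptions for illustration, not the patent's actual network):

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

class EyeChannel:
    """One channel: maps a flattened eye image to a feature vector
    (a stand-in for the patent's per-eye extractor; weights are random)."""
    def __init__(self, in_dim, feat_dim):
        self.W = rng.standard_normal((feat_dim, in_dim)) * 0.01
        self.b = np.zeros(feat_dim)

    def __call__(self, img):
        return relu(self.W @ img.ravel() + self.b)

class JointGazeModel:
    """Two-channel model: extract left/right eye features, concatenate them
    into a joint feature, regress (yaw, pitch), and convert the angles to
    a 3D unit gaze vector."""
    def __init__(self, img_shape=(36, 60), feat_dim=64):
        in_dim = img_shape[0] * img_shape[1]
        self.left = EyeChannel(in_dim, feat_dim)
        self.right = EyeChannel(in_dim, feat_dim)
        self.W_out = rng.standard_normal((2, 2 * feat_dim)) * 0.01

    def __call__(self, left_img, right_img):
        joint = np.concatenate([self.left(left_img), self.right(right_img)])
        yaw, pitch = self.W_out @ joint
        # Spherical angles -> unit 3D direction (camera coordinates).
        return np.array([np.cos(pitch) * np.sin(yaw),
                         np.sin(pitch),
                         np.cos(pitch) * np.cos(yaw)])

model = JointGazeModel()
gaze = model(rng.random((36, 60)), rng.random((36, 60)))
```

The key design point carried over from the description is the fusion step: the per-eye features are combined into one joint feature before regression, so the head sees both eyes at once rather than averaging two independent monocular predictions.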



Abstract

The invention provides a joint gaze direction calculation method using the left and right eye images of human eyes. The method includes: a two-eye information extraction model, in which a human eye image is input and the left-eye and right-eye information features contained in the image are each automatically extracted by a two-channel model; and a joint human-eye information feature extraction model, in which the two eye images of a user are input and the two-eye information is combined by the model to extract joint human-eye information features. According to the joint algorithm provided by the invention, the three-dimensional gaze direction is calculated from the input feature information. One application of the method is virtual reality and human-machine interaction: the gaze direction of the user is calculated from captured images of the user's eyes, enabling interaction with an intelligent system interface or a virtual-reality object. The method can also be widely used in fields such as training, games and entertainment, video surveillance, and medical care.
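Predicted 3D gaze directions from methods of this kind are conventionally evaluated by the angle between the predicted and ground-truth vectors. A minimal sketch of that metric (standard practice in gaze estimation, not part of the patent text):

```python
import numpy as np

def angular_error_deg(g_pred, g_true):
    """Angle in degrees between a predicted and a ground-truth 3D gaze vector."""
    g_pred = np.asarray(g_pred, dtype=float)
    g_true = np.asarray(g_true, dtype=float)
    cos = np.dot(g_pred, g_true) / (np.linalg.norm(g_pred) * np.linalg.norm(g_true))
    # Clip to guard against floating-point values just outside [-1, 1].
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

# Identical directions give 0 degrees; orthogonal directions give 90 degrees.
```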

Description

Technical Field

[0001] The invention relates to the fields of computer vision and image processing, and in particular to a method for calculating the gaze direction from the left and right eye images of the human eye.

Background

[0002] Gaze tracking (eye tracking) is of great significance for user behavior understanding and efficient human-computer interaction. More than 80% of the information perceivable by humans is received through the eyes, and more than 90% of it is processed by the visual system; gaze is therefore an important clue to the interaction between a person and the outside world. In recent years, the rapid development of virtual reality and human-computer interaction technology has made the application value of gaze tracking increasingly prominent; on the other hand, gaze direction calculation remains a very challenging problem in the field of computer vision. [0003] The current eye-tracking technology is fundam...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/04, G06N3/08, G06F17/18, G06K9/00
CPC: A61B3/113
Inventors: 陆峰, 陈小武, 赵沁平
Owner: BEIHANG UNIV