
Rapid and accurate human eye positioning method and sight estimation method based on human eye positioning

A human eye positioning technology applied in the field of line-of-sight (gaze) estimation based on human eye positioning. It addresses the problem that existing methods rarely achieve real-time performance, and achieves robustness to large-scale head movement and fast estimation of the gaze direction.

Active Publication Date: 2015-07-08
Assignee: 世迈嘉医疗科技(丽水)有限公司

AI Technical Summary

Problems solved by technology

In addition, existing human eye positioning and line-of-sight estimation methods rarely achieve real-time performance.




Embodiment Construction

[0044] In a preferred embodiment of the present invention, eye positioning is performed on the face image shown in Figure 4 (width w, height h) to find the position and radius of the left eye and the right eye in the image.

[0045] Referring to Figure 1, the human eye positioning method of the present invention includes the following steps:

[0046] Step 1: Construct two types of convolution kernels. The two types differ only in the weight at the kernel center: the convolution kernel K_r has a nonzero central weight, while the convolution kernel K'_r has a central weight of 0. Each type contains kernels of different radii.

[0047] In this embodiment, 0.1w convolution kernels K_r are constructed, all with circular boundaries and different radii r; the maximum radius r_max and the minimum radius r_min among these radii are 0.2w and 0.1w respectively. ...
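The kernel construction described in paragraphs [0046]–[0047] might be sketched as follows. This is an illustrative reading of the patent text, not its exact formulation: the ring-shaped weight layout, the normalisation, and the helper names `circular_kernel` and `build_kernel_bank` are assumptions.

```python
import numpy as np

def circular_kernel(radius, center_weight):
    """Square kernel whose nonzero weights lie on a circle of the given
    radius; the centre pixel gets `center_weight`. Illustrative guess at
    the patent's K_r (nonzero centre) / K'_r (zero centre) kernels."""
    yy, xx = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    dist = np.sqrt(xx ** 2 + yy ** 2)
    kernel = np.where(np.abs(dist - radius) < 0.5, 1.0, 0.0)  # circular boundary
    kernel[radius, radius] = center_weight  # K_r: nonzero centre, K'_r: zero
    return kernel / max(kernel.sum(), 1e-9)  # normalise so responses compare

def build_kernel_bank(w, r_min_frac=0.1, r_max_frac=0.2):
    """Build both kernel families over radii in [0.1w, 0.2w], one kernel
    per integer radius (the patent's exact radius sampling is not given)."""
    radii = range(int(r_min_frac * w), int(r_max_frac * w) + 1)
    K = {r: circular_kernel(r, center_weight=1.0) for r in radii}   # K_r
    Kp = {r: circular_kernel(r, center_weight=0.0) for r in radii}  # K'_r
    return K, Kp
```

For a face image of width w = 100, this yields kernels with radii from 10 to 20 pixels, matching the r_min = 0.1w and r_max = 0.2w bounds stated above.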



Abstract

The invention discloses a human eye positioning method comprising the steps of: constructing two kinds of convolution kernels, each kind comprising kernels with different radii; cutting out the eye-region image from the face image; convolving both kinds of kernels, at each radius, with the eye-region image to obtain convolved images; and dividing the two kinds of convolved images to obtain a convolution factor matrix. The center and radius of each eye are obtained from the position of the maximum value of the factor matrix and the radius of the corresponding convolution kernel. The invention further discloses a sight estimation method based on this eye positioning: facial feature points and the head orientation are computed from the face image, and the gaze direction is finally determined from the eye center coordinates, the eye corner (canthus) coordinates, and the head orientation. Because the convolutions are computed via the Fourier transform and the gray value at the eye center is taken into account, interference from eyebrows, eyelids, glasses, and illumination is reduced, and the centers and radii of the human eyes, as well as the gaze directions, can be estimated rapidly and accurately.
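The positioning pipeline summarised in the abstract — FFT-based convolution with both kernel families, division of the two responses into a factor matrix, and a maximum search over position and radius — might be sketched as below. The direction of the division, the `eps` guard, and the `locate_eye` interface are my assumptions, not the patent's exact formulation.

```python
import numpy as np
from numpy.fft import fft2, ifft2

def locate_eye(eye_region, K, Kp, eps=1e-6):
    """Sketch of the positioning step: convolve the eye-region image with
    both kernel families via the FFT, divide the two responses to get a
    factor matrix per radius, and take the global maximum over position
    and radius. `K` / `Kp` map radius -> kernel (with / without a centre
    weight). The ratio K'_r / K_r is large where a bright ring surrounds
    a dark centre (my reading of "the gray value at the eye center is
    taken into account" -- the patent may define the factor differently)."""
    best = (-np.inf, None, None)  # (factor, (row, col), radius)
    H, W = eye_region.shape
    F = fft2(eye_region)  # image spectrum, reused for every kernel
    for r in K:
        def conv(kernel):
            # Zero-pad the kernel to image size and roll its centre to the
            # origin so spectrum multiplication gives a circular convolution.
            pad = np.zeros((H, W))
            kh, kw = kernel.shape
            pad[:kh, :kw] = kernel
            pad = np.roll(pad, (-(kh // 2), -(kw // 2)), axis=(0, 1))
            return np.real(ifft2(F * fft2(pad)))
        ratio = conv(Kp[r]) / (conv(K[r]) + eps)  # factor matrix for radius r
        idx = np.unravel_index(int(np.argmax(ratio)), ratio.shape)
        if ratio[idx] > best[0]:
            best = (float(ratio[idx]), idx, r)
    return best[1], best[2]  # eye centre (row, col) and radius
```

Computing each convolution as a product of spectra costs O(HW log HW) per kernel regardless of kernel radius, which is what makes the per-frame search over many radii fast enough for real-time use.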

Description

Technical field

[0001] The present invention relates to the field of machine vision and image processing, in particular to a fast and accurate eye positioning method and a line-of-sight estimation method based on human eye positioning.

Background technique

[0002] Human eye positioning technology is an important part of computer vision applications. With the rapid development of computer technology, human eye positioning is widely used in face localization, iris recognition, eye disease detection, gaze tracking, human-computer interaction, and assistive technology for the disabled.

[0003] In the field of iris recognition, locating the center and boundary of the human eye is a critical step that directly affects the accuracy of subsequent recognition. At present, there are two classic methods for locating the human eye in this field: one is the integro-differential operator over a circular boundary proposed by Daugman; the other first detects edges in the image...
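For context, Daugman's integro-differential operator mentioned in [0003] searches for the (centre, radius) that maximises the Gaussian-smoothed radial derivative of the mean image intensity around a circle. A minimal sketch, with discretisation choices (64 angular samples, 5-tap Gaussian) and parameter names that are mine:

```python
import numpy as np

def daugman_operator(image, centers, radii, sigma=1.0):
    """For each candidate centre, compute the mean intensity on circles of
    increasing radius, smooth the radial derivative of that profile with a
    small Gaussian, and keep the (centre, radius) maximising its magnitude."""
    best = (-np.inf, None, None)  # (score, centre, radius)
    thetas = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
    H, W = image.shape
    for (cy, cx) in centers:
        means = []
        for r in radii:
            # sample the circle of radius r; approximates the contour
            # integral of I divided by the circumference 2*pi*r
            ys = np.clip(np.round(cy + r * np.sin(thetas)).astype(int), 0, H - 1)
            xs = np.clip(np.round(cx + r * np.cos(thetas)).astype(int), 0, W - 1)
            means.append(image[ys, xs].mean())
        deriv = np.abs(np.diff(means))  # |d/dr| of the radial profile
        # crude 5-tap Gaussian smoothing of the derivative profile
        k = np.exp(-0.5 * (np.arange(-2, 3) / sigma) ** 2)
        smooth = np.convolve(deriv, k / k.sum(), mode="same")
        i = int(np.argmax(smooth))
        if smooth[i] > best[0]:
            best = (float(smooth[i]), (cy, cx), radii[i])
    return best[1], best[2]
```

The exhaustive scan over centres and radii is what makes this family of methods accurate but slow, which motivates the FFT-based formulation of the present invention.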


Application Information

IPC(8): G06K9/00; G06K9/62
Inventors: 刘洪海, 蔡海滨, 张剑华, 陈胜勇, 朱向阳