Human eye positioning and human eye state recognition method

A human eye positioning and eye state recognition technology, applied to character and pattern recognition, instruments, computer parts, and related fields. It addresses problems such as low accuracy, strong susceptibility to external interference, and low gray values, with the effect of improving accuracy and speed, narrowing the detection range, and reducing complexity.

Status: Inactive | Publication Date: 2007-10-24
SOUTH CHINA UNIV OF TECH
Cites: 0 | Cited by: 93

AI Technical Summary

Problems solved by technology

After surveying these research methods, they fall mainly into the following categories: (1) Template matching, which matches an eye template against the image and takes the point of greatest similarity as the eye position. This method is computationally expensive, making real-time eye tracking difficult; the weighting factors in the template are hard to set reasonably, it is difficult to cover faces under all conditions at once, and the accuracy is low.
(2) Edge extraction, which locates the eye by detecting the circular feature of the pupil or the elliptical feature formed by the eyelid. This method places high demands on the pixel accuracy of the eye region, and when the eyes are closed no edge can be extracted, so detection fails.
(3) Gray-level distribution, which exploits the characteristics of the eye region: it is darker than its surroundings (low gray values) and its gray level changes rapidly. Horizontal and vertical gray-level projections of the face are used to segment the eyes. This method locates quickly, but the peak-and-valley distribution is very sensitive to different faces and posture changes, so the positioning accuracy is poor (a sketch of the projection idea follows this list).
(4) Blink detection, which differences consecutive image frames and captures blinks from the change in eye shape in order to locate the eyes, but the relatio...
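
As an illustration of the gray-projection idea in category (3), the following is a minimal sketch in Python/NumPy. It assumes a cropped grayscale face image; the function name, the upper-half search window, and the band width are illustrative choices, not details taken from this patent.

```python
import numpy as np

def gray_projection_eye_band(face_gray: np.ndarray) -> tuple[int, int]:
    """Coarsely locate the eye band via horizontal gray-level projection.

    `face_gray` is assumed to be a cropped grayscale face (uint8, rows x cols).
    The eyes are darker than the surrounding skin, so the rows crossing the
    eyes form a valley in the per-row mean-intensity profile.
    """
    # Horizontal projection: mean gray value of every row.
    row_profile = face_gray.mean(axis=1)

    # Restrict the search to the upper half of the face, where the eyes lie
    # (an assumption about the crop, not a rule from the patent).
    upper = row_profile[: face_gray.shape[0] // 2]

    # The deepest valley approximates the vertical center of the eyes.
    eye_row = int(np.argmin(upper))

    # Return a band of rows around that valley as the coarse eye region.
    half_band = max(face_gray.shape[0] // 20, 2)
    return max(eye_row - half_band, 0), eye_row + half_band
```

A vertical projection computed inside the returned band can then separate the left and right eyes in the same way.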




Embodiment Construction

[0035] With reference to the accompanying drawings, the block diagram of the human eye positioning and eye state recognition method proposed by the present invention is shown in Figure 1; the specific implementation steps are as follows:

[0036] Step 1: face image preprocessing;

[0037] Step 2: Coarse positioning of the human eyes;

[0038] Step 3: Precise positioning of the human eye;

[0039] Step 4: Human eye state recognition.

[0040] The specific implementation of Step 1 is as follows:

[0041] The first step is to convert the located face color image to grayscale;

[0042] The camera captures color images with R, G, and B components. For the subsequent image processing, these color images need to be converted into grayscale images, in which each pixel takes one of 256 gray levels from black to white and carries no color information. The conversion formula is v = 0.299R + 0.587G + 0.114B, where v represents the converted gray value...
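
A minimal sketch of this conversion in Python/NumPy is given below; the BGR channel order (as delivered by OpenCV-style capture) and the function name are assumptions, not part of the patent text.

```python
import numpy as np

def to_grayscale(image_bgr: np.ndarray) -> np.ndarray:
    """Convert a color face image (uint8, channels in B, G, R order) to a
    256-level grayscale image using v = 0.299R + 0.587G + 0.114B."""
    b = image_bgr[..., 0].astype(np.float32)
    g = image_bgr[..., 1].astype(np.float32)
    r = image_bgr[..., 2].astype(np.float32)
    v = 0.299 * r + 0.587 * g + 0.114 * b
    # Clamp and return as 8-bit gray values (0 = black, 255 = white).
    return np.clip(v, 0, 255).astype(np.uint8)
```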



Abstract

The invention discloses a human eye positioning and eye state recognition method. First, an adaptive threshold is set to preprocess the face image, and an eye classifier designed from the geometric characteristics of the face is used to coarsely locate the eyes. For some borderline images, the eyes are then precisely located using the symmetry of the face and eyes, for which a method of locating the facial symmetry axis with lip-assisted positioning is provided. Finally, a single eye is segmented from the binary image of the located eye, and the eye state is judged from the black-pixel characteristics of the pupil. The method operates in real time and can locate the eyes and recognize the eye state under different backgrounds, illumination, rotation and tilt angles, facial details, and the like.
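
To make the final state-judgment step concrete, here is a minimal sketch of classifying a binarized eye patch by its proportion of black (pupil) pixels; the threshold value and the function name are illustrative assumptions, not parameters given by the patent.

```python
import numpy as np

def eye_state(eye_binary: np.ndarray, closed_threshold: float = 0.05) -> str:
    """Judge open/closed from a binarized eye patch (0 = black, 255 = white).

    An open eye exposes the dark pupil, so a noticeable fraction of the patch
    is black; a closed eye leaves almost no black pixels.
    """
    black_ratio = float(np.mean(eye_binary == 0))
    return "open" if black_ratio >= closed_threshold else "closed"
```

For example, eye_state(patch) returns "closed" whenever fewer than 5% of the pixels in the patch are black under this illustrative threshold.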

Description

Technical Field

[0001] The invention belongs to the application field of image processing and pattern recognition technology, and in particular relates to a human eye positioning and eye state recognition method.

Background Art

[0002] Face detection and the localization of facial organs is one of the most challenging research topics in computer vision. As the most prominent feature of the human face, the eyes provide more reliable and important information than the mouth or nose, so they are often an essential processing object in face recognition. Human eye recognition plays an important role in face image applications such as face recognition, facial expression analysis, pose determination, visual tracking, human-computer interaction, and fatigue detection. Once the eyes are precisely positioned, other features such as the eyebrows, nose, and mouth can be located more accurately from their potential distribution relationship; eye positioning can al...


Application Information

IPC(8): G06K9/00; G06K9/62
Inventor: 秦华标; 高永萍
Owner: SOUTH CHINA UNIV OF TECH