Human eye positioning method based on edge information integral projection
A technology combining integral projection with human eye positioning, applicable to instruments, character and pattern recognition, computer components, and related fields. It addresses problems such as eye-positioning errors, heavy reliance on prior knowledge, and strict requirements on face pose, achieving good adaptability and improved accuracy.
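The integral projection named in the title can be illustrated with a minimal sketch: sum pixel (or edge-magnitude) values along each row and each column of an image region, producing 1-D profiles whose peaks and valleys indicate feature positions such as the eyes. The function below is an illustrative assumption of how such projections are computed, not the patent's exact procedure; it takes the image as a list of equal-length rows.

```python
def integral_projections(image):
    """Compute integral projections of a 2-D grayscale or edge image.

    image: list of rows, each a list of numeric pixel values.
    Returns (horizontal, vertical):
      horizontal[y] = sum of row y   (profile along the vertical axis)
      vertical[x]   = sum of column x (profile along the horizontal axis)
    """
    horizontal = [sum(row) for row in image]
    vertical = [sum(col) for col in zip(*image)]
    return horizontal, vertical


# Tiny 2x3 example: rows sum to 6 and 15, columns to 5, 7, 9.
h, v = integral_projections([[1, 2, 3],
                             [4, 5, 6]])
```

In eye localization, minima of the horizontal projection of a face region (dark eye rows) or maxima of an edge-image projection are commonly used to estimate the eye band before refining the positions.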
Embodiment Construction
[0014] The present invention will be described in further detail below in conjunction with the embodiments and accompanying drawings.
[0015] Before human eye positioning, the face region must be determined in the given image; eye positioning is then carried out within that region. Since skin color is a prominent feature of the face, extracting the face region from skin-color features and detecting the face in combination with its geometric features can reduce the impact of changes in face pose and expression, and the detection results have relatively good stability.
[0016] The RGB color space is a commonly used color space, but its chroma and brightness information are not easy to separate, so factors such as illumination have a large impact on skin-color extraction results, which is not conducive to accurate judgment and extraction of skin color. In the YCrCb color space, the brightness an...
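The RGB-to-YCrCb conversion described above can be sketched as follows. The coefficients are the standard ITU-R BT.601 values; the Cr/Cb skin ranges are commonly cited illustrative thresholds, not values taken from this patent, so treat both the ranges and the helper names as assumptions.

```python
def rgb_to_ycrcb(r, g, b):
    """Convert one RGB pixel (components in 0-255) to (Y, Cr, Cb)
    using the ITU-R BT.601 coefficients, as in the color-space
    transform discussed above."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance
    cr = (r - y) * 0.713 + 128.0            # red-difference chroma
    cb = (b - y) * 0.564 + 128.0            # blue-difference chroma
    return y, cr, cb


def is_skin(r, g, b, cr_range=(133, 173), cb_range=(77, 127)):
    """Classify a pixel as skin by thresholding Cr and Cb only,
    so the decision is largely independent of brightness Y.
    The default ranges are illustrative, widely used defaults."""
    _, cr, cb = rgb_to_ycrcb(r, g, b)
    return (cr_range[0] <= cr <= cr_range[1]
            and cb_range[0] <= cb <= cb_range[1])
```

Because the skin test ignores Y entirely, illumination changes that mostly affect brightness leave the Cr/Cb decision largely intact, which is the motivation for the color-space change described in paragraph [0016].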