
Head Pose Estimation Method Based on Multi-feature Point Set Active Shape Model

A technology combining an active shape model with head pose estimation, applied in computing, computer components, and character and pattern recognition. It addresses the problem that feature point positioning is often unstable, which degrades the accuracy of head pose estimation, with the effect of improving accuracy and overcoming inaccurate positioning.

Active Publication Date: 2017-02-08
苏州猫头鹰智能科技有限公司
Cites: 3 · Cited by: 0

AI Technical Summary

Problems solved by technology

[0004] However, in these processes, the ever-changing head pose, camera distortion, lighting and other environmental factors, as well as the subject's appearance and expression and occlusion by hats or glasses, often make feature point positioning insufficiently stable, which affects the accuracy of the final head pose estimate.
For this reason, head pose estimation has long been a very challenging subject.

Method used



Embodiment Construction

[0032] The present invention is further described below in conjunction with the accompanying drawings.

[0033] ASM is based on the Point Distribution Model (PDM). By training on image samples, it obtains statistics on the distribution of the feature points and the directions of variation the feature points are allowed to take, so that the corresponding feature point locations can be found on a target image. The training samples require all feature points to be marked by hand; their coordinates are recorded, and a local grayscale model is computed for each feature point as the feature vector used for local feature point adjustment. The trained model is then placed on the target image to search for the next position of each feature point. The local grayscale model is used to find the feature point with the smallest Mahalanobis distance to the local grayscale model in the direction specified by the current feature p...
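The local search step described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: `best_shift`, `mahalanobis`, and the candidate-profile representation are all hypothetical names and simplifications. It assumes candidate grayscale profiles have already been sampled along the normal direction at a feature point, and picks the candidate whose profile is closest, in Mahalanobis distance, to the trained mean profile.

```python
import numpy as np

def mahalanobis(sample, mean, cov_inv):
    """Squared Mahalanobis distance of a profile sample to the trained mean."""
    d = sample - mean
    return float(d @ cov_inv @ d)

def best_shift(profile_samples, mean_profile, cov_inv):
    """Return the index of the candidate position whose local grayscale
    profile has the smallest Mahalanobis distance to the trained model.
    profile_samples: list of 1-D arrays sampled along the search direction."""
    dists = [mahalanobis(s, mean_profile, cov_inv) for s in profile_samples]
    return int(np.argmin(dists))

# Toy example: candidate 1 matches the trained mean profile exactly.
mean = np.array([1.0, 2.0, 3.0])
candidates = [mean + 1.0, mean.copy(), mean + 0.5]
idx = best_shift(candidates, mean, np.eye(3))
```

In a full ASM fit, this per-point search alternates with projecting the updated shape back onto the point distribution model, so the allowed shape variation constrains where each feature point may move.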


PUM

No PUM

Abstract

The invention relates to a head pose estimation method based on a multi-feature-point-set active shape model (ASM). First, face samples are trained to obtain the global ASM and a local texture model. Second, using the trained models, face feature point fitting is performed on a captured face image sequence; the feature point coordinates are stored, and the reference coordinates are updated periodically. Then the displacement of every feature point is computed, and the number of feature points whose displacement exceeds a threshold is counted. Finally, the head pose is estimated from the counted number of feature points and the displacement direction. The method reduces the influence of a small number of inaccurately positioned feature points on head pose estimation and is robust to illumination. It can estimate a variety of head poses, such as frontal, turning left, turning right, raising the head, and lowering the head, and has broad application prospects in intelligent video surveillance, virtual reality, pattern recognition, human-computer interaction, and other fields.
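The counting-and-direction scheme in the abstract can be illustrated with a short sketch. This is an assumed reading of the abstract, not the patented algorithm: the function name `estimate_pose`, the thresholds, and the direction labels are all hypothetical, and the displacement test here is a plain Euclidean norm against periodically updated reference coordinates.

```python
import numpy as np

def estimate_pose(ref_pts, cur_pts, disp_thresh=8.0, count_frac=0.5):
    """Hypothetical sketch of the abstract's voting scheme.
    ref_pts, cur_pts: (N, 2) arrays of reference and current feature points.
    Counts feature points whose displacement exceeds disp_thresh; if enough
    points moved, classifies the pose from the mean displacement direction."""
    disp = cur_pts - ref_pts                       # per-point displacement
    mags = np.linalg.norm(disp, axis=1)
    moved = mags > disp_thresh                     # points past the threshold
    if moved.sum() < count_frac * len(ref_pts):
        return "frontal"                           # too few points moved
    dx, dy = disp[moved].mean(axis=0)              # dominant direction
    if abs(dx) >= abs(dy):
        return "turn-left" if dx < 0 else "turn-right"
    return "head-up" if dy < 0 else "head-down"    # image y grows downward

# Toy example: every feature point shifts 20 px to the right.
ref = np.zeros((10, 2))
pose = estimate_pose(ref, ref + np.array([20.0, 0.0]))
```

Because a majority vote over many feature points decides the pose, a few badly localized points cannot flip the result by themselves, which matches the robustness claim in the abstract.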

Description

Technical field

[0001] The invention belongs to the fields of computer pattern recognition, computer vision, and human-computer interaction. It relates to a method for estimating the head pose in a face image, in particular to a head pose estimation method based on a multi-feature point set active shape model.

Background technique

[0002] Head pose carries rich information about human emotion and can well express a person's true inner thoughts. Head pose estimation therefore has great research and application value in computer vision: it is an important link in research on intelligent video surveillance, virtual reality, pattern recognition, and human-computer interaction, and the result of the pose estimation directly affects the stability of the final system.

[0003] At present, it is still very difficult to give computers the same recognition ability as humans. In machine vision, head pose estimation needs to be able to detect a...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/62
Inventors: 佘青山, 杨伟健, 陈希豪
Owner 苏州猫头鹰智能科技有限公司