Action living body recognition method based on attitude estimation and action detection

A pose estimation and action detection technology, applied in the field of biometrics, which addresses the problems of actions not conforming to the specification, action recognition failing easily, and thresholds that are hard to adapt to eyes of different sizes.

Publication status: Inactive | Publication date: 2020-10-30
Applicant: 成都新希望金融信息有限公司

Problems solved by technology

[0013] (1) No occlusion detection is performed. For example, if the user wears a mask or sunglasses, or covers the mouth with a hand, the recognition accuracy for mouth opening and blinking degrades. The liveness check still runs to completion and only then tells the user that it has failed; the problem cannot be discovered in advance, so the user cannot be prompted in time.
[0014] (2) There is no real-time check that the user's face is aligned with the circle on the screen. As a result, the user's face may leave the camera view while the action is being performed, and the action recognition finally fails because no face is detected, or because the action does not meet the specification.
[0015] (3) The recognition of blinking and opening the mouth...

Method used



Examples


Embodiment

[0056] Please refer to Figure 1 to Figure 7. This embodiment discloses a method for recognizing a living body based on pose estimation and action detection, comprising real-time alignment-with-screen recognition, random action recognition, face recapture recognition and face comparison.
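As a minimal sketch (not the patent's implementation), the four stages named in [0056] can be chained as a short-circuiting pipeline so that the user can be prompted as soon as any stage fails. All stage functions below are hypothetical placeholders standing in for the recognizers described later in this embodiment.

```python
# Hypothetical sketch of the four-stage liveness pipeline in [0056].
# Every stage function is a placeholder, not the patent's actual model.
from dataclasses import dataclass
from typing import Any, Sequence


@dataclass
class LivenessResult:
    passed: bool
    reason: str


def align_screen_recognition(frame: Any) -> bool:
    """Stage 1: real-time alignment-with-screen checks (light, circle, pose, occlusion)."""
    return True  # placeholder


def random_action_recognition(frames: Sequence[Any], actions: Sequence[str]) -> bool:
    """Stage 2: verify the randomly issued action sequence was performed in order."""
    return True  # placeholder


def recapture_recognition(frame: Any) -> bool:
    """Stage 3: detect re-shot photos or videos of a screen (anti-recapture)."""
    return True  # placeholder


def face_comparison(frame: Any, reference: Any) -> bool:
    """Stage 4: compare the live face with a reference identity photo."""
    return True  # placeholder


def liveness_pipeline(frames: Sequence[Any], actions: Sequence[str], reference: Any) -> LivenessResult:
    """Short-circuit on the first failed stage so the user can be prompted early."""
    if not align_screen_recognition(frames[0]):
        return LivenessResult(False, "alignment / occlusion check failed")
    if not random_action_recognition(frames, actions):
        return LivenessResult(False, "random action check failed")
    if not recapture_recognition(frames[-1]):
        return LivenessResult(False, "recapture (replay) detected")
    if not face_comparison(frames[-1], reference):
        return LivenessResult(False, "face comparison failed")
    return LivenessResult(True, "liveness confirmed")
```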

[0057] 1. Real-time alignment-with-screen recognition

[0058] The recognition process includes: capturing a face photo with the camera, then performing light recognition, face-in-circle recognition, face-facing-screen recognition and face occlusion recognition on the photo. When the light is normal, the face is inside the circle, the face is facing the screen and the face is not occluded, the real-time alignment-with-screen recognition passes; otherwise it does not pass.
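A minimal sketch of the pass/fail gating described in [0058]: the real-time stage passes only when all four sub-checks pass, and each failure maps to an early prompt for the user. The prompt wording is illustrative, not from the patent.

```python
from typing import Tuple


def alignment_gate(light_ok: bool, face_in_circle: bool,
                   facing_screen: bool, unoccluded: bool) -> Tuple[bool, str]:
    """Pass only when every sub-check passes; otherwise return a user prompt."""
    if not light_ok:
        return False, "Please move to a place with normal lighting."
    if not face_in_circle:
        return False, "Please keep your face inside the circle."
    if not facing_screen:
        return False, "Please face the screen directly."
    if not unoccluded:
        return False, "Please remove anything covering your face."
    return True, "Real-time alignment-with-screen recognition passed."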

[0059] 1) Light recognition

[0060] In this process, the light is divided into strong light, normal light and weak light, in order of decreasing intensity. If the light in the face photo is recognized as normal ...
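The excerpt does not say how the three light levels are measured, so the sketch below assumes a simple mean-brightness heuristic over a grayscale face crop; the thresholds (180 and 60 on a 0-255 scale) are illustrative assumptions, not values from the patent.

```python
import numpy as np


def classify_light(face_gray: np.ndarray,
                   strong_thresh: float = 180.0,  # assumed, not from the patent
                   weak_thresh: float = 60.0      # assumed, not from the patent
                   ) -> str:
    """Classify lighting as 'strong', 'normal' or 'weak' by mean pixel intensity."""
    mean_intensity = float(face_gray.mean())
    if mean_intensity >= strong_thresh:
        return "strong"
    if mean_intensity <= weak_thresh:
        return "weak"
    return "normal"
```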



Abstract

The invention discloses an action living body recognition method based on pose estimation and action detection, and relates to the technical field of biometric recognition. The method recognizes a real person versus an attack by having the user cooperatively perform actions; because the action instruction set is random, whether the user is a living body can be judged more accurately. Before the liveness actions are performed, in order to improve the one-shot pass rate for a real person, light recognition, face-in-circle recognition, face-facing-screen recognition and face occlusion recognition are carried out on the user. A pose estimation PSECNN model estimates the user's depression (pitch) angle, deflection (yaw) angle and rotation angle, and head shaking, nodding and head raising are recognized, so the user's head actions are identified accurately. When blinking and mouth-opening actions are recognized, an MCNN model and an ECNN model are adopted in place of the mainstream feature-point-based detection, which guarantees the blinking and mouth-opening recognition rate in terms of precision. During liveness detection, the action instructions are random, both the actions and the action sequence must be correct, and anti-recapture recognition is performed, so the security is higher.
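As a hedged illustration of how the estimated depression (pitch), deflection (yaw) and rotation (roll) angles could drive nodding, head-raising and head-shaking recognition, the rule below thresholds a short angle sequence from one action window. The 15-degree threshold, the sign convention (downward pitch negative) and the decision rule are assumptions for illustration, not the PSECNN model's actual logic.

```python
from typing import Sequence


def classify_head_action(pitch_deg: Sequence[float],
                         yaw_deg: Sequence[float],
                         angle_thresh: float = 15.0) -> str:
    """Label an angle sequence from one action window as 'shake', 'nod', 'raise' or 'none'."""
    yaw_swing = max(yaw_deg) - min(yaw_deg)
    if yaw_swing >= 2 * angle_thresh:      # large left-right swing -> head shake
        return "shake"
    if min(pitch_deg) <= -angle_thresh:    # head dipped downward -> nod
        return "nod"
    if max(pitch_deg) >= angle_thresh:     # head tilted upward -> head raise
        return "raise"
    return "none"
```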

Description

Technical field

[0001] The present invention relates to the technical field of biometric identification, and in particular to a method for recognizing a living body based on pose estimation and action detection.

Background technique

[0002] Liveness detection studies how to distinguish whether the face captured by the current camera comes from a real living body (that is, a person) or from an attack (including face photos, videos, masks, etc.). Face camouflage usually takes multiple forms. In the current business environment, where face recognition is one of the important means of identity verification, this series of frauds poses a potential threat to users of face recognition identity systems. Therefore, both the research and the business fields have taken many measures against face camouflage attacks. Liveness detection is fundamentally a binary classification problem: finding the differences between live and non-live images, which mainly include the following...


Application Information

IPC(8): G06K9/00
CPC: G06V40/161; G06V40/20; G06V40/172; G06V40/45
Inventors: 吕文勇, 王小东, 赵小诣, 程序
Owner: 成都新希望金融信息有限公司