
Face quality evaluation method and device

A face quality evaluation and target face technology, applied in the image field, which addresses problems such as the high difficulty of sample construction, the high difficulty of training, and wrong recognition results, with the effects of improving analysis efficiency and monitoring analysis accuracy, saving display space and storage space, and improving operability.

Pending Publication Date: 2020-02-25
HUAWEI TECH CO LTD
Cites: 5 · Cited by: 7

AI Technical Summary

Problems solved by technology

[0004] 2) For face recognition, on the one hand, low-quality face pictures introduce noise and lead to wrong recognition results; on the other hand, face recognition models are usually computationally expensive and cannot perform feature extraction on every frame containing a face.
This is equivalent to requiring both breadth and depth of data at the same time; such a data set is difficult to obtain, and sample construction is extremely difficult.
Moreover, training with Triplet Loss requires considerable skill, and without it the model is difficult to converge.
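For context, the triplet loss referred to above is commonly written as follows; this is the standard textbook formulation, not a formula taken from this application:

\[
\mathcal{L}(a, p, n) = \max\bigl(0,\ \|f(a) - f(p)\|_2^2 - \|f(a) - f(n)\|_2^2 + \alpha\bigr)
\]

where a, p, and n are the anchor, positive, and negative samples, f is the embedding network, and α is the margin. Convergence depends heavily on how hard triplets are mined, which is why such training is described as difficult.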

Method used



Examples


Example 1

[0105] As shown in Figure 7 as an example, the first neural network includes Net1, Net2, and Net3. Net1 includes, but is not limited to, Conv (convolution), BN (batch normalization), Pooling, and ReLU (rectified linear unit) layers; Net2 includes, but is not limited to, Deconv (deconvolution), Conv, ReLU, and Upsampling layers; Net3 includes, but is not limited to, FC (fully connected), Pooling, ReLU, and Regression layers.
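To make the three sub-networks concrete, here is a minimal PyTorch sketch under typical layer choices; all channel counts, kernel sizes, input resolution, and the number of keypoints M are illustrative assumptions, not values taken from the application.

```python
# Minimal sketch of the Net1/Net2/Net3 structure described above (Figure 7).
# Channel counts, kernel sizes, and M are illustrative assumptions.
import torch
import torch.nn as nn

M = 5  # e.g. eyes*2 + nose tip*1 + mouth corners*2 (see S21 below)

class Net1(nn.Module):
    """Backbone: Conv + BN + Pooling + ReLU."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.BatchNorm2d(32), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.MaxPool2d(2),
        )
    def forward(self, x):
        return self.body(x)

class Net2(nn.Module):
    """Keypoint head: Deconv + Conv + ReLU + Upsampling -> M heatmap channels."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            nn.Conv2d(32, M, 1),  # one channel per keypoint -> M*W*H feature map
        )
    def forward(self, features):
        return self.body(features)

class Net3(nn.Module):
    """Pose head: Pooling + FC -> regression of yaw, pitch, roll."""
    def __init__(self):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Linear(64, 3)
    def forward(self, features):
        return self.fc(self.pool(features).flatten(1))

face = torch.randn(1, 3, 112, 112)   # one cropped face image
features = Net1()(face)
heatmaps = Net2()(features)          # keypoint visibility responses (S21)
angles = Net3()(features)            # face Euler angles
```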

[0106] S21: For any input face image, a feature map of size M*W*H is output after processing by Net1 and Net2. Take the M*W*H feature map output by Net2 and judge the visibility of each key part / key point of the face according to the response of the feature map. Here M represents the number of key points the user cares about. For example, if the configuration is eyes*2 + nose tip*1 + mouth corners*2, there are five points in total, so M is 5. Taking Figure 7 as an example, the three key points of...
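A hypothetical sketch of the visibility judgment in S21, assuming the peak response of each keypoint channel in the M*W*H feature map is thresholded to decide visibility; both the use of the per-channel maximum and the threshold value are assumptions rather than details from the application.

```python
# Sketch of S21: judge the visibility of each of the M keypoints from its
# channel in the M*W*H feature map output by Net2 (assumed thresholding).
import torch

def keypoint_visibility(heatmaps: torch.Tensor, threshold: float = 0.5):
    """heatmaps: (M, W, H) response map for one face image.

    Returns a visibility score in [0, 1] per keypoint and a visible/occluded flag.
    """
    responses = torch.sigmoid(heatmaps)                 # squash responses to [0, 1]
    scores = responses.flatten(1).max(dim=1).values     # peak response per channel
    return scores, scores > threshold

# Example with M = 5 (eyes*2 + nose tip*1 + mouth corners*2):
P, visible = keypoint_visibility(torch.randn(5, 56, 56))
```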

Example 2

[0127] As shown in Figure 8 as an example, the first neural network includes Net1, Net2, and Net3. Net1 includes, but is not limited to, Conv (convolution), BN (batch normalization), Pooling, and ReLU (rectified linear unit) layers; Net2 includes, but is not limited to, Deconv (deconvolution), Conv, ReLU, and Upsampling layers; Net3 includes, but is not limited to, FC (fully connected), Pooling, ReLU, and Regression layers.

[0128] S31: Obtain the visibility score P_i of each key point in each face image as in S21 above, together with the yaw, pitch, and roll angles of the face.

[0129] S32: The key point visibility scores and the face Euler angles are no longer combined into a total face quality score through the formula in Example 1; instead, the key point visibility scores and the face Euler angles are fused as the input of a trained scoring network (the second neural network), and the output result of the score ...
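A minimal sketch of this fusion step, assuming the M visibility scores P_i and the three Euler angles are simply concatenated and fed to a small fully connected scoring network; the layer widths and the sigmoid output range are assumptions, not details from the application.

```python
# Sketch of Example 2's scoring step: fuse keypoint visibility scores and
# face Euler angles as input to a trained scoring network ("second neural
# network"). Layer widths and output range are assumptions.
import torch
import torch.nn as nn

M = 5  # number of keypoints the user cares about

class ScoreNet(nn.Module):
    def __init__(self, num_keypoints: int = M):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_keypoints + 3, 16),  # P_1..P_M plus yaw, pitch, roll
            nn.ReLU(),
            nn.Linear(16, 1),
            nn.Sigmoid(),                      # face quality score in (0, 1)
        )
    def forward(self, visibility, euler_angles):
        return self.net(torch.cat([visibility, euler_angles], dim=-1))

score_net = ScoreNet()
quality = score_net(torch.rand(1, M), torch.tensor([[10.0, -5.0, 2.0]]))
```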



Abstract

The invention discloses a face quality evaluation method and device. The method comprises the steps of: carrying out cropping and tracking on monitored images in which a target person has been detected, and obtaining N face images of the target person; for each face image, performing face quality evaluation on the N face images according to face key point information and a face Euler angle in the image to obtain a face quality score, wherein the face key point information comprises preset face key points and the visibility degrees of the preset face key points at their respective positions in the image; and selecting the face image with the best face quality evaluation result from the N face images as the target face image.
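As a rough illustration of the selection step in the abstract, the following sketch scores each of the N tracked face crops and keeps the best one; evaluate_quality is a hypothetical stand-in for the keypoint/Euler-angle based evaluation, not an API from the application.

```python
# Sketch of the overall flow in the abstract: score each of the N face crops
# obtained by cropping and tracking, then keep the one with the best score.
# `evaluate_quality` is a hypothetical placeholder for the keypoint / Euler
# angle based evaluation described above.
from typing import Callable, Sequence, TypeVar

Face = TypeVar("Face")

def select_target_face(face_images: Sequence[Face],
                       evaluate_quality: Callable[[Face], float]) -> Face:
    """Return the face image with the highest quality score among N crops."""
    return max(face_images, key=evaluate_quality)
```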

Description

Technical field

[0001] The present application relates to the field of image technology, and in particular to a face quality evaluation method and device.

Background

[0002] Face capture and recognition systems are an extremely important part of video surveillance and security. Face capture and recognition algorithms usually consist of four parts: face detection, face tracking, face quality evaluation, and face recognition. Face quality evaluation is an indispensable part of this pipeline, and its significance lies in:

[0003] 1) In the multi-frame trajectory of the same target, one frame is selected as the face snapshot. Whether the most suitable snapshot can be selected through face quality evaluation directly determines the output quality of the entire face capture system, and thus determines whether investigators can recognize criminal suspects through the face capture system.

[0004] 2) For face recognition, on the one hand, low-quality face pictures introduce noise and lead to ...

Claims


Application Information

IPC(8): G06K 9/00; G06K 9/62
CPC: G06V 40/166; G06V 40/168; G06V 20/52; G06F 18/214
Inventors: 董新帅, 王铭学, 蔡佳, 王提政
Owner: HUAWEI TECH CO LTD