Multi-dimensional output face quality assessment method and electronic equipment based on deep learning

A quality assessment and deep learning technology, applied in the field of image recognition, which solves the problem of the increased time consumption and computing resources required when multiple models run simultaneously, achieving the effects of short time consumption, few model parameters, high execution efficiency, and improved face quality assessment accuracy.

Active Publication Date: 2022-06-24
FENGHUO COMM SCI & TECH CO LTD

AI Technical Summary

Problems solved by technology

[0004] In view of the above defects or improvement needs of the prior art, the present invention provides a face quality assessment method based on the multi-dimensional output of deep learning, which solves the problem of the increased time consumption and computing resources required when multiple models run at the same time, and improves the accuracy of face quality assessment.



Examples


Embodiment Construction

[0055] In order to make the objectives, technical solutions and advantages of the present invention clearer, the present invention will be further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are only used to explain the present invention, but not to limit the present invention. In addition, the technical features involved in the various embodiments of the present invention described below can be combined with each other as long as they do not conflict with each other.

[0056] In order to solve the problems existing in the prior art, the present invention provides a multi-dimensional output face quality assessment method based on deep learning. First, a multi-dimensional output neural network model is designed. As shown in Figure 1, the model has four output branches, Score, Class, Mask and Pose, which predict different tasks respectively. Input a face image to ...
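This summary does not give the concrete layer configuration of the model, so the following is only a minimal sketch of such a four-branch network, assuming a small shared convolutional backbone with one lightweight head per task; the channel widths, head output sizes and class meanings are hypothetical, not taken from the patent:

```python
# Minimal sketch of a four-branch (Score / Class / Mask / Pose) network.
# Backbone depth, channel widths and per-branch output sizes are assumptions.
import torch
import torch.nn as nn

class MultiOutputFaceQualityNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared lightweight backbone (hypothetical configuration).
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # One small head per task (output sizes assumed).
        self.score_head = nn.Linear(64, 1)   # overall quality score (regression)
        self.class_head = nn.Linear(64, 2)   # e.g. clear face / poor face
        self.mask_head = nn.Linear(64, 2)    # e.g. unmasked / masked
        self.pose_head = nn.Linear(64, 3)    # e.g. yaw / pitch / roll

    def forward(self, x):
        feat = self.backbone(x)
        return {
            "score": self.score_head(feat),
            "class": self.class_head(feat),
            "mask": self.mask_head(feat),
            "pose": self.pose_head(feat),
        }
```

In an arrangement like this the four tasks share a single forward pass through the backbone, which is what keeps the parameter count and inference time low compared with running four separate models.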



Abstract

The invention discloses a multi-dimensional output face quality assessment method based on deep learning. Training data sets are prepared: a Score training set, a Class training set, a Mask training set and a Pose training set. To train the network model, a portion of images is randomly selected from the four training sets and merged into one batch, which is fed into the neural network model; after forward inference through the network, the output values of the four branches are obtained, and the loss value of the corresponding branch is calculated according to which data set each input image comes from. The loss values of the branches are then summed with different weights to obtain the total loss value, which is used for network back-propagation to update the network parameters. To predict a face image under test, the input face image is preprocessed and fed into the trained neural network model for forward inference, the predicted values of the four branches are output, and the four output values are finally summed according to their weights to obtain the final comprehensive face quality assessment score. The invention also provides a corresponding electronic device.
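As a rough illustration of the training and inference scheme described in the abstract, the sketch below counts each branch's loss only for the samples that came from that branch's training set, sums the branch losses with fixed weights into the total loss used for back-propagation, and at inference fuses the four branch outputs with a weighted sum into one comprehensive quality score. The specific weights, loss functions and output normalisations are assumptions, since they are not stated in this summary:

```python
# Sketch of the mixed-batch, weighted multi-task training loss and the
# weighted fusion at inference. Weights and loss choices are assumed.
import torch
import torch.nn.functional as F

BRANCH_WEIGHTS = {"score": 0.4, "class": 0.2, "mask": 0.2, "pose": 0.2}  # assumed

def training_loss(outputs, targets, origins):
    """origins[i] names the data set ('score'/'class'/'mask'/'pose') that
    sample i was drawn from; only that branch's loss is counted for it."""
    total = outputs["score"].new_zeros(())
    for branch, w in BRANCH_WEIGHTS.items():
        sel = torch.tensor([o == branch for o in origins])
        if not sel.any():
            continue
        if branch == "score":
            loss = F.mse_loss(outputs["score"][sel].squeeze(-1), targets["score"][sel])
        elif branch == "pose":
            loss = F.mse_loss(outputs["pose"][sel], targets["pose"][sel])
        else:  # 'class' and 'mask' treated here as classification tasks
            loss = F.cross_entropy(outputs[branch][sel], targets[branch][sel])
        total = total + w * loss  # weighted sum of per-branch losses
    return total

def quality_score(outputs):
    """Weighted fusion of the four branch outputs into one quality score;
    how each branch is normalised to [0, 1] is an assumption."""
    score = outputs["score"].squeeze(-1).sigmoid()
    cls = outputs["class"].softmax(-1)[:, 1]            # prob. of a good face
    mask = outputs["mask"].softmax(-1)[:, 0]            # prob. of no occlusion
    pose = 1.0 - outputs["pose"].abs().mean(-1).tanh()  # small angles -> high
    return (BRANCH_WEIGHTS["score"] * score + BRANCH_WEIGHTS["class"] * cls +
            BRANCH_WEIGHTS["mask"] * mask + BRANCH_WEIGHTS["pose"] * pose)
```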

Description

Technical Field

[0001] The invention belongs to the technical field of image recognition, and more particularly relates to a multi-dimensional output face quality assessment method and electronic device based on deep learning.

Background Technique

[0002] During face capture on edge devices, environmental changes and human motion cause the captured face images to include low-quality images that are blurred, occluded, or taken at varying poses. These low-quality face images greatly reduce the accuracy of face recognition systems. At the same time, the storage space and transmission bandwidth of edge devices are very limited, and a large number of low-quality face pictures is not conducive to the storage and transmission of face pictures. In order to select one or several high-quality face images from a large number of face images, a face quality assessment method is needed.

[0003] The factors that affect the quality of the fa...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06V40/16, G06V10/764, G06V10/82, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/084, G06V40/161, G06N3/047, G06N3/044, G06F18/2415
Inventor: 梁奔香, 杜兵, 罗翚
Owner: FENGHUO COMM SCI & TECH CO LTD