
Face feature extraction method, system and device based on feature re-calibration and medium

A face feature extraction technology in the field of deep learning and image processing. It addresses the problems of heavy computation time, poor robustness, and limited extraction ability, and achieves improved feature extraction ability, high extraction speed, and a good feature extraction effect.

Pending Publication Date: 2021-03-19
FOSHAN UNIVERSITY

AI Technical Summary

Problems solved by technology

However, because such networks are deep and have a large number of parameters, forward propagation over image data requires heavy computation and is seriously time-consuming. During distributed training, communication with the server is demanding and occupies more resources, and the trained model is large, making it difficult to deploy on mobile devices or devices with small memory.
[0007] 2) Lightweight networks have good real-time performance, but their feature extraction effect is not ideal.
A lightweight face feature extraction network uses many load-reduction measures (such as depthwise separable convolution), so it occupies fewer server resources during training, runs faster, and produces a smaller model. However, its feature extraction ability is limited, resulting in poorer robustness, and its extracted features are easily affected by objective factors such as image brightness and blur.
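The parameter savings behind depthwise separable convolution, mentioned above as a typical load-reduction measure, can be checked with a short calculation. This is only an illustrative sketch; the channel counts (64 in, 128 out) and kernel size are example values, not taken from the patent:

```python
def conv_params(c_in, c_out, k):
    # Standard convolution: one k x k x c_in kernel per output channel
    return k * k * c_in * c_out

def dw_separable_params(c_in, c_out, k):
    # Depthwise step: one k x k kernel per input channel,
    # followed by a 1x1 pointwise convolution that mixes channels
    return k * k * c_in + c_in * c_out

std = conv_params(64, 128, 3)          # 73728 parameters
sep = dw_separable_params(64, 128, 3)  # 8768 parameters
print(std, sep)
```

For this example the separable variant uses roughly an eighth of the parameters, which is the source of both the smaller model size and the reduced extraction capacity the text describes.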
[0008] 3) Existing deep networks rarely exploit the relationships between feature channels to improve feature extraction ability.
As the core of a convolutional neural network, the convolution kernel is usually regarded as an information aggregator that combines spatial information and feature-dimension information over a local receptive field. However, training a powerful network using spatial information alone is quite difficult.

Method used



Examples


Embodiment 1

[0057] As shown in Figure 1, this embodiment provides a face feature extraction method based on feature re-calibration, comprising the following steps:

[0058] S101. Construct a face feature extraction network.

[0059] The face feature extraction network is mainly composed of multiple stacked feature extraction blocks. Its specific structure, shown in Figure 2, comprises, connected in sequence: a standard convolutional layer, a first feature extraction block, a second feature extraction block, a third feature extraction block, a fourth feature extraction block, a global average pooling (GAP) layer, and a fully connected (FC) layer. The "extraction blocks" in Figure 2 all refer to the feature extraction blocks of this embodiment, and 227x227, 55x55, 28x28, etc. all refer to image resolutions.
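The resolution progression stated above (227x227 → 55x55 → 28x28) can be reproduced with the standard convolution output-size formula. The specific kernel sizes, strides, and padding below are assumptions chosen only to be consistent with those figures; the patent text does not specify them:

```python
def conv_out(n, k, s, p=0):
    # Spatial output size of a convolution or pooling layer:
    # floor((n + 2p - k) / s) + 1
    return (n + 2 * p - k) // s + 1

# 227x227 input -> 55x55, e.g. an 11x11 stem convolution with stride 4 (assumed)
print(conv_out(227, k=11, s=4))     # 55
# 55x55 -> 28x28, e.g. a 3x3 layer with stride 2 and padding 1 (assumed)
print(conv_out(55, k=3, s=2, p=1))  # 28
```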

[0060] The first feature extraction block consist...

Embodiment 2

[0099] As shown in Figure 6, this embodiment provides a facial feature extraction system based on feature re-calibration. The system includes a facial feature extraction network construction unit 601, an image acquisition unit 602, and a feature extraction unit 603. The specific functions of each unit are as follows:

[0100] The facial feature extraction network construction unit 601 is configured to construct a facial feature extraction network; wherein, the facial feature extraction network includes multiple feature extraction blocks.

[0101] The image acquisition unit 602 is configured to acquire a face image of the user to be identified.

[0102] The feature extraction unit 603 is configured to input the face image of the user to be identified into the face feature extraction network, perform preliminary feature extraction through the feature extraction blocks, and compress, screen, and re-calibrate the extracted preliminary feature information to obtain the face featu...
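The compress / screen / re-calibrate pipeline described for unit 603 closely resembles squeeze-and-excitation style channel gating: pool each channel to a descriptor, pass the descriptors through a small gating network, then rescale the channels. A minimal pure-Python sketch of that interpretation; the weight matrices and feature shapes here are illustrative assumptions, not the patent's actual parameters:

```python
import math

def squeeze(feat):
    # "Compression": global average pooling, one descriptor per channel
    return [sum(sum(row) for row in ch) / (len(ch) * len(ch[0])) for ch in feat]

def excite(desc, w1, w2):
    # "Screening": bottleneck FC + ReLU, then FC + sigmoid,
    # yielding a gate in (0, 1) for every channel
    hidden = [max(0.0, sum(w * d for w, d in zip(row, desc))) for row in w1]
    return [1.0 / (1.0 + math.exp(-sum(w * h for w, h in zip(row, hidden))))
            for row in w2]

def recalibrate(feat, gates):
    # "Re-calibration": rescale each channel by its learned gate
    return [[[v * g for v in row] for row in ch] for ch, g in zip(feat, gates)]

# Two 2x2 feature channels; identity weights for illustration only
feat = [[[1.0, 1.0], [1.0, 1.0]], [[2.0, 2.0], [2.0, 2.0]]]
gates = excite(squeeze(feat), [[1, 0], [0, 1]], [[1, 0], [0, 1]])
out = recalibrate(feat, gates)
```

In a real network the gate weights are learned, so channels carrying informative features are amplified while less useful ones are suppressed, which is the "screening" effect the text attributes to channel relationships.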

Embodiment 3

[0105] This embodiment provides a computer device, which may be a computer. As shown in Figure 7, a processor 702, a memory, an input device 703, a display 704, and a network interface 705 are connected through a system bus 701. The processor provides computing and control capabilities. The memory includes a non-volatile storage medium 706 and an internal memory 707; the non-volatile storage medium 706 stores an operating system, a computer program, and a database, while the internal memory 707 provides an environment for running the operating system and the computer program. When the processor 702 executes the computer program, it implements the face feature extraction method of Embodiment 1 above, as follows:

[0106] Construct a face feature extraction network, wherein the face feature extraction network comprises multiple feature extraction blocks;

[0107] Obtain the face image of the u...



Abstract

The invention discloses a face feature extraction method, system, device, and medium based on feature re-calibration. The method comprises the steps of: constructing a face feature extraction network that comprises a plurality of feature extraction blocks; obtaining a face image of a user to be recognized; and inputting the face image of the user to be recognized into the face feature extraction network, performing preliminary feature extraction through the feature extraction blocks, and performing compression, screening, and re-calibration on the extracted preliminary feature information to obtain face feature information. According to the method, features can be screened by utilizing the relationships between feature channels, achieving the effect of re-calibrating the features, so that a better feature extraction effect is obtained while a high feature extraction speed is maintained.

Description

technical field [0001] The invention relates to a face feature extraction method, system, device, and medium based on feature re-calibration, and belongs to the field of deep learning and image processing. Background technique [0002] Face recognition is a biometric technology that automatically identifies people based on their facial features (such as statistical or geometric features); it is also known as portrait recognition or facial recognition. Usually, face recognition refers to identity recognition and verification based on optical face images. [0003] The face recognition process is mainly divided into three parts: face detection, face feature extraction, and face feature comparison. Among these, face feature extraction is the most critical step, and its capability directly determines the recognition accuracy and anti-interference ability of the entire algorithm. ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06N3/04, G06N3/08
CPC: G06N3/04, G06N3/08, G06V40/168
Inventor 曾凡智邹磊周燕
Owner FOSHAN UNIVERSITY