Face feature extraction method based on deep learning

A face feature and deep learning technology, applied to neural learning methods, instruments, and biological neural network models, which addresses the problems of inaccurate localization of face key points, high feature-vector dimensionality, and low recognition efficiency, and achieves an excellent recognition rate, high accuracy, and good robustness.

Inactive Publication Date: 2018-06-01
深圳市恩钛控股有限公司

AI Technical Summary

Problems solved by technology

[0012] In view of the problems existing in the prior art that key points in the face cannot be accurately located, recognition efficiency is low, the dimension of the feature vector is relatively large, and calculation and analysis are complicated, the present invention provides a face feature extraction method based on deep learning.



Examples


Embodiment 1

[0098] In the first embodiment of the present invention, the initial input image size is set to 227*227, the convolution kernel size to K*K, the step size to S, the padding size to P, the output number to NUM, and the weight system is randomly initialized to θi.

[0099] As shown in figure 2, the face feature extraction network in step 3 is constructed as follows:

[0100] The first layer: data input layer, input image data: data length and width are 227*227, image data value range: 0-255;

[0101] The second layer: convolution layer conv1: convolution kernel size: 11*11; step size: 4; output number: 96;

[0102] The third layer: activation layer relu1: use the relu function as the activation function;

[0103] The fourth layer: pooling layer pool1: convolution kernel size: 3*3; step size: 2; pooling type: maximum value; padding: 1;

[0104] The fifth layer: convolution layer: convolution kernel size: 1*1; ste...
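The spatial sizes of the layers listed above can be traced with a short sketch. It assumes the standard output-size formula floor((W − K + 2P)/S) + 1 for convolution and pooling, which the patent text does not state explicitly:

```python
def conv_out(w, k, s, p=0):
    """Spatial output size of a conv/pool layer: floor((w - k + 2p) / s) + 1."""
    return (w - k + 2 * p) // s + 1

# Layer 1: data input, 227*227
w = 227
# Layer 2: conv1, kernel 11*11, step size 4, 96 outputs
w = conv_out(w, k=11, s=4)      # 227 -> 55
# Layer 3: relu1 leaves the spatial size unchanged
# Layer 4: pool1, kernel 3*3, step size 2, padding 1, max pooling
w = conv_out(w, k=3, s=2, p=1)  # 55 -> 28
print(w)
```

The 1*1 convolution of the fifth layer would again leave the spatial size at 28, changing only the channel count; its step size and output number are truncated in the source.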

Embodiment 2

[0147] The network training described in step 5 includes performing convolution operations, pooling, and activation operations on the original data to obtain the final features;

[0148] The convolution refers to performing a convolution operation on an image with a template,

[0149] The pooling operation refers to sampling the image; the pooling operations used are average pooling and maximum pooling. Average pooling takes the average value of an image region as the pooled value of that region; maximum pooling takes the maximum value of the region as its pooled value;

[0150] The activation operation refers to calculating the activation value of each neuron from the convolution result, providing the network with nonlinear modeling capability and thereby increasing its feature description capability, specifically defined as the following formula:

[0151] f(x)=max(0,x)

[0152] Where x is the value of the current neu...
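As an illustration of the three operations described above, the following is a minimal numpy sketch; the 2*2 window size and the sample values are illustrative assumptions, not from the patent:

```python
import numpy as np

def relu(x):
    """Activation from the text: f(x) = max(0, x)."""
    return np.maximum(0, x)

def pool2x2(img, mode="max"):
    """Pool non-overlapping 2*2 regions (window size is an illustrative assumption)."""
    h, w = img.shape
    blocks = img.reshape(h // 2, 2, w // 2, 2)
    if mode == "max":
        return blocks.max(axis=(1, 3))   # maximum pooling
    return blocks.mean(axis=(1, 3))      # average pooling

img = np.array([[1., 2., 5., 6.],
                [3., 4., 7., 8.],
                [-1., 0., 2., 2.],
                [0., 1., 2., 2.]])
print(pool2x2(img, "max"))           # each 2*2 region's maximum
print(pool2x2(img, "avg"))           # each 2*2 region's average
print(relu(np.array([-3., 0.5])))    # negative inputs clamped to 0
```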



Abstract

A face feature extraction method based on deep learning comprises the following steps: 1, initializing an input image with the dimension as W*W, a convolution kernel size as K*K, a step length as S, a filling size as P, an output number as NUM, and the weight in each layer as [theta]i; 2, preparing face image data; 3, building a deep learning face feature extraction network with 46 layers; 4, inputting the prepared face image data into the face feature extraction network formed in step 3, and training a softmax classifier; 5, starting the face feature extraction network, carrying out network training for T iterations, using fine-tuning techniques to improve the precision of the face feature extraction network, and finally obtaining a weight system [theta] as the solved model; using the model to extract features from known samples, and finishing the flow.
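Steps 4 and 5 above (training a softmax classifier on extracted features for T iterations to obtain the weight system θ) can be sketched in miniature. The feature dimension, class count, learning rate, and T below are illustrative assumptions, and random vectors stand in for the network's extracted features:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # subtract row max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Stand-in for step 4: features produced by the extraction network (dims assumed)
X = rng.normal(size=(8, 16))      # 8 samples, 16-dim features
y = rng.integers(0, 3, size=8)    # 3 identity classes
theta = np.zeros((16, 3))         # weight system θ to be solved

T = 200                           # training iterations (illustrative)
for _ in range(T):
    p = softmax(X @ theta)
    p[np.arange(8), y] -= 1       # gradient of cross-entropy w.r.t. the logits
    theta -= 0.1 * (X.T @ p) / 8  # gradient-descent step on θ

print((softmax(X @ theta).argmax(axis=1) == y).mean())  # training accuracy
```

In the patent's pipeline, θ after training is the solved model used to extract features from known samples; here the classifier alone is shown.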

Description

Technical field

[0001] The invention belongs to the technical field of image processing and face recognition, and in particular relates to a face feature extraction method based on deep learning.

Background technique

[0002] Face recognition is a biometric technology for identification based on human facial feature information. Compared with other biometrics, facial features have the advantages of naturalness, convenience, and non-contact operation, giving them great application prospects in security monitoring, identity verification, and human-computer interaction.

[0003] With the wide application of surveillance cameras, the market demand for face recognition systems is also gradually expanding. However, in these applications, most of the monitored people are in an unconstrained state, while current face recognition products and systems impose certain restrictions or requirements on the detected faces. These restrictions have become the main obstacle to the promotion and application of face recognit...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06N3/04; G06N3/08
CPC: G06N3/04; G06N3/08; G06V40/168
Inventor 胡钟山
Owner 深圳市恩钛控股有限公司