
Face recognition method of deep convolutional neural network

A neural network and face recognition technology, applied in the field of face recognition with deep convolutional neural networks. It addresses the problems of a sharply reduced recognition rate on test samples when training data are scarce and of very high time complexity, and it achieves the effects of reduced time complexity and strong classification ability, which is conducive to classification and recognition.

Status: Inactive | Publication Date: 2015-08-26
BEIJING UNIV OF TECH

AI Technical Summary

Problems solved by technology

However, experiments show that when the number of training samples is reduced, the algorithm's recognition rate on the test samples drops sharply; in particular, when fewer than 20,000 samples are used for training, the recognition rate is only 59.21%.
This deep neural network learning model requires a large number of labeled samples for training, and its time complexity is extremely high: it often takes tens of thousands of iterative updates to reach good recognition performance.
In practical applications, however, the cost of labeling samples is very high, and the time-complexity requirements are demanding (for example, real-time recognition is sometimes required).

Method used


Examples


Embodiment Construction

[0026] This face recognition method based on a deep convolutional neural network includes a training phase and a classification phase. The training phase includes the following steps:

[0027] (1) Randomly generate the weights w_j between the input units and the hidden units and the hidden-unit biases b_j, j = 1, ..., L, where j indexes the weights and biases and L is their total number;

[0028] (2) Input the training image Y and its label, and use the forward conduction formula h_{W,b}(x) = f(W^T x), where h_{W,b}(x) is the output value and x is the input, to calculate the output value h_{W,b}(x^{(i)}) of each layer;

[0029] (3) Calculate the deviation of the last layer from the label value and the output value of the last layer obtained in step (2), according to formula (4):

[0030] \delta_i^{(n_l)} = \frac{\partial J_1}{\partial z_i^{(n_l)}} = \frac{\partial}{\partial z_i^{(n_l)}} \, \frac{1}{2} \left\| h_{W,b}(x^{(i)}) - y^{(i)} \right\|^2 \qquad (4)
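To make the notation concrete, here is a minimal NumPy sketch of steps (1)-(3). It is an illustrative assumption rather than the patent's implementation: it uses a single fully connected output layer with a sigmoid activation f, and the layer sizes, random seed, and variable names are chosen only for the example.

```python
import numpy as np

def f(z):
    # assumed activation function f: the logistic sigmoid
    return 1.0 / (1.0 + np.exp(-z))

def f_prime(z):
    # derivative of the sigmoid, needed when differentiating formula (4)
    s = f(z)
    return s * (1.0 - s)

rng = np.random.default_rng(0)
n_in, n_out = 64, 10                      # assumed layer sizes

# Step (1): randomly generate the weights w_j and the hidden-unit biases b_j.
W = rng.normal(scale=0.01, size=(n_in, n_out))
b = rng.normal(scale=0.01, size=(n_out, 1))

# Step (2): forward conduction h_{W,b}(x) = f(W^T x) for one training image x
# (with the bias added explicitly) to obtain the output value of the layer.
x = rng.random((n_in, 1))                 # stand-in for a vectorised training image Y
y = rng.random((n_out, 1))                # stand-in for its label vector
z = W.T @ x + b                           # pre-activation z of the last layer
h = f(z)                                  # output value h_{W,b}(x)

# Step (3): deviation of the last layer, formula (4). Differentiating
# J_1 = 1/2 * ||h_{W,b}(x) - y||^2 with respect to z_i gives
# delta_i = (h_i - y_i) * f'(z_i).
delta_last = (h - y) * f_prime(z)
print(delta_last.ravel())                 # one deviation per output unit
```

Under these assumptions, carrying out the differentiation in formula (4) reduces to the familiar delta rule δ_i^{(n_l)} = (h_i − y_i) · f'(z_i^{(n_l)}), which is what `delta_last` computes element-wise.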


Abstract

The invention discloses a face recognition method based on a deep convolutional neural network, which reduces time complexity and allows the weights in the network to retain a high classification capacity even when the number of training samples is reduced. The method comprises a training stage and a classification stage. The training stage comprises the steps of (1) randomly generating the weights w_j between the input units and the hidden units and the hidden-unit biases b_j, where j = 1, ..., L indexes the weights and biases and L is their total number; (2) inputting a training image Y and its label, and using the forward conduction formula h_{W,b}(x) = f(W^T x), where h_{W,b}(x) is the output value and x is the input, to calculate the output value h_{W,b}(x^{(i)}) of each layer; (3) calculating the deviation of the last layer from the label value and the output value of the last layer; (4) calculating the deviation of each layer from the deviation of the last layer and obtaining the gradient direction; and (5) updating the weights. The classification stage comprises the steps of (a) keeping all parameters in the network unchanged and recording the category vector output by the network for each training sample; (b) calculating the residual δ = ||h_{W,b}(x^{(i)}) − y^{(i)}||^2; and (c) classifying the test image according to the minimum residual.
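The classification stage in the abstract can likewise be sketched in a few lines. The helper below is a hedged illustration under assumed names, not the patented implementation: `forward` stands for the already-trained network with its parameters held fixed, and `class_vectors` for the category vectors recorded in step (a).

```python
import numpy as np

def classify_by_min_residual(forward, x_test, class_vectors):
    """Return the label whose recorded category vector is closest, in
    squared Euclidean norm, to the network output for x_test.

    forward       : callable mapping an input vector to the network output
    x_test        : the test image, flattened to a vector
    class_vectors : dict {label: recorded category vector}
    """
    h = forward(x_test)                              # step (a): parameters unchanged
    residuals = {label: float(np.sum((h - v) ** 2))  # step (b): residual delta
                 for label, v in class_vectors.items()}
    return min(residuals, key=residuals.get)         # step (c): minimum residual wins
```

Because the network parameters are frozen at this point, classification reduces to a nearest-vector search in the output space, which is why the minimum residual decides the label.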

Description

Technical field

[0001] The invention belongs to the technical field of image processing and pattern recognition, and specifically relates to a face recognition method based on a deep convolutional neural network.

Background technique

[0002] Feature extraction has always been one of the difficulties in the field of pattern recognition. Traditional feature-based recognition methods all define a feature in advance and then perform classification and recognition based on that predefined feature. As a development of traditional machine learning, deep learning has been widely used in many fields because it can automatically learn more suitable representation features layer by layer. However, a general deep learning algorithm loses the structural information of the original image when performing image recognition, which affects the recognition effect. As one of the methods of deep learning, the convolutional neural network, on the premise of inheriting deep learning's ability to automatically learn and ext...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K 9/00; G06K 9/66
Inventors: 孙艳丰, 齐光磊, 胡永利 (Sun Yanfeng, Qi Guanglei, Hu Yongli)
Owner: BEIJING UNIV OF TECH