
Low-quality face comparison method based on deep convolution neural network

Deep convolutional neural network technology, applied in the field of face comparison. It addresses the problems that low-quality face data limits the application range and reduces the flexibility of face comparison technology, achieving both accurate and efficient face comparison with low computing-resource requirements.

Inactive Publication Date: 2018-09-25
上海敏识网络科技有限公司

AI Technical Summary

Problems solved by technology

[0004] In applications of face comparison based on convolutional neural networks, lower-quality face data typically requires a more complex model and more computing resources than higher-quality face data. This greatly limits the application range of face comparison technology and also reduces its flexibility.



Examples


Embodiment 1

[0027] This embodiment proposes a high-efficiency comparison method for low-quality faces based on a deep convolutional neural network; see Figure 1. The specific process is as follows:

[0028] S101: Establish a face training database, establish a low-quality face verification database, and establish a low-quality face test database;

[0029] S102: Normalize each image in the face training database, and input the processed data tensor into the constructed deep convolutional neural network plus fully connected layer to extract features;
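The normalization in S102 can be sketched as follows. The patent only states that each image is "normalized" before being fed to the network; the `(x - 127.5) / 128` scaling and the 112x112 input size below are common conventions in face-recognition pipelines and are assumptions, not details from the patent.

```python
import numpy as np

def normalize_face(img: np.ndarray) -> np.ndarray:
    """Scale an HxWx3 uint8 face crop to zero-centred floats.

    The (x - 127.5) / 128 scheme is an assumed convention; the patent
    does not specify its exact preprocessing constants.
    """
    return (img.astype(np.float32) - 127.5) / 128.0

# Example: a batch tensor ready for the convolutional network
# (112x112 input size is an assumption).
batch = np.stack([normalize_face(np.zeros((112, 112, 3), dtype=np.uint8))])
print(batch.shape)  # (1, 112, 112, 3)
```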

[0030] S103: Using the spherical loss function, apply the gradient descent method to find the weight values of the fully connected layer and of each filter in the deep convolutional neural network. During training, simultaneously record the statistical results on the low-quality face verification database;
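A minimal sketch of the "two-norm normalized spherical loss" in S103: both the feature vectors and the classifier weights are L2-normalized, so the logits are cosines of the angles between features and class weights, and a softmax cross-entropy is taken over scaled cosines. The scale factor and the omission of any angular margin are assumptions; the patent does not give these details.

```python
import numpy as np

def sphere_logits(features: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Cosine logits: rows of `features` and columns of `weights`
    are L2-normalized, so each logit is cos(theta) in [-1, 1]."""
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=0, keepdims=True)
    return f @ w

def sphere_loss(features, weights, labels, scale=30.0):
    """Softmax cross-entropy over scaled cosine logits.

    `scale=30.0` is an illustrative value, not from the patent."""
    logits = scale * sphere_logits(features, weights)
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    return -np.log(probs[np.arange(len(labels)), labels]).mean()
```

In training, the gradient of this loss with respect to `weights` and the upstream network parameters would drive the gradient descent described in S103.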

[0031] S104: After the training is completed, select the deep convolutional neural network mo...

Embodiment 2

[0076] This embodiment makes some adjustments and extensions to the scheme in Embodiment 1; see Figure 3:

[0077] Step 201: Using the network structure of Embodiment 1, set the number of filters in the input convolutional layer and the residual-unit convolutional layers of the second stage of the network to 128; set the number of filters in the input convolutional layer and the residual-unit convolutional layers of the third stage to 256; and set the number of filters in the input convolutional layer and the residual-unit convolutional layers of the fourth stage to 512;

[0078] Step 202: Add a convolution adaptation layer after the ResNet network structure. The parameters of the convolution adaptation layer are: filter kernel size 3x3, number of filters 256, and convolution stride 1, abbreviated as k(3,3),256,s1;
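Steps 201 and 202 can be summarized as configuration data plus a shape check for the k(3,3),256,s1 adaptation layer. The patent gives only kernel size, filter count, and stride; the "same" padding of 1 for a 3x3 kernel is an assumption.

```python
def conv_out_hw(h: int, w: int, k: int = 3, stride: int = 1, pad: int = 1):
    """Spatial output size of a convolution layer.

    pad=1 for a 3x3 kernel (i.e. 'same' padding) is an assumption;
    the patent specifies only k(3,3),256,s1."""
    return ((h + 2 * pad - k) // stride + 1,
            (w + 2 * pad - k) // stride + 1)

# Filter counts per network stage as set in Step 201.
stage_filters = {"stage2": 128, "stage3": 256, "stage4": 512}

# The Step 202 adaptation layer: 3x3 kernel, 256 filters, stride 1.
adapt_layer = {"kernel": (3, 3), "filters": 256, "stride": 1}

# With stride 1 and padding 1, spatial resolution is preserved.
print(conv_out_hw(7, 7, k=3, stride=1, pad=1))  # (7, 7)
```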

[0079] Step 203: Connect the net...



Abstract

The invention discloses a low-quality face comparison method based on a deep convolutional neural network. The method comprises the following steps: each image in a face training database is sent to a constructed deep convolutional neural network for feature extraction; the extracted features are input to a fully connected layer and projected by an affine projection matrix into a low-dimensional space; the feature vectors computed by the projection matrix are trained with a two-norm normalized spherical loss function; through a gradient descent method, the weight values of the fully connected layer and of each filter in the deep convolutional neural network are found, and the deep convolutional neural network with the highest comparison pass rate is selected; the cosine distance is then computed between the feature vector of the face image under test and the feature vector of each ground-truth image in a low-quality face test database, and the two are judged to be the same person if the cosine distance is smaller than a threshold. The method achieves high-efficiency comparison of low-quality faces, uses fewer computing resources, and balances face comparison accuracy with comparison speed.
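The final comparison step of the abstract can be sketched as a cosine-distance test between feature vectors. The threshold value below is illustrative only; the patent does not state a numeric threshold.

```python
import numpy as np

def cosine_distance(a: np.ndarray, b: np.ndarray) -> float:
    """1 - cosine similarity between two feature vectors."""
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    return 1.0 - float(a @ b)

def same_person(query: np.ndarray, gallery: np.ndarray,
                threshold: float = 0.4) -> bool:
    """Declare a match when the cosine distance falls below the
    threshold; the 0.4 value is an assumed example, not from the patent."""
    return cosine_distance(query, gallery) < threshold
```

Identical feature vectors give a distance of 0 (a match); orthogonal vectors give a distance of 1 (no match at any reasonable threshold).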

Description

Technical Field

[0001] The invention relates to a method for comparing human faces, in particular to a method for comparing low-quality human faces based on a deep convolutional neural network.

Background Technique

[0002] Nowadays, personal identity authentication of target users is ubiquitous, and the identification technologies used for authentication have huge market demand in many fields. Among them, face recognition has the advantages of naturalness, friendliness, low user interference, and convenience, and it has very broad application prospects, for example in video surveillance, access control systems, security checks, and user identity confirmation.

[0003] Face comparison builds on digital image processing, computer vision, and machine learning, using computer processing techniques to analyze and compare target face images. Nowadays, face comparison technology is basically divided into comparison based on handcrafted or shallow features and fac...

Claims


Application Information

IPC(8): G06K9/00, G06K9/46, G06K9/62, G06N3/04
CPC: G06V40/16, G06V40/167, G06V10/443, G06N3/045, G06F18/214
Inventor: 王毅翔, 方志刚, 戚丹青, 赵丽娟
Owner: 上海敏识网络科技有限公司