
Face identification method based on sparse hybrid dictionary learning

A dictionary-learning technology, applied in character and pattern recognition, instruments, and computer parts, that addresses the problem that class-specific dictionaries do not take into account the commonality shared by different categories.

Pending Publication Date: 2020-02-25
JIANGNAN UNIV

AI Technical Summary

Problems solved by technology

[0006] Although the class-specific sub-dictionary can effectively extract the specificity of, and differences between, categories, it does not take into account the commonality shared by different categories.

Method used



Examples


Embodiment 1

[0082] Referring to figure 1, the first embodiment of the present invention provides a schematic flow chart of a face identification method based on sparse hybrid dictionary learning. As shown in figure 1, the method includes: obtaining face images, downsampling them, and reducing their dimensionality to form a training sample set; constructing a class-specific dictionary model with the Fisher discriminant criterion and a Laplacian matrix as constraints, and using this model to learn a sub-dictionary for each class of samples separately, thereby extracting the particularity between the categories of the training sample set, reducing intra-class coding dispersion while retaining the similarity of the sparse-coded data, and increasing inter-class coding dispersion; and constructing an intra-class difference dictionary model, learning a single dictionary over all samples, thereby extracting the category commonality ...
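The full optimization with Fisher-discriminant and Laplacian-matrix constraints is not reproduced on this page. The sketch below shows only the generic inner loop that such a method builds on: alternating l1 sparse coding (here via ISTA) with a least-squares dictionary update, for a single class's sub-dictionary. The function names, the choice of ISTA as the solver, and all parameter values are our illustrative assumptions, not the patent's.

```python
import numpy as np

def sparse_code(D, X, lam=0.1, n_iter=100):
    """ISTA: approximately solve min_A 0.5*||X - D A||_F^2 + lam*||A||_1."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant (squared spectral norm)
    A = np.zeros((D.shape[1], X.shape[1]))
    for _ in range(n_iter):
        G = D.T @ (D @ A - X)              # gradient of the quadratic term
        A = A - G / L                      # gradient step
        A = np.sign(A) * np.maximum(np.abs(A) - lam / L, 0.0)  # soft threshold
    return A

def learn_subdictionary(X, n_atoms=8, n_outer=10, lam=0.1, seed=0):
    """Learn one class-specific sub-dictionary D_i from samples X (d x n)."""
    rng = np.random.default_rng(seed)
    D = rng.standard_normal((X.shape[0], n_atoms))
    D /= np.linalg.norm(D, axis=0)         # unit-norm atoms
    for _ in range(n_outer):
        A = sparse_code(D, X, lam)
        D = X @ np.linalg.pinv(A)          # least-squares dictionary update
        D /= np.linalg.norm(D, axis=0) + 1e-12  # renormalize atoms
    return D, A
```

In the full method described above, the Fisher scatter terms and the Laplacian regularizer would be added to the coding objective, one such sub-dictionary would be learned per class, and the result would be combined with the shared intra-class difference dictionary.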

Embodiment 2

[0089] Referring to Figure 2 to Figure 10, the second embodiment of the present invention differs from the first in that experiments are reported. The experimental environment is a 64-bit Windows 10 operating system with 32 GB of memory and an Intel(R) Xeon(R) CPU E5-2620 v4 @ 2.10 GHz, programmed in Matlab R2016b. The experimental images were standardized, and the CMU-PIE, AR, and LFW face databases were selected for the experiments. The comparison methods include SRC, FDDL, CRC, SVGDL, and CSICVDL. 1. AR database experiment: 100 people were randomly selected from the AR face database, and each person's pictures were divided into 5 sets. Referring to figure 2, the two faces with no illumination or expression changes were taken as training pictures, and the rest were divided into 4 sets as test pictures. Set S1 is the test data including expression changes; set S2 is the test data including illumination changes; set S3 tes...



Abstract

The invention discloses a feature extraction method based on sparse hybrid dictionary learning. The method is applied to face identification and can improve its accuracy to a certain extent. In the method, the Fisher discriminant criterion and the Laplacian matrix are used as constraints: the particularity between data categories is extracted using a category feature dictionary, the similarity of the sparse-coded data is preserved, intra-category coding dispersion is reduced, and inter-category coding dispersion is increased. Category generality is then extracted using the intra-category difference dictionary, capturing the features shared by different categories, and finally the category feature dictionary is combined with the intra-category difference dictionary. Experiments on face databases such as AR, CMU-PIE, and LFW show that the method achieves higher recognition accuracy under few-sample training.
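The constraints named in the abstract can be written, in a general form common to Fisher-discriminant dictionary learning (our notation; the patent's exact formulation is not reproduced on this page), as:

```latex
\min_{D, A}\; \|X - DA\|_F^2 + \lambda_1 \|A\|_1
  + \lambda_2 \bigl( \operatorname{tr}(S_W(A)) - \operatorname{tr}(S_B(A)) + \eta \|A\|_F^2 \bigr)
  + \beta\, \operatorname{tr}\!\bigl(A L A^{\top}\bigr)
```

Here $S_W(A)$ and $S_B(A)$ are the within-class and between-class scatter matrices of the codes $A$ (the Fisher terms that reduce intra-class and increase inter-class coding dispersion), and $L$ is a graph Laplacian that preserves the similarity of the sparse-coded data.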

Description

Technical Field
[0001] The invention relates to the technical field of computer image processing, and in particular to image feature extraction and classification.
Background
[0002] With advances in computer hardware and software technology, face recognition is gradually being applied in fields such as economic engineering and social security. Although face recognition methods based on deep learning achieve high recognition rates, they require large numbers of data samples, high-end hardware, and days of training time. By comparison, face recognition based on sparse representation is simple to train and robust to noise, and has attracted wide attention from scholars at home and abroad in recent years.
[0003] In 2009, J. Wright et al. proposed Sparse Representation Based Classification (SRC). This method is based on the illumination model, assuming that any test sample can be ...
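SRC, as introduced above, codes a test sample as a sparse combination of all training samples and assigns the class whose samples yield the smallest reconstruction residual. A minimal sketch follows; the use of ISTA for the l1 subproblem and all parameter values are our simplifying assumptions, not part of the original SRC paper or this patent.

```python
import numpy as np

def src_classify(X_train, y_train, x_test, lam=0.05, n_iter=200):
    """Simplified sparse-representation classification (after Wright et al., 2009).

    Codes x_test over all training samples with an l1 penalty (via ISTA),
    then assigns the class whose samples give the smallest residual.
    """
    D = X_train / (np.linalg.norm(X_train, axis=0) + 1e-12)  # unit-norm columns
    L = np.linalg.norm(D, 2) ** 2                            # step-size constant
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):                                  # ISTA iterations
        a = a - D.T @ (D @ a - x_test) / L
        a = np.sign(a) * np.maximum(np.abs(a) - lam / L, 0.0)
    classes = np.unique(y_train)
    residuals = []
    for c in classes:                                        # class-wise residual
        mask = (y_train == c)
        residuals.append(np.linalg.norm(x_test - D[:, mask] @ a[mask]))
    return classes[np.argmin(residuals)]
```

The residual-based decision rule shown here is the part that later dictionary-learning methods (FDDL and the present patent's hybrid dictionary) refine by learning compact, discriminative dictionaries instead of using the raw training samples.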

Claims


Application Information

IPC(8): G06K9/00, G06K9/62
CPC: G06V40/172, G06F18/28, G06F18/2135, G06F18/24
Inventors: 狄岚 (Di Lan), 矫慧文 (Jiao Huiwen), 顾雨迪 (Gu Yudi)
Owner JIANGNAN UNIV