
A Two-Stage Recognition Method Based on Non-Negative Representation Coefficients

A non-negative representation coefficient technology, applied in the field of machine learning, addresses problems such as slow calculation speed, long computation time, and a complicated calculation process, and achieves fast running speed, accurate classification results, and accurate recognition.

Active Publication Date: 2021-10-08
YANGZHOU UNIV
Cites: 4 | Cited by: 0

AI Technical Summary

Problems solved by technology

The sparse representation classification (SRC) method proposed by John Wright et al. has been widely used, but because it requires solving an l1-norm minimization, its calculation process is complex and time-consuming.
Based on this, Lei Zhang and other scholars proposed the collaborative representation based classification (CRC) method, which overcomes the slow calculation speed of SRC. However, from the perspective of non-negative matrix factorization, the negative values in the CRC representation coefficients have no physical meaning. These are the shortcomings and deficiencies of algorithms such as CRC, and the method of the present invention aims to address them.
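
For context, a minimal sketch (not the patented method) contrasting CRC's regularized least-squares coefficients, which can be negative, with a non-negative representation computed by non-negative least squares. The toy data and the regularization parameter lam are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import nnls

def crc_coefficients(X, y, lam=0.01):
    # CRC-style regularized least squares, closed-form solution:
    # alpha = (X^T X + lam * I)^{-1} X^T y  -- entries may be negative
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ y)

def nonneg_coefficients(X, y):
    # Non-negative least squares: coefficients constrained to be >= 0,
    # so each training sample can only contribute additively
    alpha, _ = nnls(X, y)
    return alpha

# toy data: 5 training samples (columns), 10-dimensional features
rng = np.random.default_rng(0)
X = rng.random((10, 5))
y = rng.random(10)
print(crc_coefficients(X, y))     # may contain negative entries
print(nonneg_coefficients(X, y))  # all entries >= 0
```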



Examples


Embodiment

[0017] In this embodiment, the FERET face database is used as the experimental data. The FERET face database contains 200 people with 7 face images per person. To verify the effectiveness and practicability of the present invention, the first m = 1, 2, 3, 4, 5 images of each person are respectively selected as training samples, and the remaining 7-m images of each person are used as test samples. The total number of training samples is therefore 200×m, and the total number of test samples is 200×(7-m). The seven face images of one person used in this embodiment are shown in Figure 1.
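
A minimal sketch of the train/test split described above, assuming the FERET images have already been loaded into an array of shape (200, 7, p); the loader and array layout are assumptions, not part of the patent.

```python
import numpy as np

def split_feret(images, m):
    """Split a FERET-style array of shape (200, 7, p) into training and
    test sets: the first m images per person are training samples, the
    remaining 7 - m are test samples."""
    p = images.shape[-1]
    train = images[:, :m, :].reshape(-1, p)    # 200*m training samples
    test = images[:, m:, :].reshape(-1, p)     # 200*(7-m) test samples
    train_labels = np.repeat(np.arange(200), m)
    test_labels = np.repeat(np.arange(200), 7 - m)
    return train, train_labels, test, test_labels
```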

[0018] In this embodiment, the following definitions are made:

[0019] let x_ij be a p-dimensional column vector representing the j-th original training sample of the i-th class, i = 1, 2, ..., c, j = 1, 2, ..., n_i, where n_i is the number of training samples of the i-th class and N = n_1 + n_2 + ... + n_c is the total number of training samples; training ...
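
A minimal sketch of the notation above, assuming the per-class samples are already available as arrays: it stacks the column vectors x_ij of each class into a single p×N training matrix with class labels.

```python
import numpy as np

def build_dictionary(samples_by_class):
    """samples_by_class: list of length c; element i is a (p, n_i) array
    whose columns are the training samples x_ij of class i (assumed layout).
    Returns the p x N training matrix X and a length-N label vector."""
    X = np.hstack(samples_by_class)                       # p x N
    labels = np.concatenate([np.full(Xi.shape[1], i)
                             for i, Xi in enumerate(samples_by_class)])
    return X, labels
```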



Abstract

The invention proposes a two-stage recognition method based on non-negative representation coefficients. The method includes: linearly representing the test sample with all training samples under the constraint that the representation coefficients are non-negative, and calculating the coefficient vector; arranging the elements of the coefficient vector in descending order and extracting the larger coefficients and their corresponding training samples; grouping the training samples obtained in the previous step by class and calculating a reconstructed image for each class; linearly representing the test sample with all reconstructed images under the same non-negativity constraint, calculating the coefficient vector, and classifying the test sample according to the residuals. The invention selects training samples according to certain screening conditions and uses them to linearly represent the test sample, thereby improving the recognition rate.
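
A minimal sketch of the two stages described in the abstract, using non-negative least squares (scipy.optimize.nnls) for the non-negativity-constrained representations. The number of retained coefficients (`keep`) and the per-class reconstruction details are illustrative assumptions, not values taken from the patent.

```python
import numpy as np
from scipy.optimize import nnls

def two_stage_classify(X, labels, y, keep=20):
    """X: p x N training matrix, labels: length-N NumPy array of class
    labels, y: p-dimensional test sample. Returns the predicted class."""
    # Stage 1: non-negative representation over all training samples
    alpha, _ = nnls(X, y)
    top = np.argsort(alpha)[::-1][:keep]     # indices of the largest coefficients

    # Reconstruct one image per class from the selected training samples
    classes = np.unique(labels[top])
    recon_cols = []
    for c in classes:
        mask = labels[top] == c
        recon_cols.append(X[:, top[mask]] @ alpha[top][mask])
    recon = np.column_stack(recon_cols)      # p x (number of surviving classes)

    # Stage 2: non-negative representation over the class reconstructions,
    # then assign the class with the smallest residual
    beta, _ = nnls(recon, y)
    residuals = [np.linalg.norm(y - recon[:, k] * beta[k])
                 for k in range(len(classes))]
    return classes[int(np.argmin(residuals))]
```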

Description

Technical field
[0001] The invention belongs to the technical field of machine learning, and in particular relates to a two-stage recognition method based on non-negative representation coefficients.
Background technique
[0002] Feature extraction is very important in the field of pattern recognition. Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) are widely used classic linear feature extraction methods. Both PCA and LDA ultimately rely on a classifier to classify the test samples. Among the more widely used classifiers is the nearest neighbor (NN) classifier, whose purpose is to assign a test sample to the class of the training sample closest to it.
[0003] The recently proposed sparse representation is a new method in the field of face recognition. Its basic idea is that, given enough training samples, any test sample can be represented by a linear combination of the training samples, and the ...
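
For context, a minimal sketch of the nearest neighbor classifier mentioned above (background art, not part of the invention), assuming training samples are stored one per column:

```python
import numpy as np

def nearest_neighbor(X_train, labels, y):
    """Assign y to the class of the closest training sample (Euclidean
    distance); X_train is p x N with one sample per column."""
    d = np.linalg.norm(X_train - y[:, None], axis=0)
    return labels[int(np.argmin(d))]
```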

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/62
CPC: G06F18/24133
Inventors: 陈才扣, 李经善, 王蓉, 王禹
Owner: YANGZHOU UNIV