
Linear characteristic extracting method used for k nearest neighbour classification

A linear-feature and k-nearest-neighbor technology, applied in the areas of instruments, character and pattern recognition, and computer components. It addresses problems such as the inability to determine the optimal classification dimension of the latent manifold, the curse of dimensionality, and high computation and storage costs.

Inactive Publication Date: 2008-04-09
FUDAN UNIV
Cites: 0 · Cited by: 7

AI Technical Summary

Problems solved by technology

High-dimensional pattern features typically bring two problems: 1) high computation and storage costs, and 2) the curse of dimensionality.
None of these three methods can determine the optimal classification dimension of the latent manifold, and all of them also suffer from the matrix-singularity problem.

Method used


Examples


Embodiment 1

[0061] Embodiment 1: application of the invention on the UCI Sonar data set, with k-nearest-neighbor classification as the classifier. The data set is randomly split into two subsets of 40% and 60%: first 40% is used for training and 60% for testing, then 60% for training and 40% for testing. Training follows the calculation steps outlined above. At test time, the test data are transformed by the linear mapping learned on the training set, and accuracy is computed with the k-nearest-neighbor classifier. The upper row shows the trend of classification accuracy as the number of dimensions increases; the middle and lower rows show the eigenvalues and their cumulative curves at each dimension. Figure 1 shows performance on both the training set and the test set; training-set performance is measured by leave-one-out. From the figure we can see that when the eigenvalue is less ...
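As a rough illustration of the evaluation protocol described above (random 40%/60% split, a linear mapping learned on the training set, accuracy measured by k-nearest neighbors), here is a minimal NumPy sketch. The data are a synthetic stand-in for Sonar, and the mapping `W` is a placeholder identity, since the patent's learned mapping is not reproduced here.

```python
import numpy as np

def knn_accuracy(X_train, y_train, X_test, y_test, k=3):
    """Majority-vote k-NN accuracy with Euclidean distance."""
    correct = 0
    for x, label in zip(X_test, y_test):
        d = np.linalg.norm(X_train - x, axis=1)
        votes = y_train[np.argsort(d)[:k]]
        vals, counts = np.unique(votes, return_counts=True)
        correct += int(vals[np.argmax(counts)] == label)
    return correct / len(y_test)

rng = np.random.default_rng(0)
# Synthetic 2-class, 60-dimensional stand-in for the Sonar data
X = np.vstack([rng.normal(0.0, 1.0, (50, 60)),
               rng.normal(1.5, 1.0, (50, 60))])
y = np.array([0] * 50 + [1] * 50)

# Random 40% / 60% train/test split, as in the embodiment
idx = rng.permutation(len(y))
n_train = int(0.4 * len(y))
tr, te = idx[:n_train], idx[n_train:]

# W stands in for the linear mapping learned on the training set
# (identity here; the patent's actual mapping is not reproduced)
W = np.eye(60)
acc = knn_accuracy(X[tr] @ W, y[tr], X[te] @ W, y[te], k=3)
```

Swapping the roles of the two subsets (60% train / 40% test) is the same call with `tr` and `te` exchanged.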

Embodiment 2

[0062] Embodiment 2: application of the invention on the ORL face data set, with k-nearest-neighbor classification as the classifier. The UMIST face data set contains 40 people, i.e., 40 classes. For each class, we randomly select p samples for training, 40p in total. Since the input image size is 56*46, the original space dimension is 2576, and the number of zero eigenvalues exceeds 2000. By the time the eigenvalue reaches zero, the classification accuracy has already stabilized; for clarity, we display the negative-eigenvalue and positive-eigenvalue parts in a pair of graphs, and the part corresponding to zero eigenvalues is not shown. Figure 2 likewise shows that when the eigenvalue is less than 0, classification accuracy increases as the dimension increases; when the eigenvalue is close to 0, accuracy stabilizes; and when the eigenvalue is greater than 0, increasing the dimension not only d...
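The leave-one-out nearest-neighbor evaluation mentioned in Embodiment 1, together with the dimension sweep implied here (adding eigenvector directions in ascending eigenvalue order, most negative first), can be sketched as follows. The symmetric matrix `M` is a random stand-in, not the discriminant matrix the patent derives from its objective function, so the accuracy trend only illustrates the mechanics of the sweep.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two well-separated 10-dimensional classes (toy data)
X = np.vstack([rng.normal(0.0, 1.0, (30, 10)),
               rng.normal(2.0, 1.0, (30, 10))])
y = np.array([0] * 30 + [1] * 30)

# Stand-in symmetric matrix; the patent derives its matrix from an
# objective function that is not reproduced here
A = rng.normal(size=(10, 10))
M = (A + A.T) / 2.0
w, V = np.linalg.eigh(M)          # eigenvalues in ascending order

def loo_nn_accuracy(Z, y):
    """Leave-one-out 1-NN accuracy, as used for the training set."""
    D = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=2)
    np.fill_diagonal(D, np.inf)   # a point may not be its own neighbor
    return float(np.mean(y[D.argmin(axis=1)] == y))

# Sweep: add eigenvector directions one at a time,
# most negative eigenvalue first
accs = [loo_nn_accuracy(X @ V[:, :d], y) for d in range(1, 11)]
```

Plotting `accs` against `d` would give a curve analogous to the upper rows of Figures 1 and 2.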



Abstract

The invention belongs to the technical fields of statistical pattern recognition and machine learning, and in particular is a linear feature extraction method for k-nearest-neighbor classification. The invention proposes that multiple classes of data points, acting under a local attractive force and a local repulsive force, form an adjacency graph model and a discriminant adjacency matrix representing that graph. A symmetric matrix is derived from the objective function; spectral analysis of this matrix shows that the maximum nearest-neighbor classification accuracy is obtained when the data are mapped into the space spanned by the eigenvectors corresponding to the matrix's negative eigenvalues. The method determines the optimal dimension of the new space for classification by spectral analysis rather than by experimental tuning. Furthermore, the method is non-parametric and non-iterative, and avoids local minima and singular matrices.
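The abstract's key step, mapping the data into the space spanned by the eigenvectors with negative eigenvalues, amounts to an eigendecomposition followed by a sign-based selection. A minimal sketch, with a toy symmetric matrix standing in for the matrix derived from the patent's objective function:

```python
import numpy as np

def negative_eigen_projection(M):
    """Keep the eigenvectors of symmetric M whose eigenvalues are negative;
    their count fixes the dimension of the new space."""
    w, V = np.linalg.eigh(M)      # ascending eigenvalues, orthonormal columns
    neg = w < 0
    return V[:, neg], int(neg.sum())

# Toy symmetric matrix in place of the one derived from the objective function
rng = np.random.default_rng(1)
A = rng.normal(size=(5, 5))
M = (A + A.T) / 2.0
P, d = negative_eigen_projection(M)
# Data would be mapped into the new space as X @ P (rows of X are patterns)
```

Because the dimension `d` is read directly off the spectrum, no experimental sweep over candidate dimensions is needed, which is the advantage the abstract claims.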

Description

Technical field [0001] The invention belongs to the technical field of statistical pattern recognition and machine learning, and in particular relates to a linear feature extraction method for k-nearest-neighbor classification. Background technique [0002] Pattern classification is one of the most fundamental research tasks in machine learning. In practical problems, each dimension of a pattern is a feature of that pattern. Since it is usually unknown which features are effective for classification, as many data features as possible are collected and handed to the classifier for judgment. Features representing patterns are therefore usually high-dimensional. High-dimensional pattern features typically bring two problems: 1) high computation and storage costs, and 2) the curse of dimensionality. The curse of dimensionality is the main problem many pattern recognition methods face in practical applications, such as face ...

Claims


Application Information

IPC(8): G06K9/46, G06K9/62
CPC: G06K9/6234, G06K9/6252, G06K9/6276, G06F18/2132, G06F18/21375, G06F18/24147
Inventors: 张巍, 薛向阳, 孙子晨, 郭跃飞
Owner: FUDAN UNIV