
Sparse self-representation subspace clustering algorithm for self-adaptive local structure embedding

A clustering algorithm with local structure learning, in the field of information technology. It addresses the problem that existing methods are sensitive to the quality of the adjacency graph, whose construction is hindered by redundant features in the input space, and achieves stable performance.

Status: Inactive
Publication Date: 2021-02-26
NANJING UNIV OF POSTS & TELECOMM
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

However, many existing graph-based methods adopt a two-stage approach and are therefore sensitive to the quality of the adjacency graph, since redundant features in the input space hinder the construction of high-quality graphs.
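To make the criticized setup concrete, the following is a minimal sketch (not taken from the patent) of a typical two-stage pipeline: a k-nearest-neighbor adjacency graph is built once from the raw features, then fixed while spectral clustering runs on it, so any damage done to the graph by redundant or noisy features cannot be corrected in the second stage. The data X, the neighborhood size k, and the cluster count are illustrative assumptions.

    import numpy as np
    from sklearn.cluster import SpectralClustering
    from sklearn.neighbors import kneighbors_graph

    def two_stage_graph_clustering(X, n_clusters, k=10):
        # Stage 1: build a k-NN adjacency graph once, from the raw (possibly redundant) features.
        A = kneighbors_graph(X, n_neighbors=k, mode="connectivity", include_self=False)
        A = 0.5 * (A + A.T).toarray()   # symmetrize so the affinity matrix is undirected
        # Stage 2: spectral clustering on the fixed graph; a poor graph cannot be repaired here.
        return SpectralClustering(n_clusters=n_clusters, affinity="precomputed").fit_predict(A)

    # Toy usage: three Gaussian clusters plus noisy extra dimensions that blur the stage-1 graph.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(c, 1.0, size=(50, 5)) for c in (0.0, 4.0, 8.0)])
    X = np.hstack([X, rng.normal(0.0, 5.0, size=(150, 20))])
    labels = two_stage_graph_clustering(X, n_clusters=3)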



Examples

Experimental program
Comparison scheme
Effect test

Specific embodiment

[0098] Figure 2 compares the clustering accuracy curves of this embodiment with those of other subspace clustering algorithms, namely PCA followed by K-means clustering, structured optimal graph feature selection (SOGFS), unsupervised discriminative feature selection (UDFS), sparse subspace clustering (SSC), and projected clustering with adaptive neighbors (PCAN), under different numbers of selected features. As the number of selected features increases, the clustering results improve; this trend indicates that the intrinsic structure of the MSRA25 dataset is embedded in a high-dimensional space, and SSS outperforms the other subspace clustering methods when it learns the local structure from the input space. As can be seen in Figure 3, this embodiment achieves good separation on the wine dataset, which directly verifies that SSS learns a discriminative structure in the subspace. Figure 4 shows the process by which the graph S is learned from the adjacency graph A; the non-spherical synth...
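The kind of curve shown in Figure 2 can be reproduced in spirit with the sketch below. It is only an illustration of the evaluation protocol described in this embodiment, not the patent's SSS model: for each candidate number of features, project the data (PCA stands in here for the baseline feature selection or projection step), run K-means, and score clustering accuracy with the usual Hungarian best-match metric (ACC). The dataset, labels, and parameter names are assumptions.

    import numpy as np
    from scipy.optimize import linear_sum_assignment
    from sklearn.cluster import KMeans
    from sklearn.decomposition import PCA

    def clustering_accuracy(y_true, y_pred):
        # Standard ACC metric: best one-to-one matching of predicted labels to
        # ground-truth labels via the Hungarian algorithm.
        y_true = np.asarray(y_true, dtype=int)
        y_pred = np.asarray(y_pred, dtype=int)
        n = max(y_true.max(), y_pred.max()) + 1
        counts = np.zeros((n, n), dtype=np.int64)
        for t, p in zip(y_true, y_pred):
            counts[p, t] += 1
        rows, cols = linear_sum_assignment(-counts)   # maximize matched counts
        return counts[rows, cols].sum() / y_true.size

    def accuracy_vs_num_features(X, y, feature_counts, n_clusters):
        # Sweep the number of retained features, cluster in the reduced space,
        # and record the accuracy at each point of the curve.
        accs = []
        for d in feature_counts:
            Z = PCA(n_components=d).fit_transform(X)
            pred = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(Z)
            accs.append(clustering_accuracy(y, pred))
        return accs

    # Toy usage with synthetic data standing in for a benchmark such as MSRA25.
    rng = np.random.default_rng(0)
    y = np.repeat(np.arange(3), 40)
    X = rng.normal(y[:, None] * 3.0, 1.0, size=(120, 60))
    print(accuracy_vs_num_features(X, y, feature_counts=[2, 5, 10, 20], n_clusters=3))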

Abstract

The invention discloses a sparse self-representation subspace clustering algorithm with self-adaptive local structure embedding, belonging to the field of information technology. The method can simultaneously identify the optimal subspace and the most distinct clustering structure in the low-dimensional space, and is superior to other two-stage subspace clustering methods. In addition, a nonlinear manifold regularizer is introduced so that the learning trade-off between the original space and the subspace can be exploited dynamically; the local structure of the original space is encoded into a dictionary by the sparse self-representation method and is learned adaptively during clustering. The non-squared l2,1-norm is adopted to minimize the residual error; unlike other methods based on the squared l2-norm, SSS achieves stable performance because the non-squared l2,1-norm model is robust to outliers and noise. Experimental results on real benchmark datasets show that the method provides more interpretable clustering results and outperforms alternative schemes.
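As a clarification only (a generic sketch, not the patent's formulation), the non-squared l2,1-norm referred to above sums the l2-norms of the rows of the residual matrix, so a single outlying sample contributes linearly rather than quadratically. The snippet below contrasts it with the squared Frobenius (l2) loss on an assumed toy residual matrix.

    import numpy as np

    def l21_norm(E):
        # ||E||_{2,1}: sum of the l2-norms of the rows of the residual matrix E,
        # so each sample's error enters the loss linearly.
        return np.linalg.norm(E, axis=1).sum()

    def squared_l2_loss(E):
        # Squared Frobenius loss: each sample's error enters quadratically.
        return np.sum(E ** 2)

    # One outlying sample inflates the squared loss far more than the l2,1 loss,
    # which is the robustness argument sketched in the abstract.
    rng = np.random.default_rng(0)
    E = rng.normal(0.0, 0.1, size=(100, 20))   # small residuals on clean samples
    E[0] += 10.0                               # a single corrupted/outlier sample
    print(l21_norm(E), squared_l2_loss(E))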

Description

Technical field

[0001] The invention belongs to the field of information technology and specifically relates to a sparse self-representation subspace (SSS) clustering model with adaptive local structure embedding. It integrates dimensionality reduction and clustering into a unified model and proposes a new clustering method that dynamically exploits the learning trade-off between the original space and the subspace and offers stable performance.

Background technique

[0002] Clustering is a fundamental technique in data mining and has been applied in a wide range of fields, including image segmentation, recommender systems, and text mining. Among the various clustering methods, those based on matrix factorization (MF) have attracted extensive research; they aim to approximate a given matrix by the product of two or more factor matrices. In this way, the K-means algorithm can be interpreted as a non-negative matrix factorization (NMF), so that the input data matrix can...
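The matrix-factorization view mentioned in the background can be illustrated with a short sketch (an assumed toy example, not material from the patent): a non-negative data matrix X is approximated by the product of two factor matrices, and when one factor is constrained to hard cluster indicators the objective reduces to a K-means-style problem. Ordinary NMF, shown below, is the relaxed analogue of that constrained problem.

    import numpy as np
    from sklearn.decomposition import NMF

    # Approximate a non-negative data matrix X by the product of two factors W and H.
    rng = np.random.default_rng(0)
    X = np.abs(rng.normal(size=(100, 30)))        # assumed non-negative toy data

    model = NMF(n_components=5, init="nndsvda", max_iter=500, random_state=0)
    W = model.fit_transform(X)                    # n_samples x k, soft "assignment-like" factor
    H = model.components_                         # k x n_features basis factor
    print(np.linalg.norm(X - W @ H, "fro"))       # reconstruction error of the factorization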

Claims

Application Information

IPC(8): G06K9/62
CPC: G06F18/23213
Inventors: 李大鹏, 戴金森, 蒋锐, 王小明, 徐友云
Owner: NANJING UNIV OF POSTS & TELECOMM