
Multi-view subspace clustering method based on joint subspace learning

A subspace learning and clustering technology, applied in the fields of instruments, character and pattern recognition, computer components, etc., which addresses the problem that existing methods depend on the quality of the original features, affecting the clustering effect

Inactive Publication Date: 2019-10-25
GUANGDONG UNIV OF TECH
Cites: 0 · Cited by: 16

AI Technical Summary

Problems solved by technology

[0004] The present invention provides a multi-view subspace clustering method based on joint subspace learning, to solve the problem that existing multi-view clustering methods depend on the quality of the original features, which affects the clustering effect.



Examples


Embodiment 1

[0073] A multi-view subspace clustering method based on joint subspace learning, as shown in Figure 1, includes the following steps:

[0074] S1. Obtain the multi-view feature matrix of the original data set;

[0075] For each multi-view image in the original data set, different types of data features are extracted. Data features of the same type form the feature matrix of one view, and the feature matrices of all views together form the multi-view feature matrix of the original data set, where X^(v) ∈ R^(m_v × n) represents the feature matrix of the v-th view, m_v is the dimension of the data features of the v-th view, and n is the number of samples; V ≥ 2 denotes the number of views.
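Step S1 above can be sketched as follows. This is a minimal illustration with synthetic random features standing in for the extracted per-view features; the view count, dimensions, and sample count are illustrative, not taken from the patent:

```python
import numpy as np

# Hypothetical sketch of step S1: build the multi-view feature matrix.
# Each view v has its own feature matrix X_v of shape (m_v, n), where
# m_v is the feature dimension of view v and n is the number of samples.
rng = np.random.default_rng(0)
n = 8                      # number of samples (columns)
dims = [5, 3, 4]           # m_v for V = 3 synthetic views

# One feature matrix per view; columns are aligned across views
# (column j in every view describes the same sample j).
X = [rng.standard_normal((m_v, n)) for m_v in dims]

V = len(X)
assert V >= 2              # the method requires at least two views
for X_v in X:
    assert X_v.shape[1] == n
```

The key invariant is that all views share the same column (sample) ordering, since the later self-reconstruction step learns a single coefficient matrix Z shared across views.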

[0076] S2. Construct the objective function of multi-view subspace clustering based on joint subspace learning;

[0077] S2.1. Perform joint subspace learning on multi-view data features, and find the low-dimension...
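The joint subspace learning in S2.1 seeks, for each view, an orthogonal projection into a low-dimensional embedding. As a hedged single-view sketch (the patent's update couples all views and is not reproduced here), such a projection with the orthogonality property P^(v)T P^(v) = I can be illustrated with a PCA-style SVD step:

```python
import numpy as np

# Hedged sketch: obtain an orthogonal projection P_v mapping view v into
# a d-dimensional embedding from the leading left singular vectors of
# X_v.  Dimensions are illustrative.
rng = np.random.default_rng(1)
m_v, n, d = 20, 50, 5
X_v = rng.standard_normal((m_v, n))

U, _, _ = np.linalg.svd(X_v, full_matrices=False)
P_v = U[:, :d]                      # m_v x d, orthonormal columns

# The orthogonality constraint P_v^T P_v = I_d holds by construction
err = np.linalg.norm(P_v.T @ P_v - np.eye(d))

low_dim = P_v.T @ X_v               # d x n low-dimensional embedding
```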

Embodiment 2

[0122] A multi-view subspace clustering method based on joint subspace learning, as shown in Figure 1, includes the following steps:

[0123] S1. Obtain the multi-view feature matrix of the original dataset

[0124] In this embodiment, the original data set used is the ORL face data set. The ORL face dataset contains a total of 400 face images from 40 people, each with 10 images.

[0125] For the multi-view images of the ORL face dataset, features of three views are extracted: gray-value features, LBP features, and Gabor wavelet features.
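The three feature types above can be illustrated on a synthetic patch (the patent does not specify exact LBP or Gabor parameters, so the neighbourhood, filter, and the 8×8 image below are all illustrative assumptions):

```python
import numpy as np

# Hedged sketch of the three per-image feature types, on a synthetic
# 8x8 patch instead of an actual ORL face image.
rng = np.random.default_rng(2)
img = rng.integers(0, 256, size=(8, 8)).astype(float)

# (1) gray-value feature: the flattened pixel intensities
gray_feat = img.ravel()

# (2) LBP feature: minimal 8-neighbour local binary pattern, computed
#     for each interior pixel by thresholding its neighbours against it
def lbp(image):
    c = image[1:-1, 1:-1]
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(shifts):
        nb = image[1 + dy:image.shape[0] - 1 + dy,
                   1 + dx:image.shape[1] - 1 + dx]
        code += (nb >= c) * (1 << bit)
    return code.ravel()

lbp_feat = lbp(img)

# (3) Gabor-like feature: response to one oriented Gaussian-cosine
#     filter (a single hypothetical scale/orientation)
y, x = np.mgrid[-3.5:4.5, -3.5:4.5]
kernel = np.exp(-(x**2 + y**2) / 8.0) * np.cos(0.5 * x)
gabor_feat = np.array([np.sum(img * kernel)])

print(gray_feat.size, lbp_feat.size, gabor_feat.size)
```

Each feature type then becomes one column in the corresponding view's feature matrix, giving V = 3 views for the ORL experiment.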

[0126] S2. Construct the objective function of multi-view subspace clustering based on joint subspace learning;

[0127]

[0128] s.t. P^(v)T X^(v) = P^(v)T X^(v) Z + E^(v),  P^(v)T P^(v) = I,  ‖Z‖₀ ≤ T₀,  v ∈ {1, 2, 3}

[0129] In this Embodiment 2, the balance factor is set to α = 0.7, and the dimension of the low-dimensional embedding space to d = 150;
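The constraint in [0128] can be read numerically as follows: in the embedding P^(v)T X^(v), each sample is self-reconstructed by the shared coefficient matrix Z, with E^(v) absorbing the residual. This is an illustrative sketch with random data and a hypothetical Z, not the solver of step S3:

```python
import numpy as np

# Hedged numerical reading of the constraint
#   P_v^T X_v = P_v^T X_v Z + E_v,  with  P_v^T P_v = I.
# Dimensions are small and illustrative (the embodiment uses d = 150).
rng = np.random.default_rng(3)
m_v, n, d = 12, 10, 4
X_v = rng.standard_normal((m_v, n))

U, _, _ = np.linalg.svd(X_v, full_matrices=False)
P_v = U[:, :d]                          # orthonormal columns
orth_err = np.linalg.norm(P_v.T @ P_v - np.eye(d))

Z = rng.standard_normal((n, n)) * 0.1   # hypothetical coefficients
np.fill_diagonal(Z, 0.0)                # forbid trivial self-representation

low = P_v.T @ X_v                       # d x n embedding
E_v = low - low @ Z                     # residual implied by the constraint

# The equality constraint then holds exactly by construction:
gap = np.linalg.norm(low - (low @ Z + E_v))
```

In the actual method, Z is not arbitrary: the objective additionally pushes Z toward a low-rank, sparse matrix shared across all views, which is what makes the entries of Z usable as data similarities for clustering.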

[0130] S3. Using an alternate optimization method to s...



Abstract

The invention discloses a multi-view subspace clustering method based on joint subspace learning. The method builds an objective function for multi-view subspace clustering based on joint subspace learning, searches for low-dimensional embedding spaces of the different views' data features, and converts and fuses the original features, reducing the impact of noise in the original data. Data self-reconstruction learning is then carried out in the low-dimensional embedding space, searching for a low-rank, sparse self-reconstruction coefficient matrix that is consistent across views, so that a more accurate similarity relationship between data points is obtained and the robustness of the algorithm is enhanced. The method unifies joint subspace learning and data self-reconstruction learning into one optimization framework, solved with an alternating optimization method; the two components mutually reinforce each other, so that subspace clustering performance is greatly enhanced.

Description

Technical field

[0001] The present invention relates to the technical fields of computer vision and pattern recognition, and more specifically, to a multi-view subspace clustering method based on joint subspace learning.

Background technique

[0002] Compared with single-view clustering methods, multi-view learning obtains a more comprehensive and accurate representation of the data by exploiting the complementary and consistency information between different views, so learning models based on multi-view features can usually achieve better clustering performance.

[0003] Multi-view clustering methods can generally be divided into methods based on multi-graph fusion and methods based on subspace clustering. A multi-graph-fusion method first constructs a local adjacency graph for each view, computing edge weights to represent the similarity relationships between data points, and then fuses the adjacency graphs of all views into a new g...
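The multi-graph-fusion idea described in [0003] can be sketched as follows. The data, kernel width, fusion-by-averaging rule, and two-cluster spectral split are illustrative choices for the sketch, not details from the patent:

```python
import numpy as np

# Hedged sketch of multi-graph fusion: build a similarity graph per
# view, fuse by averaging, then partition the fused graph spectrally.
rng = np.random.default_rng(4)
n_per = 5

# Two views of the same 10 samples; in each view the samples form two
# well-separated groups (samples 0-4 vs 5-9).
views = []
for _ in range(2):
    a = rng.standard_normal((n_per, 3))
    b = rng.standard_normal((n_per, 3)) + 8.0
    views.append(np.vstack([a, b]))

def gaussian_graph(X, sigma=2.0):
    """Adjacency graph with Gaussian-kernel edge weights."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma**2))
    np.fill_diagonal(W, 0.0)
    return W

# Fuse the per-view graphs into one (here: a plain average)
W = sum(gaussian_graph(X) for X in views) / len(views)

# Spectral step: the sign of the Fiedler vector of the graph Laplacian
# gives a 2-way partition of the fused graph
L = np.diag(W.sum(1)) - W
_, vecs = np.linalg.eigh(L)
labels = (vecs[:, 1] > 0).astype(int)
```

As the background notes, such methods operate directly on similarities computed from the original features, which is exactly the dependence on raw-feature quality that the present invention targets.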

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62
CPC: G06F18/2321; G06F18/251
Inventors: 孟敏, 兰孟城, 武继刚
Owner GUANGDONG UNIV OF TECH