
Structured sparse representation and low-dimension embedding combined dictionary learning method

A sparse representation and low-dimensional embedding technology, applied in character and pattern recognition, instruments, computer parts, etc., that addresses the weak class-discrimination ability and poor discriminability of the coded representation coefficients in existing dictionary learning methods

Status: Inactive | Publication Date: 2018-09-25
XIAN UNIV OF TECH
Cites: 0 | Cited by: 5
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to provide a dictionary learning method that combines structured sparse representation and low-dimensional embedding, which solves the problem in prior-art dictionary learning methods that, owing to the high dimensionality of the training sample data and the lack of a dictionary constrained to a strict block-diagonal structure, the coded representation coefficients have weak class-discrimination ability and poor discriminability.

Method used




Embodiment Construction

[0054] The present invention will be described in detail below in combination with specific embodiments.

[0055] The present invention is a dictionary learning method that combines structured sparse representation and low-dimensional embedding. Dictionary construction and learning of the dimensionality-reduction projection matrix are carried out in parallel and alternately: the sparse representation coefficient matrix is forced to have a block-diagonal structure in the low-dimensional projection space, which enhances the inter-class incoherence of the dictionary, while the intra-class correlation of the dictionary is maintained by exploiting the low-rank property of the representation coefficients on each class sub-dictionary. Dictionary construction and projection learning thus promote each other so that the sparse structure of the data is fully preserved and coding coefficients with stronger class discrimination are obtained. Specifically, the method proceeds as follows:

[0056] Step 1. Read in the feature data se...
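
Since the remaining steps are truncated in this extract, the following is only a structural sketch, in Python/NumPy, of an alternating scheme of the kind paragraph [0055] describes: project the data, code each class on its own sub-dictionary so that the coefficient matrix stays block-diagonal by construction, refit the class sub-dictionaries, then refit the projection. All shapes, the synthetic data, the use of scikit-learn's OMP coder, and the simplified least-squares updates are assumptions for illustration; in particular, the low-rank (nuclear-norm) term of the method is omitted, so this is not the patent's actual optimization.

    import numpy as np
    from sklearn.decomposition import SparseCoder

    # Structural sketch only: assumed shapes, synthetic data, and simplified
    # update rules; not the patent's actual update equations.
    rng = np.random.default_rng(0)
    n_features, n_samples, n_classes, atoms_per_class, dim = 100, 60, 3, 10, 30
    Y = rng.standard_normal((n_features, n_samples))            # training features (columns)
    labels = np.repeat(np.arange(n_classes), n_samples // n_classes)

    # Initialize the projection P (dim x n_features) and dictionary D (dim x n_atoms).
    P = np.linalg.qr(rng.standard_normal((n_features, dim)))[0].T
    D = rng.standard_normal((dim, n_classes * atoms_per_class))
    D /= np.linalg.norm(D, axis=0, keepdims=True)

    for it in range(5):                     # alternate coding / dictionary / projection updates
        Z = P @ Y                           # project data into the low-dimensional space
        # Sparse coding: each class is coded only on its own sub-dictionary, so the
        # full coefficient matrix X keeps a block-diagonal structure by construction.
        X = np.zeros((D.shape[1], n_samples))
        for c in range(n_classes):
            idx = labels == c
            sl = slice(c * atoms_per_class, (c + 1) * atoms_per_class)
            coder = SparseCoder(dictionary=D[:, sl].T, transform_algorithm="omp",
                                transform_n_nonzero_coefs=5)
            X[sl, idx] = coder.transform(Z[:, idx].T).T
        # Dictionary update: least-squares refit of each class sub-dictionary,
        # followed by re-normalizing its atoms to unit length.
        for c in range(n_classes):
            idx = labels == c
            sl = slice(c * atoms_per_class, (c + 1) * atoms_per_class)
            Dc = Z[:, idx] @ np.linalg.pinv(X[sl, :][:, idx])
            D[:, sl] = Dc / (np.linalg.norm(Dc, axis=0, keepdims=True) + 1e-12)
        # Projection update: least-squares fit so that P @ Y approximates D @ X,
        # then re-orthonormalize the rows of P.
        P = (D @ X) @ np.linalg.pinv(Y)
        P = np.linalg.qr(P.T)[0].T

Coding each class separately is simply the easiest way to enforce the block-diagonal structure in a sketch; a joint solver with an explicit structured-sparsity penalty and the low-rank term would be the more faithful alternative.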



Abstract

The invention discloses a dictionary learning method that combines structured sparse representation and low-dimensional embedding. Dictionary construction and learning of the dimensionality-reduction projection matrix are carried out in parallel and alternately; a sparse representation coefficient matrix forced to have a block-diagonal structure in the low-dimensional projection space is used to enhance the inter-class incoherence of the dictionary, while the intra-class coherence of the dictionary is maintained via the low-rank property of the representation coefficients on each class sub-dictionary. Dictionary construction and projection learning promote each other so that the sparse structure of the data is fully preserved and coding coefficients with stronger class-discrimination capability are obtained. The method thereby solves the problem in prior-art dictionary learning methods that, owing to the high dimensionality of the training sample data and the lack of a dictionary strictly constrained to a block-diagonal structure, the coded representation coefficients have weak discriminating and distinguishing capabilities.
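
The extract does not give the exact objective function; as an illustrative formalization only, a joint objective matching the description above can be written as follows, where Y collects the training samples as columns, P is the dimensionality-reduction projection, D = [D_1, ..., D_C] is the dictionary composed of class sub-dictionaries, X holds the representation coefficients, and the trade-off weights λ1, λ2 are assumed symbols:

    \min_{P,\,D,\,X}\ \lVert P Y - D X \rVert_F^2 \;+\; \lambda_1 \sum_{c=1}^{C} \lVert X_{c,c} \rVert_* \;+\; \lambda_2 \sum_{c=1}^{C} \sum_{k \neq c} \lVert X_{k,c} \rVert_F^2 \qquad \text{s.t. } P P^{\top} = I

Here X_{k,c} denotes the coefficients of the class-c samples on sub-dictionary D_k: the nuclear norm on the diagonal blocks X_{c,c} expresses the low-rank, intra-class coherent structure, while penalizing the off-diagonal blocks X_{k,c} (k ≠ c) pushes X toward the block-diagonal structure that yields inter-class incoherence in the low-dimensional projection space.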

Description

Technical field

[0001] The invention belongs to the technical field of digital image processing, and in particular relates to a dictionary learning method combining structured sparse representation and low-dimensional embedding.

Background technique

[0002] The core idea of sparse representation is based on the objective fact that many signals in nature can be represented or coded by a linear combination of only a few atoms from an overcomplete dictionary. The most critical issue in sparse representation research is the construction of dictionaries with strong representation capability. At present, sparse representation technology is widely used in many application fields, such as image classification, face recognition and human action recognition.

[0003] Dictionary learning is dedicated to learning an optimal dictionary from training samples in order to better represent or encode a given signal or feature. For classification recognition based on ...
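
As a small, self-contained illustration of the sparse-representation idea in paragraph [0002] (not part of the patent), the snippet below builds a synthetic overcomplete dictionary, constructs a signal from five of its atoms, and recovers a 5-sparse code with orthogonal matching pursuit via scikit-learn's SparseCoder; all sizes and names are arbitrary choices for the example.

    import numpy as np
    from sklearn.decomposition import SparseCoder

    # Synthetic illustration: a signal built from 5 atoms of an overcomplete
    # dictionary is recovered as a 5-sparse code by orthogonal matching pursuit.
    rng = np.random.default_rng(0)
    n_atoms, n_features = 256, 64                  # overcomplete: more atoms than dimensions
    D = rng.standard_normal((n_atoms, n_features))
    D /= np.linalg.norm(D, axis=1, keepdims=True)  # unit-norm atoms

    support = rng.choice(n_atoms, size=5, replace=False)
    signal = rng.standard_normal(5) @ D[support]   # linear combination of only 5 atoms

    coder = SparseCoder(dictionary=D, transform_algorithm="omp",
                        transform_n_nonzero_coefs=5)
    code = coder.transform(signal.reshape(1, -1))[0]
    print("non-zero coefficients:", np.count_nonzero(code))            # expect 5
    print("reconstruction error:", np.linalg.norm(code @ D - signal))  # expect ~0

With a random unit-norm dictionary and a signal that truly is 5-sparse, OMP recovers the support almost surely, so the printed reconstruction error is essentially zero.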

Claims


Application Information

IPC(8): G06K9/46, G06K9/62
CPC: G06V10/40, G06V10/513, G06F18/214
Inventors: 陈万军, 张二虎, 蔺广逢
Owner: XIAN UNIV OF TECH