
Remote sensing image dictionary learning classification method based on utilization of spatial relationship

A remote sensing image dictionary learning technology, applied in character and pattern recognition, instruments, computer parts, etc. It addresses the problem that some pixels cannot be effectively sparsely represented and therefore cannot be correctly classified, and achieves the effect of a flexible training method.

Active Publication Date: 2019-11-05
NANJING UNIV

AI Technical Summary

Problems solved by technology

When the number of selected dictionary atoms is small, the resulting dictionary is relatively small and usually cannot provide an effective sparse representation for some pixels, so those pixels cannot be effectively and correctly classified.


Embodiment Construction

[0032] The present invention is further illustrated below in conjunction with specific embodiments. It should be understood that these embodiments are used only to illustrate the present invention and are not intended to limit its scope. After reading the present invention, those skilled in the art will understand that all modifications of its various equivalent forms fall within the scope defined by the appended claims of the present application.

[0033] The purpose of the remote sensing image dictionary learning classification method using spatial relationships is to train a complete dictionary from local neighbor spatial relationship information, according to the homogeneity and heterogeneity of the local spatial neighborhood of the remote sensing image, and to sparsely encode each pixel neighborhood of the remote sensing image with the optimal dictionary to obtain the sparse discriminant coefficient features, ...
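The "joint representation" of a pixel neighborhood described here can be illustrated with a small sketch. The following is not the patent's model: it assumes an l2,1 (row-sparse) penalty as a stand-in for the joint-representation constraint, so that all pixels in one p-neighborhood are encouraged to use the same small subset of dictionary atoms, and it uses scikit-learn's MultiTaskLasso purely as an illustrative solver over a pre-trained dictionary.

```python
# Illustrative joint sparse coding of one p-neighborhood (assumed formulation,
# not taken from the patent): solve min ||Y - D^T A||_F^2 + alpha * ||A||_{2,1}
# so the codes of all neighborhood pixels share a common support.
import numpy as np
from sklearn.linear_model import MultiTaskLasso


def joint_sparse_code(neighborhood, dictionary, alpha=0.1):
    """neighborhood: (p*p, bands) spectra of one p-neighborhood.
    dictionary:   (n_atoms, bands) pre-trained dictionary.
    Returns codes of shape (p*p, n_atoms) with row-sparse structure."""
    # X = D^T (bands x n_atoms), Y = neighborhood^T (bands x p*p);
    # MultiTaskLasso returns coef_ of shape (n_tasks, n_features) = (p*p, n_atoms).
    model = MultiTaskLasso(alpha=alpha, max_iter=2000)
    model.fit(dictionary.T, neighborhood.T)
    return model.coef_
```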



Abstract

The invention discloses a remote sensing image dictionary learning classification method based on the utilization of the spatial relationship. The method comprises the following steps: first, regarding the p-neighborhood set of each pixel as a training unit, and extracting and expressing the spatial relationship information of each training unit; then, introducing local neighbor spatial relationship information to construct a dictionary learning model based on local neighbor region joint representation, and training it with the features extracted from each training unit by means of an online dictionary updating mechanism to obtain an optimal dictionary set; and finally, performing sparse coding on the p-neighborhood set associated with each pixel based on the optimal dictionary obtained by training, and training a linear support vector machine model to classify unlabeled pixels based on the obtained sparse discrimination coefficient features and labeling information. According to the invention, joint sparse representation is carried out on the pixels of the local neighbor area of the remote sensing image, so that the constructed dictionary learning model can fully perceive potential spatial relationship information in the image and thus accurately identify ground object targets.
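The abstract describes a three-stage pipeline (neighborhood feature extraction, online dictionary learning, sparse coding plus a linear SVM). Below is a minimal sketch of such a pipeline using off-the-shelf scikit-learn stand-ins: MiniBatchDictionaryLearning for the online dictionary update, sparse_encode for the coding step, and LinearSVC as the classifier. The patch size, dictionary size, sparsity parameters, and the single shared dictionary are illustrative assumptions, not values or design details taken from the patent.

```python
# Sketch of a neighborhood-based dictionary-learning classifier (assumed stand-in
# components; not the patent's exact model).
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning, sparse_encode
from sklearn.svm import LinearSVC


def extract_neighborhood_features(image, p=3):
    """Treat the p-neighborhood of each pixel as one training unit:
    stack the spectra of the p*p window into a single feature vector."""
    h, w, bands = image.shape
    r = p // 2
    padded = np.pad(image, ((r, r), (r, r), (0, 0)), mode="reflect")
    feats = np.empty((h * w, p * p * bands))
    for i in range(h):
        for j in range(w):
            feats[i * w + j] = padded[i:i + p, j:j + p, :].reshape(-1)
    return feats


def classify(image, train_idx, train_labels, p=3, n_atoms=128, alpha=1.0):
    feats = extract_neighborhood_features(image, p)

    # Online (mini-batch) dictionary learning over the training units.
    dico = MiniBatchDictionaryLearning(n_components=n_atoms, alpha=alpha,
                                       batch_size=64, random_state=0)
    dico.fit(feats[train_idx])

    # Sparse-code every neighborhood against the learned dictionary; the
    # coefficients act as sparse discriminant features.
    codes = sparse_encode(feats, dico.components_,
                          algorithm="lasso_lars", alpha=alpha)

    # Linear SVM on the labeled codes, then predict all pixels.
    svm = LinearSVC().fit(codes[train_idx], train_labels)
    return svm.predict(codes)
```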

Description

Technical Field

[0001] The invention relates to a remote sensing image dictionary learning classification method that uses spatial relationships. It can be applied to fine surface interpretation tasks in complex hyperspectral image scenes, and belongs to the technical field of intelligent interpretation of remote sensing big data.

Background Technique

[0002] At present, new machine learning methods and models have become a hot research direction in remote sensing image data processing. Sparse representation, as a new machine learning technique, exploits the high redundancy of massive high-dimensional data and the sparsity of the signals of interest, and can effectively extract ground object information from remote sensing images. Existing sparse-representation-based remote sensing image classification methods usually construct a dictionary set directly from the labeled training samples, and determine the categories of unlabeled pixels and features on the image...
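The background refers to the conventional approach in which the dictionary is built directly from labeled training samples. As a point of reference, here is a minimal sketch of such a baseline; the OMP coder, sparsity level, and residual-based decision rule are common choices assumed for illustration, not details taken from this patent.

```python
# Baseline sparse-representation classification (assumed formulation): the
# dictionary is the stack of labeled training spectra, each test pixel is
# sparse-coded against it, and the class whose atoms give the smallest
# reconstruction residual wins.
import numpy as np
from sklearn.decomposition import sparse_encode


def src_classify(test_pixels, train_pixels, train_labels, n_nonzero=10):
    dictionary = train_pixels                      # atoms = labeled samples
    codes = sparse_encode(test_pixels, dictionary,
                          algorithm="omp", n_nonzero_coefs=n_nonzero)
    classes = np.unique(train_labels)
    residuals = np.empty((test_pixels.shape[0], classes.size))
    for k, c in enumerate(classes):
        mask = (train_labels == c)
        # Reconstruct each pixel using only the atoms of class c.
        recon = codes[:, mask] @ dictionary[mask]
        residuals[:, k] = np.linalg.norm(test_pixels - recon, axis=1)
    return classes[np.argmin(residuals, axis=1)]
```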

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/62
CPC: G06F18/2136; G06F18/28; G06F18/24; G06F18/2411; G06F18/214
Inventors: GAN Le, ZHAN Dechuan
Owner: NANJING UNIV