
Polarized SAR (Synthetic Aperture Radar) image classifying method based on cooperative representation and deep learning.

A method based on collaborative representation and deep learning, applied in the field of image processing. It addresses the high computational complexity of existing approaches and achieves the effects of improving classification accuracy, reducing the time consumed by classification, and reducing computational complexity.

Active Publication Date: 2015-07-01
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

However, due to the multi-class division and merging in the Freeman decomposition, this method has high computational complexity.




Embodiment Construction

[0027] Referring to figure 1, the specific implementation steps of the present invention are as follows:

[0028] Step 1, calculating the polarization covariance matrix C and the total power characteristic parameter S.

[0029] (1a) Input the 3×3 polarization coherence matrix T of each pixel of the polarization SAR image;

[0030] (1b) Calculate the polarization covariance matrix C of each pixel by the following formula: C=M*T*M',

[0031] In the formula, M = [1/sqrt(2)]*m, m = [1 0 1; 1 0 -1; 0 sqrt(2) 0], sqrt(2) represents the square root of 2, and M' represents the transpose of M.

[0032] (1c) Use the three diagonal elements of T, namely T11, T22 and T33, to form the total power characteristic parameter: S = T11 + T22 + T33.
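For concreteness, the following is a minimal sketch of step 1 in Python/NumPy, assuming each pixel's coherency matrix T is available as a 3×3 complex array; the function and variable names are illustrative and not taken from the patent.

```python
# Minimal sketch of Step 1 (per-pixel covariance matrix C and total power S).
# Assumes T is a 3x3 complex Hermitian coherency matrix for one pixel.
import numpy as np

# Transformation matrix M = (1/sqrt(2)) * [1 0 1; 1 0 -1; 0 sqrt(2) 0]
M = (1.0 / np.sqrt(2)) * np.array([[1.0, 0.0,         1.0],
                                   [1.0, 0.0,        -1.0],
                                   [0.0, np.sqrt(2),  0.0]])

def step1_features(T):
    """Return the covariance matrix C and total power S for one pixel."""
    C = M @ T @ M.T                               # C = M * T * M'
    S = np.real(T[0, 0] + T[1, 1] + T[2, 2])      # S = T11 + T22 + T33
    return C, S

# Usage on a dummy coherency matrix (Hermitian, positive semi-definite)
T_example = np.eye(3, dtype=complex)
C_example, S_example = step1_features(T_example)
```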

[0033] Step 2, extracting polarization features.

[0034] (2a) Decompose two scattering parameters, the scattering entropy H and the anti-entropy A, from the polarization coherence matrix T of each pixel through the Cloude de...
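The following is a hedged sketch of how H and A can be obtained from T under the usual Cloude-Pottier definitions (eigenvalue pseudo-probabilities, base-3 logarithm for H, anisotropy from the two smaller eigenvalues); the patent text is truncated above, so this exact formulation is an assumption rather than a quotation.

```python
# Sketch of (2a): scattering entropy H and anti-entropy (anisotropy) A,
# assuming the standard Cloude-Pottier eigenvalue decomposition of T.
import numpy as np

def cloude_entropy_anisotropy(T, eps=1e-12):
    """H and A of a 3x3 Hermitian coherency matrix T."""
    eigvals = np.linalg.eigvalsh(T)            # ascending, real for Hermitian T
    lam = np.clip(eigvals[::-1], eps, None)    # descending order, avoid log(0)
    p = lam / lam.sum()                        # pseudo-probabilities
    H = -np.sum(p * np.log(p) / np.log(3))     # base-3 log keeps H in [0, 1]
    A = (lam[1] - lam[2]) / (lam[1] + lam[2])  # anisotropy of the minor eigenvalues
    return H, A

# Example on a diagonal coherency matrix
H_example, A_example = cloude_entropy_anisotropy(np.diag([2.0, 1.0, 0.5]))
```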


Abstract

The invention discloses a polarized SAR (Synthetic Aperture Radar) image classifying method based on cooperative representation and deep learning, and mainly solves the problems that existing methods are high in computation complexity and low in classification precision. The method comprises the following steps: 1, inputting a polarized SAR image and extracting the polarization characteristics of the image; 2, selecting a training sample set according to the actual ground features, and selecting the pixel points of the entire image as a test sample set; 3, taking the characteristics of the training sample set as an initial dictionary, and learning the initial dictionary by K-SVD (K-Singular Value Decomposition) to obtain a learned dictionary; 4, collaboratively representing the training sample set and the test sample set with the learned dictionary to obtain the representation coefficients of the training sample set and the test sample set; 5, deeply learning the representation coefficients of the training sample set and the test sample set so as to obtain a more essential characteristic representation; and 6, carrying out polarized SAR image classification on the representation coefficients after deep learning with a libSVM (Support Vector Machine library) classifier. The SAR image classifying method provided by the invention is low in computation complexity and high in classification accuracy, and is applicable to polarized SAR image classification.
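To make step 4 concrete, the sketch below uses the standard closed-form collaborative representation over a learned dictionary, alpha = (D^T D + lambda*I)^(-1) D^T x; the dictionary D, the regularization weight lam, and the random sample matrix X are placeholders, since the abstract does not give numerical settings or spell out this formula.

```python
# Hedged sketch of the collaborative-representation step (step 4 of the abstract).
import numpy as np

def collaborative_coefficients(D, X, lam=0.01):
    """Representation coefficients of the columns of X over dictionary D (columns = atoms)."""
    k = D.shape[1]
    # Precompute the projection matrix once; it is shared by all samples.
    P = np.linalg.solve(D.T @ D + lam * np.eye(k), D.T)
    return P @ X                                  # shape: (num_atoms, num_samples)

# Example with random data standing in for polarization-feature vectors
rng = np.random.default_rng(0)
D = rng.standard_normal((6, 40))                  # 6-dim features, 40 learned atoms
X = rng.standard_normal((6, 100))                 # 100 test samples
coeffs = collaborative_coefficients(D, X)
```

Because the projection matrix depends only on D and lam, it is computed once and reused for every training and test sample, which is what keeps the representation step computationally cheap.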

Description

Technical field

[0001] The invention belongs to the technical field of image processing, and in particular relates to a polarization SAR image classification method, which can be used for ground object recognition.

Background technique

[0002] Radar is an active detection system that can work around the clock. It can penetrate surfaces to a certain extent and vary the frequency and intensity of the emitted waves. Synthetic aperture radar (SAR) is an imaging radar technology: it uses the relative motion of the radar and the target to synthesize a smaller real antenna aperture into a larger equivalent antenna aperture by means of data processing, and has the advantages of all-weather, all-time operation and high resolution. Polarization SAR is a new type of radar used to measure echo signals. It can record the phase difference information of combined echoes in different polarization states, and can perform full polarization measurement and imaging of targets, which greatly improves t...


Application Information

IPC(8): G06K9/62
Inventor: 焦李成, 马文萍, 汤玫, 王爽, 刘红英, 侯彪, 杨淑媛, 屈嵘, 马晶晶
Owner: XIDIAN UNIV