Polarization SAR ground object classification method based on self-paced learning convolutional neural network

A convolutional neural network and ground object classification technology, applied in the field of polarimetric SAR ground object classification, ground object classification and target recognition. It addresses the problem that polarimetric feature parameters alone cannot fully express the image and therefore degrade the final classification result, and achieves a reduced impact of noise, improved classification accuracy and good classification performance.

Active Publication Date: 2018-09-21
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

However, the disadvantage of this method is that good selection criteria for the polarimetric feature parameters are still lacking, and using only the polarimetric feature parameters of the SAR image cannot fully express the image, which directly affects the final classification result.

Method used



Examples


Embodiment Construction

[0027] The embodiments and effects of the present invention will be further described in detail below in conjunction with the accompanying drawings.

[0028] Referring to figure 1, the implementation steps of the present invention are as follows:

[0029] Step 1. Extract the polarimetric scattering matrix S and the pseudo-color RGB map under the Pauli basis.

[0030] Download the original polarimetric SAR data of the Flevoland area in the Netherlands from the Internet, and use the polSARpro_v4.0 software to transform the original data, obtaining the polarization scattering matrix S and the pseudo-color RGB image under the Pauli basis of the fully polarimetric SAR.
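The Pauli-basis pseudo-color image can be formed directly from the scattering matrix elements. The sketch below is illustrative only: it assumes the channels S_HH, S_HV and S_VV have already been read into NumPy arrays (for example, after export from polSARpro_v4.0), and it uses the common color assignment R = |S_HH - S_VV|^2 / 2, G = 2|S_HV|^2, B = |S_HH + S_VV|^2 / 2; the patent does not specify its exact scaling.

```python
import numpy as np

def pauli_rgb(s_hh, s_hv, s_vv):
    """Build a Pauli-basis pseudo-color image from the scattering matrix elements.

    s_hh, s_hv, s_vv: complex 2-D arrays of the same shape (one value per pixel).
    Returns an H x W x 3 float array with values in [0, 1].
    """
    # Pauli decomposition components
    a = (s_hh + s_vv) / np.sqrt(2)   # odd-bounce (surface) scattering
    b = (s_hh - s_vv) / np.sqrt(2)   # even-bounce (double-bounce) scattering
    c = np.sqrt(2) * s_hv            # volume scattering

    # Common color assignment: R = |b|^2, G = |c|^2, B = |a|^2
    rgb = np.stack([np.abs(b) ** 2, np.abs(c) ** 2, np.abs(a) ** 2], axis=-1)

    # Per-channel normalization so the pseudo-color image is viewable
    rgb = rgb / (rgb.max(axis=(0, 1), keepdims=True) + 1e-12)
    return np.clip(rgb, 0.0, 1.0)
```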

[0031] Step 2. Construct a sample set and select training samples and test samples.

[0032] This step forms a three-dimensional matrix X for each pixel from its polarization scattering matrix S, its RGB values in the pseudo-color map and the information of its neighboring pixels, and uses the three-dimensional matrix ...
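Since the description is truncated here, the following is only a minimal sketch of how such per-pixel three-dimensional samples could be assembled; the channel layout (amplitudes of S plus the Pauli RGB values) and the 15x15 neighborhood window are assumptions for illustration, not values taken from the patent.

```python
import numpy as np

def build_pixel_samples(s_hh, s_hv, s_vv, rgb, win=15):
    """Form one three-dimensional matrix X (win x win x channels) per pixel.

    The channel layout (|S_HH|, |S_HV|, |S_VV| plus the three Pauli RGB values)
    and the window size are illustrative assumptions; the patent only states
    that S, the RGB values and neighboring-pixel information are combined.
    """
    amp = np.stack([np.abs(s_hh), np.abs(s_hv), np.abs(s_vv)], axis=-1)
    feat = np.concatenate([amp, rgb], axis=-1).astype(np.float32)  # H x W x 6

    pad = win // 2
    feat = np.pad(feat, ((pad, pad), (pad, pad), (0, 0)), mode='edge')

    h, w = s_hh.shape
    samples = np.empty((h * w, win, win, feat.shape[-1]), dtype=np.float32)
    for i in range(h):
        for j in range(w):
            # Neighborhood window centered on pixel (i, j) of the original image
            samples[i * w + j] = feat[i:i + win, j:j + win, :]
    return samples
```

In practice only the labeled pixels would be kept and then split into the training and test sample sets that Step 2 describes.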



Abstract

The present invention discloses a polarimetric SAR ground object classification method based on a self-paced learning convolutional neural network, mainly to solve the problems that the prior art has low accuracy when classifying complex ground object scenes and is heavily affected by noise. The implementation scheme comprises: 1, obtaining a polarization scattering matrix S and a pseudo-color RGB image under the Pauli basis from original fully polarimetric SAR data; 2, constructing a three-dimensional matrix for each pixel to form a sample set, and constructing a training sample set and a test sample set; 3, constructing a convolutional neural network and training it based on self-paced learning to accelerate network convergence and improve the generalization ability of the network; and 4, classifying the test samples with the trained convolutional neural network to obtain the final fully polarimetric SAR ground object classification result. The method improves the accuracy of classifying target ground objects in complex ground object scenes of polarimetric SAR images, and can be used for ground object classification and target recognition.
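The abstract does not disclose the exact self-paced weighting used in step 3. The sketch below therefore shows the classic hard-weighting form of self-paced learning applied to one training update of a CNN classifier, written in PyTorch purely for illustration; model, optimizer, x, y and the pace parameter lam are hypothetical placeholders, not names from the patent.

```python
import torch
import torch.nn.functional as F

def self_paced_step(model, optimizer, x, y, lam):
    """One self-paced training update: only samples whose current loss is
    below the pace parameter lam contribute (hard weighting, as in the
    classic self-paced learning formulation)."""
    model.train()
    logits = model(x)
    per_sample_loss = F.cross_entropy(logits, y, reduction='none')

    with torch.no_grad():
        v = (per_sample_loss < lam).float()   # 1 = "easy" sample kept this pass

    loss = (v * per_sample_loss).sum() / v.sum().clamp(min=1.0)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item(), int(v.sum().item())

# Hypothetical usage: start with a small pace parameter so only easy samples
# are used, then enlarge it each epoch so harder samples gradually join in.
# lam = 0.5
# for epoch in range(num_epochs):
#     loss, kept = self_paced_step(model, optimizer, x_batch, y_batch, lam)
#     lam *= 1.1
```

Starting with a small pace parameter means the network first fits only the low-loss ("easy") samples; raising it every epoch gradually admits harder samples, which is the mechanism by which self-paced learning is said to speed up convergence and improve generalization.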

Description

technical field

[0001] The invention belongs to the technical field of image processing, and further relates to a polarimetric SAR ground object classification method, which is applicable to ground object classification and target recognition.

Background technique

[0002] With the development of microwave remote sensing technology, high-resolution polarimetric synthetic aperture radar has become an inevitable trend in the development of the SAR field, and polarimetric SAR image classification, as one of the important means of polarimetric SAR image interpretation, has been widely used in many fields of national defense and civilian applications. Although high-resolution polarimetric SAR contains rich backscatter information, current classification algorithms use only shallow polarimetric features, which cannot fully represent the complex scene information contained in the image.

[0003] The classification of polarimetric SAR images involves many disciplines such as statistica...

Claims


Application Information

Patent Type & Authority: Applications (China)
IPC(8): G06K9/00, G06N3/04, G06N3/08
CPC: G06N3/08, G06V20/13, G06N3/045
Inventor: 缑水平, 陈文帅, 王秀秀, 张晓鹏, 刘波, 焦李成, 白静, 马文萍, 马晶晶
Owner: XIDIAN UNIV