
Remote sensing image scene classification method based on deep convolutional neural network and multi-kernel learning

A remote sensing image scene classification technology based on a convolutional neural network and multi-kernel learning, applied in character and pattern recognition, instruments, and computing. It addresses the problems of difficult classifier parameter selection, feature redundancy, and incomplete coverage of feature information, thereby simplifying the classification process and strengthening feature expressiveness and robustness.

Active Publication Date: 2018-11-06
HOHAI UNIV
Cites: 8 · Cited by: 28

AI Technical Summary

Problems solved by technology

[0008] (1) The feature extraction process is complicated: traditional classification methods must apply different feature extraction algorithms to obtain the various types of image features used for subsequent classification. The extraction process is complex, and the extracted features may be incomplete or redundant, resulting in low classification accuracy;
[0009] (2) Weak feature expressiveness: existing remote sensing image scene classification methods usually use only one or two types of features as classifier input. When the scene images are highly complex and span many categories, such features lack expressive power, which weakens classification performance;
[0010] (3) Classifier parameter selection is difficult: the parameter settings of commonly used image classifiers, such as SVM and KNN (K-Nearest Neighbor), greatly affect classification performance. Obtaining good results requires extensive manual tuning to find the optimal parameters, which limits the algorithm's generality.

Method used




Embodiment Construction

[0062] The technical solutions of the present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0063] As shown in Figure 1, the technical scheme of the present invention is described in further detail as follows:

[0064] (1) Train a deep convolutional neural network on the remote sensing scene images, and take the outputs of the two learned fully connected layers as the features of the remote sensing scene images. These features cover the low-level features of the remote sensing scene image, obtained through the front-end convolutional layers of the network; the mid-level features, obtained through the middle convolutional layers; and the high-level features, obtained through the back-end convolutional layers.
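The feature-extraction step above can be illustrated with a toy sketch. The network below is a minimal stand-in with random weights, not the actual trained deep CNN of the patent (whose convolutional stages are abstracted away as a flatten); the point is only that the activations of both fully connected layers are kept as the image's feature vectors.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

# Toy stand-in for a trained deep CNN: the convolutional stages are
# abstracted to a flatten, and two fully connected layers fc1 and fc2
# produce the activations used as the scene image's features.
W1 = rng.standard_normal((4096, 256)) * 0.01   # hypothetical fc1 weights
W2 = rng.standard_normal((256, 128)) * 0.01    # hypothetical fc2 weights

def extract_fc_features(image):
    """Return the outputs of both fully connected layers for one image."""
    x = image.reshape(-1)          # conv stages abstracted to a flatten
    fc1 = relu(x @ W1)             # first fully connected layer output
    fc2 = relu(fc1 @ W2)           # second fully connected layer output
    return fc1, fc2

image = rng.standard_normal((64, 64))          # hypothetical 64x64 scene patch
fc1_feat, fc2_feat = extract_fc_features(image)
print(fc1_feat.shape, fc2_feat.shape)          # (256,) (128,)
```

In the actual method these would be the trained fully connected layers of a deep network; here they only demonstrate that both layers' outputs, not just the last, are retained.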

[0065] (1.1) Construct remote...



Abstract

The invention discloses a remote sensing image scene classification method based on a deep convolutional neural network and multi-kernel learning. The method includes the steps of: training a deep convolutional neural network on remote sensing scene images and taking the outputs of the two learned fully connected layers as the image features; using multi-kernel learning to select kernel functions suited to the features of the two fully connected layers, so as to map the extracted features into a higher-dimensional space and realize adaptive fusion of the two feature sets in that space; and finally, designing a multi-kernel-learning support vector machine to classify the remote sensing scene images effectively. Because feature extraction is performed by the deep convolutional neural network, the learned deep features cover the image information completely and are highly discriminative; fusing these features within the multi-kernel learning framework then yields superior classification performance.
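The kernel fusion described above can be sketched as a convex combination of base kernels, one per fully connected layer. This is a minimal numpy illustration with fixed, illustrative kernel weights; in the actual method the weights would be learned jointly with the SVM by the multi-kernel learning algorithm, and the feature matrices come from the trained network rather than a random generator.

```python
import numpy as np

rng = np.random.default_rng(1)

def rbf_kernel(X, gamma):
    """Gram matrix of the Gaussian (RBF) kernel over the rows of X."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.clip(d2, 0.0, None))

n = 10
F1 = rng.standard_normal((n, 256))   # hypothetical fc1 features, one row per image
F2 = rng.standard_normal((n, 128))   # hypothetical fc2 features

K1 = rbf_kernel(F1, gamma=1.0 / 256)
K2 = rbf_kernel(F2, gamma=1.0 / 128)

# Multi-kernel fusion: convex combination of the two base kernels.
beta = np.array([0.6, 0.4])          # illustrative fixed weights (sum to 1);
K = beta[0] * K1 + beta[1] * K2      # MKL would learn these jointly with the SVM

print(np.allclose(K, K.T))           # fused kernel stays symmetric -> True
print(np.allclose(np.diag(K), 1.0))  # RBF diagonal is 1; weights sum to 1 -> True
```

The fused Gram matrix K could then be passed to any kernel SVM solver that accepts a precomputed kernel, which is the role the multi-kernel-learning SVM plays in the patented method.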

Description

Technical Field

[0001] The invention belongs to the field of image processing, and in particular to a remote sensing image scene classification method based on a deep convolutional neural network and multi-kernel learning.

Background

[0002] Remote sensing image scene classification is a research hotspot in the remote sensing field and can be applied in many military and civilian domains. With the continuous advancement of remote sensing technology, the spatial resolution of remote sensing images has steadily improved, making the details of ground objects more distinct and their spectral characteristics more complex. As a result, the classification accuracy of early methods that rely on spectral features, such as the maximum likelihood method, the minimum distance method, and K-means clustering, has declined.

[0003] In recent years, with the continuous development...

Claims


Application Information

IPC(8): G06K9/62
CPC: G06F18/213; G06F18/2411; G06F18/253
Inventors: Wang Xin (王鑫), Li Ke (李可), Lyu Guofang (吕国芳)
Owner: HOHAI UNIV