Hyper-parameter determination method for critical convolutional layer of remote-sensing classification convolution neural network

A convolutional neural network and classification technology, applied in the field of determining the key convolutional layer hyperparameters of a remote sensing classification convolutional neural network, intended to reduce parameter-tuning time and improve classification accuracy.

Inactive Publication Date: 2017-07-25
WUHAN UNIV OF TECH

AI Technical Summary

Problems solved by technology

However, when the convolutional neural network is applied to remote sensing image classification, there remains the problem of how to determine the hyperparameters (convolution kernel size and step size) for input remote sensing images of different resolutions.




Embodiment Construction

[0048] The present invention will be further described below in conjunction with specific examples and accompanying drawings.

[0049] The present invention aims to solve the problem of how to determine the convolutional neural network hyperparameters (convolution kernel size and step size) for input remote sensing images of different resolutions, and provides a method, based on the image input, for determining the hyperparameters of the key layer of an object-oriented remote sensing classification convolutional neural network. In a convolutional neural network, the hyperparameters of the first layer are particularly critical. The purpose of the convolution operation is to extract different features of the input; the first convolutional layer may only extract low-level features such as edges, lines and corners, but it determines whether the deeper layers of the network can iteratively extract more complex features from these low-level features; the more complex and reliable fea...
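The interaction between the input size, the kernel size and the step size of that first layer can be made concrete with the standard convolution output-size relation. The sketch below is only illustrative: the function name and the example numbers (a 64-pixel image object, a 7x7 kernel) are assumptions, and it does not reproduce the patented determination rule.

```python
def conv_output_size(input_size, kernel_size, stride, padding=0):
    """Standard relation between input size and convolutional feature-map size."""
    return (input_size + 2 * padding - kernel_size) // stride + 1

# Illustrative first-layer choices (assumed values, not the patented rule):
input_size = 64   # side length of the input image object, in pixels
kernel_size = 7   # preset convolution kernel size of the key (first) layer
for stride in (1, 2, 3):
    out = conv_output_size(input_size, kernel_size, stride)
    print(f"kernel={kernel_size}, stride={stride} -> feature map {out}x{out}")
```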



Abstract

The present invention provides a hyper-parameter determination method for the critical convolutional layer of a remote-sensing classification convolutional neural network. The method comprises the steps of constructing a convolutional neural network sample set; constructing a convolutional neural network structure; determining the hyper-parameters of the critical layer of the convolutional neural network; selecting one convolutional layer as the critical layer, presetting the convolution kernel size of the critical layer, and calculating the convolution scale; based on the convolution kernel of the critical layer and the convolution scale, calculating the convolution step length according to a preset rule; presetting the convolution kernel sizes of the other convolutional layers and setting their convolution step lengths to 1; and conducting mean-value down-sampling or maximum-value down-sampling as the subsequent down-sampling. According to the technical scheme of the invention, a convolution scale concept, adapted to the remote-sensing spatial scale, is proposed on the basis of the image input size and the convolution kernel size. On this basis, a method for jointly determining the hyper-parameters of the critical layer from the input size and the convolution scale is provided. In this way, the parameter-adjustment time required by the algorithm is reduced, and the object-oriented remote-sensing classification precision is improved.
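The abstract lists the steps but does not disclose the exact definition of the convolution scale or the preset rule for the step length. The following is a minimal sketch of that workflow under assumed placeholder formulas; the function names and the rule used here are hypothetical and serve only to show how the pieces fit together.

```python
# A minimal sketch of the workflow described in the abstract, assuming a
# hypothetical definition of the "convolution scale" and a placeholder
# preset rule for the step length (neither is disclosed in this excerpt).

def convolution_scale(input_size: int, kernel_size: int) -> float:
    # Assumed definition: ratio of kernel size to input image-object size,
    # used here only as a stand-in for the patented concept.
    return kernel_size / input_size

def critical_layer_stride(kernel_size: int, scale: float) -> int:
    # Placeholder "preset rule": step length grows with kernel size and scale.
    return max(1, round(kernel_size * scale))

def critical_layer_hyperparameters(input_size: int, kernel_size: int) -> dict:
    scale = convolution_scale(input_size, kernel_size)
    stride = critical_layer_stride(kernel_size, scale)
    return {"kernel_size": kernel_size, "stride": stride, "scale": scale}

# The other convolutional layers keep a preset kernel size and a step length
# of 1, followed by mean-value or maximum-value down-sampling (pooling).
print(critical_layer_hyperparameters(input_size=64, kernel_size=7))
```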

Description

Technical Field

[0001] The invention relates to the field of remote sensing classification, and in particular to a method for determining the key convolutional layer hyperparameters of a remote sensing classification convolutional neural network.

Background Technology

[0002] Geographic target recognition from remote sensing images is an important link in applying remote sensing technology to practical problems. Whether for thematic information extraction, dynamic change monitoring, thematic mapping, or remote sensing database construction, remote sensing image classification technology is indispensable. Object-oriented remote sensing technology aggregates many adjacent pixels into a spatial image object with certain semantics, integrating the spatial, texture and spectral information contained in remote sensing data; its core is image object construction and image object classification. A convolutional neural network is a deep learning algorithm; the algor...


Application Information

IPC(8): G06N3/04, G06N3/08
CPC: G06N3/08, G06N3/045
Inventors: Cui Wei, Zheng Zhendong, Zhou Qi
Owner: WUHAN UNIV OF TECH