Ground-based meteorological cloud image classification method based on cross-validation deep CNN feature integration

A cross-validation and classification technology, applied in the field of ground-based meteorological cloud image classification based on cross-validation deep CNN feature integration. It addresses the complex computation of the cloud recognition process and the non-robustness of cloud classification results based on single-CNN features, achieving efficient, robust, adaptive automatic cloud-type classification while avoiding image preprocessing and reducing computational complexity.

Inactive Publication Date: 2020-06-19
SHANXI UNIV

AI Technical Summary

Problems solved by technology

[0005] Aiming at the problems of complex computation in the cloud-type recognition process and the non-robustness of the cloud classification result of a sing...



Examples


Embodiment 1

[0038] The ground-based meteorological cloud image classification method based on cross-validation deep CNN feature integration in this embodiment first uses a single convolutional neural network model to extract the deep CNN features of ground-based meteorological cloud images, then performs multiple resampling of the CNN features based on cross-validation, and finally identifies the cloud type of the ground-based cloud image through a voting strategy over the multiple cross-validation resampling results. Before introducing the specific scheme, some basic concepts and operations are first introduced.
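Before turning to those basic concepts, the workflow can be previewed with a minimal sketch. The code below is illustrative only: it assumes the deep CNN features have already been extracted as a feature matrix with non-negative integer class labels, and it uses scikit-learn's KFold and LogisticRegression with a simple majority vote; none of the function or parameter names come from the patent itself.

```python
import numpy as np
from sklearn.model_selection import KFold
from sklearn.linear_model import LogisticRegression

def cross_validated_vote(train_feats, train_labels, test_feats,
                         n_splits=5, n_rounds=3, seed=0):
    """Illustrative sketch: resample pre-extracted CNN features with repeated
    K-fold cross-validation, fit a simple classifier on each resample, and
    combine the test predictions by majority vote."""
    rng = np.random.RandomState(seed)
    preds = []
    for _ in range(n_rounds):                        # multiple resampling rounds
        kf = KFold(n_splits=n_splits, shuffle=True,
                   random_state=rng.randint(1 << 30))
        for train_idx, _ in kf.split(train_feats):   # each fold gives one resample
            clf = LogisticRegression(max_iter=1000)
            clf.fit(train_feats[train_idx], train_labels[train_idx])
            preds.append(clf.predict(test_feats))
    preds = np.stack(preds)                          # shape (n_models, n_test)
    # majority vote across all cross-validation resamples
    # (labels assumed to be non-negative integers)
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, preds)
```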

[0039] Data set: a data set containing n ground-based meteorological cloud images is denoted as D_n, with D_n = {z_i, i = 1, ..., n}, where z_i is the i-th cloud image in the data set D_n;

[0040] Index set: the set composed of the subscripts i of the images z_i in the data set D_n is denoted as I = {1, 2, ..., n};

[0041] Convolution: Through a convolution kernel, m...
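Since the definition of convolution is cut off here, the following sketch only illustrates the generic operation hinted at above: sliding a kernel over an image and summing element-wise products at each position. The kernel values, patch size, and 'valid' padding are assumptions for illustration, not the patent's specification.

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Illustrative 2-D convolution (cross-correlation, as in CNN practice)
    with 'valid' padding: slide the kernel over the image and sum the
    element-wise products at every position."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
    return out

# hypothetical example: a 3x3 edge-like kernel applied to an 8x8 grayscale patch
patch = np.random.rand(8, 8)
kernel = np.array([[-1, -1, -1], [-1, 8, -1], [-1, -1, -1]], dtype=float)
response = conv2d_valid(patch, kernel)   # shape (6, 6)
```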



Abstract

The invention belongs to the technical field of ground-based meteorological cloud image classification, and particularly relates to a ground-based meteorological cloud image classification method based on cross-validation deep CNN feature integration. In the method, a convolutional neural network model is first used to extract deep CNN features of a ground-based meteorological cloud image, the CNN features are then resampled multiple times based on cross-validation, and finally the cloud type of the ground-based cloud image is identified through a voting strategy over the multiple cross-validation resampling results. The method classifies ground-based meteorological cloud images automatically and realizes an adaptive, end-to-end automatic cloud recognition algorithm that works directly on the original cloud images without any image preprocessing. The proposed algorithm relates to the fields of computer vision, machine learning, and image recognition. It overcomes both the non-robustness of cloud classification results based on a single set of CNN features and the high computational overhead of integrating multiple deep convolutional neural networks, while maintaining high classification accuracy and stability to noise.

Description

Technical field

[0001] The invention belongs to the technical field of classification of ground-based meteorological cloud images, and in particular relates to a method for classifying ground-based meteorological cloud images based on cross-validation deep CNN feature integration.

Background technology

[0002] The deep convolutional neural network is one of the representative algorithms of deep learning; it is a probability model based on statistical learning. Although specific implementations vary in their convolutional, pooling, and fully connected layers, in a broad sense the deep convolutional network always falls within the category of neural networks. In fact, the core of the deep convolutional neural network is feature learning: it adaptively learns a large amount of feature information through convolution and pooling operations, overcoming the drawback of manually designed features in traditional mac...
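As a concrete illustration of the feature learning described above, the sketch below extracts a deep feature vector from a cloud image using a pretrained torchvision ResNet-18 whose classification head is removed. The choice of ResNet-18, the 224x224 input size, and the torchvision >= 0.13 weights API are assumptions for illustration; the description does not name a specific architecture at this point.

```python
import torch
from torchvision import models, transforms
from PIL import Image

# Illustrative feature extractor (assumption: ResNet-18, torchvision >= 0.13).
# Replacing the final fully connected layer with Identity makes the network
# output its 512-dimensional deep feature vector instead of class scores.
backbone = models.resnet18(weights="IMAGENET1K_V1")
backbone.fc = torch.nn.Identity()
backbone.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

def extract_cnn_feature(image_path):
    """Return the deep CNN feature vector of one ground-based cloud image."""
    img = Image.open(image_path).convert("RGB")
    with torch.no_grad():
        feat = backbone(preprocess(img).unsqueeze(0))   # shape (1, 512)
    return feat.squeeze(0).numpy()
```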

Claims


Application Information

IPC(8): G06K9/62
CPC: G06F18/2414; G06F18/259; G06F18/25; G06F18/253
Inventor: 王钰, 章豪东, 杨杏丽, 李济洪
Owner: SHANXI UNIV