
Neural network model input parameter dimension reduction method and computer readable storage medium

A neural network model input-variable dimensionality reduction technology, applied in the field of neural networks. It solves the problems of long modeling time and low model accuracy, and achieves the effects of improving accuracy and efficiency, reducing convergence time, and reducing the probability of overfitting.

Pending Publication Date: 2020-12-25
XIAMEN UNIV


Problems solved by technology

However, when the model has many input parameters and those parameters are not mutually independent, the neural network is prone to overfitting, which leads to problems such as low accuracy of the resulting model and long modeling time.



Examples


Embodiment 1

[0056] Please refer to figure 1. The first embodiment of the present invention is a neural network model input-variable dimensionality reduction method, which can be applied to reducing the dimensionality of the input variables of a binary neural network prediction model. As shown in figure 1, the method includes the following steps:

[0057]S1: Obtain sample data, where the sample data includes positive sample data and negative sample data, and each sample data is composed of multiple variable data.

[0058]For example, if the sample data is red tide monitoring data, the monitoring data during the red tide occurrence period is marked as red tide data, that is, positive sample data, and other monitoring data is marked as non-red tide data, that is, negative sample data. The red tide monitoring data includes 25 variables such as water temperature, salinity, and chlorophyll.

[0059]Further, in order to reduce the interference caused by the magnitude difference between different variables, the sam...
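The paragraph above is cut off before the normalization details, but the stated goal (reducing interference caused by magnitude differences between variables) is typically met with per-variable scaling. A minimal sketch in Python/NumPy, assuming simple min-max normalization; the patent's exact scheme is not visible in this excerpt:

```python
import numpy as np

def minmax_normalize(samples: np.ndarray) -> np.ndarray:
    """Scale each variable (column) into [0, 1] so that magnitude
    differences between variables do not dominate training."""
    col_min = samples.min(axis=0)
    col_max = samples.max(axis=0)
    # Guard against constant columns to avoid division by zero.
    span = np.where(col_max > col_min, col_max - col_min, 1.0)
    return (samples - col_min) / span

# Toy data: 3 samples x 2 variables with very different magnitudes,
# e.g. water temperature (tens of degrees) vs. chlorophyll (tiny values).
data = np.array([[10.0, 0.001],
                 [20.0, 0.002],
                 [30.0, 0.003]])
print(minmax_normalize(data))
```

After scaling, both columns span [0, 1], so neither variable dominates purely because of its unit of measurement.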

Embodiment 2

[0108] Please refer to figures 2-3. This embodiment is a specific application scenario of the first embodiment, described by taking the dimensionality reduction of the input parameters of a short-term red tide forecast model based on a binary BP network as an example.

[0109] 1. Collect red tide monitoring data from a certain location as sample data, 20,752 groups in total. Mark the monitoring data collected during red tide occurrences as red tide data (positive sample data), 3,425 groups in total, and mark the other monitoring data as non-red tide data (negative sample data), 17,327 groups in total. The data contains 25 variables such as water temperature, salinity, and chlorophyll.

[0110] 2. Set the code length of the genetic algorithm to 25, with the 25 bits of the chromosome corresponding one-to-one to the 25 variables. The value of each gene can only be 1 or 0. If the value of a certain bit of the chromosome is 1, it means that the variable corresponding to this position participates i...
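The chromosome encoding described in this step can be sketched as follows. Only the 25-bit encoding and the "1 = variable participates in modeling" convention come from the text; the population size is an assumption for illustration, as the excerpt does not state it:

```python
import numpy as np

rng = np.random.default_rng(0)
N_VARS = 25    # one gene per monitoring variable (from the patent)
POP_SIZE = 20  # illustrative population size (not given in the excerpt)

# Each chromosome is a 25-bit string; bit i == 1 means variable i
# participates in modeling, bit i == 0 means it is excluded.
population = rng.integers(0, 2, size=(POP_SIZE, N_VARS))

def selected_columns(chromosome: np.ndarray) -> np.ndarray:
    """Indices of the variables this chromosome keeps as model inputs."""
    return np.flatnonzero(chromosome == 1)

chrom = population[0]
print(chrom, selected_columns(chrom))
```

In use, `selected_columns(chrom)` would index into the sample-data matrix to build the reduced input set for one candidate model.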

Embodiment 3

[0143]This embodiment is a computer-readable storage medium corresponding to the above-mentioned embodiment, on which a computer program is stored, and when the program is executed by a processor, the following steps are implemented:

[0144]Acquiring sample data, where the sample data includes positive sample data and negative sample data, and each sample data is composed of multiple variable data;

[0145]Divide the sample data into training data and test data according to a preset ratio;

[0146] Randomly generate a preset number of initial string structure data to obtain an initial population, where each bit in the initial string structure data corresponds one-to-one to each variable in the sample data, and the value of each bit is the first character or the second character;

[0147]Calculate the Heidke skill score corresponding to each string structure data in the latest population, and use the Heidke skill score corresponding to each string structure data as the fitness of each string structure...
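The Heidke skill score used as the fitness can be computed from the 2x2 contingency table of a binary forecast. A sketch using the standard HSS formula (the excerpt names the score but does not show its formula):

```python
def heidke_skill_score(hits: int, false_alarms: int,
                       misses: int, correct_negatives: int) -> float:
    """Heidke skill score for a binary (yes/no) forecast, computed from
    the 2x2 contingency table. HSS = 1 for a perfect forecast, 0 for a
    forecast no better than random chance."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    numerator = 2.0 * (a * d - b * c)
    denominator = (a + c) * (c + d) + (a + b) * (b + d)
    return numerator / denominator

# Perfect forecast: every red tide hit, every non-event correctly rejected.
print(heidke_skill_score(10, 0, 0, 90))  # -> 1.0
```

Using HSS as the fitness (rather than raw accuracy) matters for data like the red tide set above, where negatives far outnumber positives: a model that always predicts "no red tide" gets high accuracy but an HSS near zero.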



Abstract

The invention discloses a neural network model input parameter dimension reduction method and a computer readable storage medium. The method comprises the steps of: obtaining sample data; dividing the sample data into training data and test data according to a preset proportion; randomly generating a preset number of initial string structure data to obtain an initial population, each bit in the initial string structure data being in one-to-one correspondence with each variable in the sample data, and the value of each bit being a first character or a second character; respectively calculating a Heidke skill score corresponding to each string structure data in the latest population, and taking the Heidke skill score as the fitness of each string structure data; if string structure data whose fitness is greater than or equal to a preset target value exists, taking the variables corresponding to the bits whose value is the first character in that string structure data as the final modeling variables; and if not, generating new string structure data according to a genetic algorithm to obtain a new population, and continuing to calculate the fitness. The invention can improve the precision and efficiency of the neural network model.

Description

Technical field

[0001] The invention relates to the technical field of neural networks, in particular to a method for reducing the dimensionality of the input parameters of a neural network model, and to a computer-readable storage medium.

Background technique

[0002] When statistical analysis methods are used to study multivariate problems, too many variables increase the complexity of the analysis, so it is natural to want fewer variables that still retain as much information as possible. In many cases the variables are correlated with one another; when two variables are correlated, the information they carry about the subject overlaps to some degree. For example, building a red tide prediction model may require collecting meteorological data, hydrological data, water quality data, nutrient salt data, and tide data, comprising dozens of parameters, some of which are correlated with each other. There is a certain repetition in the respon...
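The overlap between correlated variables described above can be illustrated numerically. A small sketch with synthetic data; the variable names and the coupling between them are hypothetical, chosen only to mimic correlated monitoring parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical monitoring variables: the second is strongly coupled to
# the first, so it carries largely overlapping information; the third
# is generated independently.
temperature = rng.normal(20.0, 2.0, size=200)
coupled_var = 0.8 * temperature + rng.normal(0.0, 0.5, size=200)
salinity = rng.normal(30.0, 1.0, size=200)

# Rows of the correlation matrix correspond to the three variables.
corr = np.corrcoef([temperature, coupled_var, salinity])
print(np.round(corr, 2))
```

A high off-diagonal entry (here between the first two variables) signals redundant inputs: dropping one of them loses little information, which is exactly the situation the dimensionality reduction method targets.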


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/04; G06N3/08
CPC: G06N3/04; G06N3/086
Inventors: 张彩云丁文祥李雪丁张友权李星郑祥靖郭民权丁萍陈金瑞朱本璐任在常
Owner: XIAMEN UNIV