Dynamic neural network model training method based on ensemble learning and dynamic neural network model training device thereof

A dynamic neural network and model training technology, applied to biological neural network models, neural learning methods, neural architectures, etc. It addresses the problems that static neurons cannot effectively describe the relationships among the system inputs, that the classification performance of the neural network cannot be guaranteed, and that the design of convolution and pooling functions is complicated and difficult, achieving the effects of saving training time, reducing design difficulty, and improving accuracy.

Inactive Publication Date: 2017-12-15
SHANDONG NORMAL UNIV

AI Technical Summary

Problems solved by technology

In this model, the depth of the model and the design of the convolution function and the pooling function are extremely complicated and difficult problems. In addition, current neural networks and their many variants all model neurons as static neurons: a static neuron can only describe the relationship between the input and the output, but cannot effectively describe the relationships among the system inputs, and therefore cannot guarantee the classification performance of the neural network.



Examples


Embodiment 1

[0056] Based on the above dynamic neurons, this embodiment provides a training method for a dynamic neural network model based on ensemble learning, comprising the following steps:

[0057] Step 1: The original data are used as the input of the first-layer neurons of the i-th sub-model; after processing by the dynamic neurons, the corresponding output values are the features of that layer, i = 1, 2, ..., k;

[0058] Optionally, the original data can also be divided into a training set and a validation set; the training set is used to train the neural network model, and the validation set is used for subsequent model performance evaluation.

[0059] Step 2: Increase the number of neuron layers: the output features of the previous layer are used as the input of the next layer of neurons to obtain the features of that layer; repeat this step until the number of layers reaches a preset value r;

[0060] The same layer in the same neural network h...
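The layer-stacking procedure of Steps 1 and 2 can be sketched as follows. The patent excerpt does not disclose the exact dynamic-neuron update, so the state-carrying recurrence, the `tanh` activation, and all dimensions below are illustrative assumptions, not the claimed method:

```python
import numpy as np

rng = np.random.default_rng(0)

def dynamic_layer(x, w, b, state, alpha=0.5):
    """One hypothetical 'dynamic neuron' layer: the output depends on the
    current input and on an internal state (assumed form; the patent's
    dynamics are not given in this excerpt)."""
    out = np.tanh(x @ w + b + alpha * state)  # input drive plus state term
    return out, out                           # new state = current output

r = 3                            # preset number of layers (the patent's r)
dims = [8, 16, 16, 16]           # input dim followed by r layer widths (assumed)

x = rng.normal(size=(4, dims[0]))            # a small batch of "original data"
features = x
for layer in range(r):                       # Step 2: stack layers up to depth r
    w = rng.normal(scale=0.1, size=(dims[layer], dims[layer + 1]))
    b = np.zeros(dims[layer + 1])
    state = np.zeros((4, dims[layer + 1]))   # per-layer state, zero-initialized
    features, state = dynamic_layer(features, w, b, state)

print(features.shape)                        # features of the final (r-th) layer
```

Each sub-model i = 1, ..., k would run this same loop independently, which is what allows the k shallow sub-models to be trained in parallel.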

Embodiment 2

[0076] Based on the method of Embodiment 1, the present invention also provides a computer device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, the computer program being used to train the ensemble dynamic neural network model; the processor performs the following steps when executing the program:

[0077] Step 1: For each sub-model in the ensemble dynamic neural network model, the original data are used as the input of the first-layer neurons, and the output features of that layer are obtained through processing by the dynamic neurons;

[0078] Step 2: Increase the number of neuron layers: the output features of the previous layer are used as the input of the next layer of neurons to obtain the features of that layer; repeat this step until the number of layers reaches a preset value;

[0079] Step 3: Establish a fully connected layer between the output feature of the last layer...
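Step 3's fully connected mapping from last-layer features to categories might look like the following sketch. The feature dimension, class count, and softmax readout are assumptions made for illustration; the patent text is truncated at this point:

```python
import numpy as np

rng = np.random.default_rng(1)

num_classes = 5
features = rng.normal(size=(4, 16))     # stand-in for last-layer output features

# Step 3 (as described): a fully connected layer between the output
# features of the last layer and the categories to which they belong.
W = rng.normal(scale=0.1, size=(16, num_classes))  # fully connected weights
logits = features @ W

def softmax(z):
    # numerically stable row-wise softmax over the class dimension
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

probs = softmax(logits)                 # per-sample class probabilities
pred = probs.argmax(axis=1)             # predicted category per sample
print(probs.shape, pred.shape)
```

Under this reading, "calculating the fully connected weight between the output feature and the category" amounts to learning `W`, e.g. by gradient descent on a classification loss.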

Embodiment 3

[0091] A computer-readable storage medium, on which a computer program for training the ensemble dynamic neural network model is stored; the following steps are performed when the program is executed by a processor:

[0092] Step 1: For each sub-model in the ensemble dynamic neural network model, the original data are used as the input of the first-layer neurons, and the output features of that layer are obtained through processing by the dynamic neurons;

[0093] Step 2: Increase the number of neuron layers: the output features of the previous layer are used as the input of the next layer of neurons to obtain the features of that layer; repeat this step until the number of layers reaches a preset value;

[0094] Step 3: Establish a fully connected layer between the output feature of the last layer and the category to which it ...



Abstract

The invention discloses a training method and device for a dynamic neural network model based on ensemble learning. The method comprises: for each sub-model in the ensemble dynamic neural network model, using the original data as the input of the first layer of neurons and obtaining the output features of that layer through the processing of dynamic neurons; increasing the number of neuron layers, using the output features of the previous layer as the input of the next layer of neurons to obtain the features of the corresponding layer, and repeating this step until the number of layers reaches a preset value; establishing a fully connected layer between the output features of the last layer and the categories to which they belong, and calculating the fully connected weights between the output features and the categories; establishing a fully connected layer between each sub-model and the categories, and determining the weight of each sub-model within the ensemble dynamic neural network model. The present invention converts a deep neural network into multiple relatively shallow neural networks for parallel processing, saving training time and improving training efficiency.
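The final ensemble step, combining the k shallow sub-models by per-model weights, can be sketched as follows. The abstract does not state how the weights are determined, so weighting by normalized validation accuracy is an assumption for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
k, num_classes, n = 3, 5, 4   # sub-models, categories, samples (assumed sizes)

# Stand-in per-sub-model class probabilities, shape (k, n, num_classes);
# in the patent these would come from each sub-model's fully connected layer.
sub_probs = rng.dirichlet(np.ones(num_classes), size=(k, n))

# Assumed weighting scheme: normalize each sub-model's validation accuracy
# so the ensemble weights sum to 1 (the patent does not give the formula).
val_acc = np.array([0.90, 0.85, 0.95])
weights = val_acc / val_acc.sum()

# Weighted average of the sub-model predictions: (k,) x (k, n, c) -> (n, c)
ensemble = np.einsum('i,ijc->jc', weights, sub_probs)
pred = ensemble.argmax(axis=1)   # final category per sample
print(ensemble.shape)
```

Because each sub-model is shallow and independent, the k training runs can proceed in parallel, which is the source of the claimed training-time saving.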

Description

Technical Field

[0001] The invention relates to the fields of artificial intelligence and big data, and in particular to a model training method and device for object classification.

Background Technique

[0002] Artificial intelligence has become a research hotspot in today's society, with new research results regularly capturing public attention. Among its topics, the correct classification and identification of objects has become an important research direction. At present, object recognition has made considerable progress, largely thanks to artificial neural networks and their many variants, such as convolutional neural networks and recurrent neural networks. A neural network trains its internal structure on a large amount of data to achieve strong expressive power. However, the increase in the amount of data further increases the difficulty of model training. In order to sol...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06N3/04; G06N3/08
CPC: G06N3/08; G06N3/04
Inventor: 王强, 张化祥, 孟庆田, 马学强, 任玉伟
Owner: SHANDONG NORMAL UNIV