Dynamic neural network model training method and device

A dynamic neural network and neural network model technology, applied in the field of dynamic neural network model training, which addresses problems such as the inability of static neurons to describe relationships within a system, and achieves the effects of reducing training difficulty, improving algorithm efficiency, and reducing design difficulty.

Inactive Publication Date: 2017-09-15
SHANDONG NORMAL UNIV


Problems solved by technology

Static neurons can only describe the relationship between input and output; they cannot describe the dynamic relationships within the system.



Examples


Embodiment 2

[0058] Based on the method of Embodiment 1, the present invention also provides a dynamic neural network model training device, which includes a model building module, a model evaluation module, and a model optimization module, wherein:

[0059] The model building module initializes the neural network model. The original one-dimensional data is input to the first layer of neurons, and the corresponding output values are that layer's features; the number of neuron layers is then increased, with the features output by the previous layer used as the input to the next layer of neurons to obtain that layer's features; this step is repeated until the number of layers reaches a preset value;
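The layer-wise construction in [0059] can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name `build_layerwise_features`, the tanh activation, and the random weight initialization are all assumptions; the patent trains the weights later via backpropagation and models the neurons as dynamic neurons.

```python
import numpy as np

def build_layerwise_features(x, layer_sizes, rng=None):
    """Greedy layer-wise construction sketch: the original 1-D input
    feeds the first layer, each layer's output features feed the next
    layer, and layers are added until the preset count is reached.
    Weights here are randomly initialized for illustration only."""
    rng = np.random.default_rng(0) if rng is None else rng
    features = x                      # layer 1's input: original 1-D data
    layers = []
    for size in layer_sizes:          # repeat until preset layer count reached
        w = rng.standard_normal((size, features.shape[0])) * 0.1
        b = np.zeros(size)
        features = np.tanh(w @ features + b)   # this layer's features
        layers.append((w, b))
    return layers, features

# Example: a 100-dimensional input reduced through two layers (50, then 20)
layers, feats = build_layerwise_features(np.ones(100), [50, 20])
```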

[0060] In the neural network model, neurons in the same layer have the same dynamic structure, while the dynamic structures of neurons in different layers may be the same or different; there are no connections between neurons in the same layer, and the connection mo...

Embodiment 3

[0073] Taking speech data as an example: speech data is time-series data whose initial form is a very long time-dependent vector, and this speech data is used as the training data to train the neural network model.

[0074] Input the speech data into the initialized neural network model and shorten the vector length layer by layer to a set value (for example, length k to 50), then use the BP backpropagation algorithm to determine the weight coefficients of the fully connected layer. Test the performance on the validation set; then add a layer of neurons to obtain new features, use the BP backpropagation algorithm to train the corresponding weight parameters, and test whether performance improves significantly on the validation set. If so, continue adding neural network layers; otherwise, stop increasing the number of model layers. If the number of layers increases to a certain threshold an...
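The grow-until-no-improvement loop in [0074] can be sketched as below. This is a hedged outline, not the patent's code: `train_layer` (backpropagation training of the current network) and `evaluate` (validation-set score) are hypothetical callbacks, and `min_gain` is an assumed threshold for "significant improvement".

```python
def grow_network(train_layer, evaluate, max_layers, min_gain=1e-3):
    """Sketch of the layer-growing loop: add a neuron layer, retrain
    with BP (abstracted as train_layer), and keep growing while the
    validation score improves by at least min_gain, up to max_layers."""
    n_layers = 1
    train_layer(n_layers)              # train the initial network via BP
    best = evaluate(n_layers)          # baseline validation performance
    while n_layers < max_layers:
        train_layer(n_layers + 1)      # train weights of the new layer via BP
        score = evaluate(n_layers + 1) # test on the validation set
        if score - best < min_gain:    # no significant improvement: stop
            break
        best, n_layers = score, n_layers + 1
    return n_layers, best

# Usage with stand-in callbacks: validation score plateaus at 3 layers
scores = {1: 0.60, 2: 0.70, 3: 0.75, 4: 0.751}
n, best = grow_network(lambda n: None, lambda n: scores[n],
                       max_layers=4, min_gain=0.01)
```

The design choice mirrored here is early stopping on model *depth* rather than on training epochs: capacity is added greedily and growth halts as soon as the validation set stops rewarding it.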



Abstract

The invention discloses a training method and device for a dynamic neural network model. The training method includes: inputting original one-dimensional data into the first layer of neurons, whose output values are that layer's features; then increasing the number of neuron layers, with the features output by the previous layer used as the input to the next layer of neurons to obtain that layer's features, and repeating this step until the number of layers reaches a preset value; and establishing a fully connected layer between the final output features and the classification categories, with the connection coefficients of the fully connected layer determined by the BP backpropagation algorithm. The neurons are modeled as dynamic neurons. The performance of the model is then evaluated: if the performance meets expectations, training ends; otherwise, new neuron layers are added to the generated network model until the model performance meets expectations. The method of the invention can extract data features more efficiently, thereby improving training efficiency.

Description

technical field

[0001] The invention relates to the field of big data analysis and processing, and in particular to a training method and device for a dynamic neural network model used for one-dimensional data classification.

Background technique

[0002] With the advent of the big data era, large amounts of data such as text, speech, images, and video are generated every day; even a small industry or a single enterprise produces a large volume of data daily. Analyzing and processing these data to uncover the internal connections and laws of things has great practical significance. However, because the volume of generated data is so large, analysis and processing are quite difficult, posing a significant challenge. In big data processing, the current main research approach is to extract the relevant features of the research object through machine learning and deep learning, and to establish a number of rules for ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/08
CPC: G06N3/084
Inventors: 王强, 张化祥, 房晓南, 王振华, 郭培莲
Owner: SHANDONG NORMAL UNIV