
Self-increasing and decreasing method of neural network nodes, computer equipment and storage medium

A technology relating to neural networks and computer equipment, applied in the field of neural networks. It addresses the problems that too many hidden nodes increase the complexity of the network structure and slow learning, while too few leave the network without adequate learning and information-processing ability, achieving the effect of reducing useless computation and improving learning ability.

Active Publication Date: 2021-08-06
XIAMEN KUAISHANGTONG INFORMATION TECH CO LTD

AI Technical Summary

Problems solved by technology

Experiments show that when the number of hidden-layer nodes in a neural network is too small, the network cannot attain the necessary learning and information-processing ability.
Conversely, if the number of hidden-layer nodes is too large, it not only greatly increases the complexity of the network structure (which is especially significant for hardware-implemented networks), but also makes the network more likely to fall into local minima during learning and slows learning considerably.



Examples


Embodiment 1

[0039] An embodiment of the present invention provides a method for self-increasing and decreasing neural network nodes, which is applied to neural network training. The method includes the following steps, as shown in Figure 1:

[0040] Step S11: design the neural network structure according to requirements and train the neural network model with data until the model converges or the number of training iterations exceeds a set threshold;

[0041] Step S12: add and subtract neurons layer by layer;

[0042] Denote the current neural network model as M0 and judge whether the current layer contains any neurons that can be subtracted; if it does, go to step S13, otherwise go to step S15;

[0043] Here, the current layer is the i-th layer and the current node is the j-th node of the i-th layer, denoted neuron Xij. Judging ...
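
To make the flow of steps S11 to S15 concrete, the following sketch walks the hidden layers of a small fully connected network and tries to subtract neurons one at a time. It is only an illustration under assumptions: the excerpt truncates before the judging rule, so the acceptance criterion used here (a removal is kept when validation loss does not worsen by more than a tolerance), the helper names (forward, val_loss, remove_neuron, shrink_layerwise) and the NumPy setting are hypothetical rather than the patent's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)


def forward(weights, biases, x):
    """Forward pass through a fully connected ReLU network (linear output layer)."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = np.maximum(0.0, h @ W + b)
    return h @ weights[-1] + biases[-1]


def val_loss(weights, biases, x_val, y_val):
    """Mean squared error on held-out data; a stand-in for the patent's (unshown) criterion."""
    pred = forward(weights, biases, x_val)
    return float(np.mean((pred - y_val) ** 2))


def remove_neuron(weights, biases, i, j):
    """Drop neuron j of hidden layer i: delete its incoming column in W_i and its bias entry,
    and its outgoing row in W_{i+1}."""
    new_w, new_b = list(weights), list(biases)
    new_w[i] = np.delete(weights[i], j, axis=1)
    new_b[i] = np.delete(biases[i], j)
    new_w[i + 1] = np.delete(weights[i + 1], j, axis=0)
    return new_w, new_b


def shrink_layerwise(weights, biases, x_val, y_val, tolerance=1e-3):
    """Rough analogue of step S12: for each hidden layer, try to subtract neurons;
    keep a removal (step S13) only if validation loss stays within `tolerance`,
    otherwise move on (step S15)."""
    base = val_loss(weights, biases, x_val, y_val)
    for i in range(len(weights) - 1):             # hidden layers only
        j = 0
        while j < weights[i].shape[1] and weights[i].shape[1] > 1:
            cand_w, cand_b = remove_neuron(weights, biases, i, j)
            if val_loss(cand_w, cand_b, x_val, y_val) <= base + tolerance:
                weights, biases = cand_w, cand_b  # accept the smaller model
            else:
                j += 1                            # keep neuron Xij, try the next one
    return weights, biases


# Toy usage; step S11 (training until convergence or an iteration cap) is assumed done.
sizes = [4, 8, 8, 1]
weights = [0.1 * rng.standard_normal((m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]
x_val, y_val = rng.standard_normal((32, 4)), rng.standard_normal((32, 1))
weights, biases = shrink_layerwise(weights, biases, x_val, y_val)
print([W.shape for W in weights])
```

Note that removing one neuron touches two weight matrices, that of the current layer and that of the next layer; the complementary "self-increasing" operation is sketched after Embodiment 2.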

Embodiment 2

[0053] An embodiment of the present invention provides a method for self-increasing and decreasing neural network nodes, which is applied to neural network training. The method includes the following steps, as shown in Figure 2:

[0054] Step S21: design the neural network structure according to requirements and train the neural network model with data until the model converges or the number of training iterations exceeds a set threshold;

[0055] Step S22: add and subtract neurons layer by layer;

[0056] Denote the current neural network model as M0 and judge whether the current layer contains any neurons that can be subtracted; if it does, go to step S23, otherwise go to step S25;

[0057] The current layer is the i-th layer, and the next layer is the (i+1)-th layer, where j≠j';

[0058] Get the jth neuron Xij of the i-th layer an...
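
Embodiment 2 is truncated here, so only the shape bookkeeping can be illustrated. As a counterpart to the subtraction sketch above, the snippet below shows the "self-increasing" half claimed in the abstract: appending one neuron to hidden layer i adds a column to the i-th layer's weight matrix (the new neuron's incoming connections) and a row to the (i+1)-th layer's matrix (its outgoing connections). The function name add_neuron and the small-random initialisation are assumptions for illustration, not the patent's rule.

```python
import numpy as np

rng = np.random.default_rng(0)


def add_neuron(weights, biases, i, scale=0.01):
    """Append one neuron to hidden layer i: a new column in W_i plus a bias entry for its
    incoming connections, and a new row in W_{i+1} for its outgoing connections.
    Small random initialisation is an assumed choice."""
    n_in = weights[i].shape[0]
    n_out_next = weights[i + 1].shape[1]
    new_w, new_b = list(weights), list(biases)
    new_w[i] = np.hstack([weights[i], scale * rng.standard_normal((n_in, 1))])
    new_b[i] = np.append(biases[i], 0.0)
    new_w[i + 1] = np.vstack([weights[i + 1], scale * rng.standard_normal((1, n_out_next))])
    return new_w, new_b


# Usage: grow the hidden layer of a 4-8-1 network to 9 neurons, touching layers i and i+1.
sizes = [4, 8, 1]
weights = [0.1 * rng.standard_normal((m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]
weights, biases = add_neuron(weights, biases, i=0)
print(weights[0].shape, weights[1].shape)   # -> (4, 9) (9, 1)
```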

Embodiment 3

[0069] An embodiment of the present invention also provides a computer device comprising at least one processor and a memory communicatively connected to the at least one processor. The memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor so that the at least one processor carries out the neural network node self-increasing and decreasing method described in Embodiment 1 or Embodiment 2, which is not repeated here.



Abstract

The invention discloses a neural network node self-increasing and decreasing method, computer equipment and a storage medium. The method designs a neural network structure according to requirements and trains the neural network model with data until the model converges or the number of training iterations exceeds a set threshold, and then increases or decreases neurons layer by layer. By automatically increasing or decreasing neural network nodes, the invention improves the learning ability of the neurons on the one hand and reduces the amount of useless computation of the neurons on the other.

Description

Technical field

[0001] The invention relates to the field of neural networks, and in particular to a method for self-increasing and decreasing neural network nodes, computer equipment and a storage medium.

Background technique

[0002] Neural Networks (NNs), also called connection models, are algorithmic mathematical models that imitate the behavioral characteristics of animal neural networks and perform distributed parallel information processing. Such a network relies on the complexity of the system and processes information by adjusting the interconnections among a large number of internal nodes.

[0003] A neural network is composed of many neurons; each neuron is also called a unit or node, and the neurons are connected together to form a network. The neurons usually form multiple layers of three types: an input layer, a hidden layer and an output layer, where the first layer is the input layer, and the first layer can ...
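
As a minimal illustration of the three layer types described in paragraph [0003], the sketch below pushes one sample through an input layer, a single hidden layer and an output layer; the layer sizes and the ReLU activation are arbitrary choices for the example, not taken from the patent.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden, n_out = 3, 5, 2                     # input, hidden and output layer widths

W1 = 0.1 * rng.standard_normal((n_in, n_hidden))    # input layer -> hidden layer weights
b1 = np.zeros(n_hidden)
W2 = 0.1 * rng.standard_normal((n_hidden, n_out))   # hidden layer -> output layer weights
b2 = np.zeros(n_out)

x = rng.standard_normal((1, n_in))                  # one sample entering the input layer
h = np.maximum(0.0, x @ W1 + b1)                    # hidden-layer activations (ReLU)
y = h @ W2 + b2                                     # output-layer values
print(y.shape)                                      # (1, 2)
```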

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06N3/04
CPC: G06N3/04
Inventors: 洪国强, 肖龙源, 蔡振华, 李稀敏, 刘晓葳, 谭玉坤
Owner: XIAMEN KUAISHANGTONG INFORMATION TECH CO LTD