Computer neural network modified according to pruning method

A neural network pruning technique in the field of computer neural networks, which achieves strong sparsity and reduces the storage space and computing resources required by the network.

Inactive Publication Date: 2017-05-31
NANJING UNIV


Problems solved by technology

[0005] In order to get better results, more and more complex neural networks …



Examples


Embodiment 1

[0069] In this embodiment, the process of pruning the neural network by mining important itemsets is as follows:

[0070] 1. Input the neural network to be pruned. The network has two weight layers: an input layer with four nodes, a hidden layer with five nodes, and an output layer with one node.

[0071] 2. Prune the first layer, a 4×5 weight matrix. The weight matrix of the first layer is shown in Table 1:

[0072] Table 1

[0073]
0.82 -0.13 0.01 0.04 -0.23
0.31 -0.81 0.24 0.13 0.12
0.23 0.24 0.43 -0.12 -0.12
0.13 0.41 0.51 0.15 -0.43

[0074] 3. Following step 1, construct the itemsets of the neural network: first take the absolute value of each weight, then, for each layer, mark the entries whose absolute value exceeds the threshold 0.2, as shown in Table 2 below:

[0075] Table 2

[0076]
1 0 0 0 1
1 1 1 0 0
1 1 1 0 0
0 1 1 0 1
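The thresholding in step 3 can be sketched in plain Python. The weight matrix is the one given in Table 1 and the threshold 0.2 comes from the text; everything else is illustrative:

```python
# First-layer weight matrix from Table 1 (4 input nodes x 5 hidden nodes).
W = [
    [0.82, -0.13, 0.01, 0.04, -0.23],
    [0.31, -0.81, 0.24, 0.13, 0.12],
    [0.23, 0.24, 0.43, -0.12, -0.12],
    [0.13, 0.41, 0.51, 0.15, -0.43],
]

THRESHOLD = 0.2  # threshold value taken from the text

# Step 3: take absolute values, then mark weights above the threshold with 1.
items = [[1 if abs(w) > THRESHOLD else 0 for w in row] for row in W]

for row in items:
    print(*row)
```

Running this reproduces the binary pattern of Table 2 row by row.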

[0077] 4. Accor...

Embodiment 2

[0087] In this embodiment, important-itemset mining is used to prune an autoencoder network (Autoencoder); the procedure is as follows:

[0088] 1. The user inputs a given autoencoder network, whose structure is 784→128→64→128→784, together with the training sample set MNIST. The role of the autoencoder network is to compress the input samples into the hidden layer and reconstruct them at the output. That is to say, the following relationship holds between the output layer and the input layer of the autoencoder network:

[0089] output ≈ input

[0090] The autoencoder network can be regarded as compressing the data (from the original n dimensions down to m dimensions, where m is the number of neurons in the hidden layer) and then, when needed, recovering the data with as little loss as possible.
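A minimal sketch of the shapes involved, using the 784→128→64→128→784 structure from the text. The weights here are random, for illustrating dimensions only; a trained network would make the output approximate the input:

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer sizes of the autoencoder described in the text:
# 784 -> 128 -> 64 -> 128 -> 784.
sizes = [784, 128, 64, 128, 784]

# Randomly initialised weights; a real network would be trained on MNIST
# so that the output reconstructs the input.
weights = [rng.standard_normal((m, n)) * 0.01 for m, n in zip(sizes, sizes[1:])]

def forward(x):
    """Forward pass: tanh on hidden layers, linear output layer."""
    for W in weights[:-1]:
        x = np.tanh(x @ W)
    return x @ weights[-1]

x = rng.standard_normal(784)   # stand-in for one flattened 28x28 MNIST image
x_hat = forward(x)
print(x.shape, x_hat.shape)    # reconstruction has the input's dimension
```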

[0091] 2. Prune the autoencoder network; the pruning results are shown in Table 5 below:

[0092] Table 5

[0093]

[0094] The third row ...

Embodiment 3

[0097] In this embodiment, important-itemset mining is used to prune a network for handwritten digit recognition; the procedure is as follows:

[0098] 1. The user inputs a given neural network and data set: the network is a 784→300→100→10 fully connected neural network, and the data set is the MNIST handwritten digit data set. The task is to recognize the digit in an image.

[0099] 2. Prune the neural network; the results of the pruning are shown in Table 6:

[0100] Table 6

[0101]

[0102] The third row gives the result of this method. The first three columns are the compression ratios of the individual layers of the neural network, the fourth column is the overall compression ratio, the fifth column is the recognition accuracy before pruning, and the sixth column is the accuracy after pruning (prune). It can be seen that this method compresses the neural network to 7.76% of its original size, and the accuracy of handwritt...
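The compression ratio reported in Table 6 is simply the fraction of weights that survive pruning, per layer and overall. A minimal sketch, using the 784→300→100→10 layer shapes from the text but hypothetical kept-weight counts (the patent's actual per-layer numbers are in the truncated table, so these are made-up illustrative values):

```python
# Layer shapes of the 784 -> 300 -> 100 -> 10 network from the text.
layer_shapes = [(784, 300), (300, 100), (100, 10)]

# Hypothetical numbers of weights surviving pruning in each layer
# (illustrative only, NOT the patent's results).
kept = [15000, 2500, 600]

totals = [m * n for m, n in layer_shapes]          # original weight counts
per_layer_ratio = [k / t for k, t in zip(kept, totals)]
overall_ratio = sum(kept) / sum(totals)            # network-wide ratio

print([f"{r:.2%}" for r in per_layer_ratio], f"overall {overall_ratio:.2%}")
```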



Abstract

The invention discloses a computer neural network modified according to a pruning method. The method includes: for a pre-trained neural network, constructing a number of itemsets from the network parameters; mining frequent itemsets with an improved Apriori-like algorithm, according to the significance of the itemsets, to obtain a set of important itemsets; constructing a pruned neural network from the obtained itemsets; and finally re-training the pruned network to obtain the final result.
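As an illustration of the frequent-itemset mining the abstract refers to, here is a minimal sketch of standard Apriori over binary rows such as those in Table 2, with each row read as the set of column indices that are 1. Note the patent describes an *improved* Apriori variant that ranks itemsets by significance; this sketch implements only the plain support-counting version:

```python
def apriori(transactions, min_support):
    """Minimal Apriori sketch: return every itemset whose support
    (number of transactions containing it) reaches min_support."""
    def support(itemset):
        return sum(1 for t in transactions if itemset <= t)

    # Level 1: single-item candidates.
    level = {frozenset([i]) for t in transactions for i in t}
    frequent = {}
    while level:
        level = {c for c in level if support(c) >= min_support}
        frequent.update({c: support(c) for c in level})
        # Join step: combine surviving k-itemsets into (k+1)-item candidates.
        k = len(next(iter(level))) + 1 if level else 0
        level = {a | b for a in level for b in level if len(a | b) == k}
    return frequent

# Rows of a binary itemset table (like Table 2), read as sets of the
# column indices that are 1.
rows = [{0, 4}, {0, 1, 2}, {0, 1, 2}, {1, 2, 4}]
freq = apriori([frozenset(r) for r in rows], min_support=2)
print(sorted((sorted(s), n) for s, n in freq.items()))
```

On these rows the itemset {0, 1, 2} comes out frequent with support 2, matching the intuition that inputs 0, 1, and 2 jointly carry large weights in several rows.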

Description

technical field

[0001] The invention relates to a pruning method for neural networks in a computer, and in particular to a neural network improved by using that pruning method in a computer.

Background technique

[0002] Neural networks are a recent research hotspot in machine learning. A neural network is an algorithmic mathematical model that imitates the behavioral characteristics of animal neural networks and performs distributed parallel information processing. Such a network, depending on the complexity of the system, processes information by adjusting the interconnections among a large number of internal nodes. One of its strengths is that it can automatically learn combinations of the original features; these combinations are often beneficial for classification or prediction (such as judging whether the current image shows a cat, or whether the current speech signal is a given word). However, although the traditional ne...

Claims


Application Information

IPC(8): G06N3/08
CPC: G06N3/082
Inventor: 黄书剑, 窦子轶, 戴新宇, 陈家骏, 张建兵
Owner: NANJING UNIV