
Pruning method of deep convolutional neural network, computer equipment and application method

A deep convolutional neural network technology, applied in the field of deep convolutional neural networks, which addresses problems such as the inability to accelerate the network.

Active Publication Date: 2020-09-29
STATE GRID ZHEJIANG ELECTRIC POWER +1

AI Technical Summary

Problems solved by technology

[0010] In view of the above-mentioned shortcomings and deficiencies of the prior art, the present invention provides a deep convolutional neural network pruning method, which solves the technical problem that unstructured pruning methods in the prior art cannot achieve network acceleration on a general hardware platform.


Examples


Embodiment 1

[0106] As shown in Figure 1, this embodiment of the present invention shows the processing flow of applying a pruned deep convolutional neural network to an arbitrary image data set. The specific process may include the following steps:

[0107] S1. Before processing the image data set to be classified, first obtain the training data set corresponding to the image data set;

[0108] Usually, the training data set may include a training set and a test set, and the pictures in the training set and the test set are pre-labeled;

[0109] S2. Use the training data set to train the unpruned convolutional neural network model, and obtain the trained convolutional neural network model / convolutional neural network structure;

[0110] S3. Prune the trained convolutional neural network model using a convolutional neural network pruning method to obtain a trained and pruned convolutional neural network model, which is a lightweight model;

[0111] S4. Using a lightwei...
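Steps S3 and S4 hinge on structured (filter-level) pruning, which is what makes the resulting model genuinely lighter on general hardware. A minimal sketch follows, using an L1-magnitude criterion as an assumed stand-in for the patent's spectral-clustering and gradient criteria:

```python
import numpy as np

def prune_filters_by_l1(weights, keep_ratio):
    """Keep the `keep_ratio` fraction of convolution filters with the
    largest L1 norm.  `weights` has shape (out_ch, in_ch, kH, kW).
    The L1 criterion is an illustrative stand-in only -- the patent's
    method uses spectral clustering and gradient information -- but the
    structural effect (whole filters removed, so the layer genuinely
    shrinks on ordinary hardware) is the same."""
    norms = np.abs(weights).reshape(weights.shape[0], -1).sum(axis=1)
    n_keep = max(1, int(round(keep_ratio * weights.shape[0])))
    kept = np.sort(np.argsort(norms)[::-1][:n_keep])  # indices of kept filters
    return weights[kept], kept

rng = np.random.default_rng(0)
layer = rng.normal(size=(64, 3, 3, 3))          # one trained conv layer (S2)
pruned, kept = prune_filters_by_l1(layer, 0.5)  # structured pruning (S3)
print(layer.size, "->", pruned.size)            # 1728 -> 864
```

The pruned tensor is a smaller dense array, so the lightweight model of S4 runs with ordinary dense convolutions, with no sparse-kernel support required.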

Embodiment 2

[0117] As shown in Figure 2, this embodiment of the present invention provides a pruning method for a deep convolutional neural network, and the method may include the following steps.

[0118] P1. For the deep convolutional neural network to be processed, use the convolution kernel spectral clustering algorithm to perform intra-layer convolution kernel pruning on the convolution kernels of each layer, and obtain an intra-layer-pruned deep convolutional neural network;

[0119] P2. Use a global pruning method based on gradient information to perform global convolution kernel pruning on the intra-layer-pruned deep convolutional neural network, and obtain a globally pruned deep convolutional neural network;

[0120] P3. Calculate the total pruning rate of the globally pruned deep convolutional neural network;

[0121] In this embodiment, the total pruning rate of the current round is equal to the sum of the prunin...
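Step P2's gradient-based global pruning can be sketched as follows. The per-filter first-order Taylor score |Σ w·∂L/∂w| is one common gradient-based importance measure, assumed here for illustration since the text above does not give the exact formula; "global" means the filters of all layers compete on a single ranking.

```python
import numpy as np

def taylor_scores(weights, grads):
    """Per-filter importance |sum(w * dL/dw)| over each filter (a common
    first-order Taylor criterion; an assumption, not quoted from the patent)."""
    return [np.abs((w * g).reshape(w.shape[0], -1).sum(axis=1))
            for w, g in zip(weights, grads)]

def global_prune(weights, grads, prune_frac):
    """Rank ALL filters of ALL layers on one list and drop the lowest
    `prune_frac` fraction, keeping at least one filter per layer."""
    scores = taylor_scores(weights, grads)
    threshold = np.quantile(np.concatenate(scores), prune_frac)
    pruned = []
    for w, s in zip(weights, scores):
        keep = np.flatnonzero(s > threshold)
        if keep.size == 0:                      # never empty a layer entirely
            keep = np.array([int(np.argmax(s))])
        pruned.append(w[keep])
    return pruned

rng = np.random.default_rng(0)
weights = [rng.normal(size=(32, 3, 3, 3)), rng.normal(size=(64, 32, 3, 3))]
grads = [rng.normal(size=w.shape) for w in weights]
pruned = global_prune(weights, grads, prune_frac=0.25)
print([w.shape[0] for w in pruned])  # the two counts sum to 72 (96 filters - 25%)
```

Because the threshold is a global quantile, layers whose filters are uniformly important lose fewer filters than layers full of low-importance ones, which per-layer pruning alone cannot achieve.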

Embodiment 3

[0131] An embodiment of the present invention provides a pruning method for a deep convolutional neural network. As shown in Figure 3, the method of this embodiment can be executed by any computer device / computing platform, which performs the steps of stage A, stage B and stage C in sequence to obtain the final pruned lightweight convolutional neural network.

[0132] In this embodiment, stage A is a detailed elaboration of step P1 in Figure 2 above, stage B of step P2, and stage C of steps P3, P4 and P5.

[0133] The process of stage A is detailed as follows:

[0134] Step A01. Obtain the spectral clustering pruning rate for the i-th round of intra-layer pruning;

[0135] Step A02, for the deep convolutional neural network to be processed in the i-th round, perform a convolution kernel sp...
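A rough sketch of steps A01 and A02: spectrally cluster one layer's convolution kernels and keep only a representative per cluster. The cosine affinity, the plain Lloyd's k-means, and the keep-nearest-to-centre rule are all assumptions for illustration (the patent text here is truncated); `n_clusters` plays the role set by the round's spectral clustering pruning rate.

```python
import numpy as np

def spectral_cluster_prune(kernels, n_clusters, seed=0):
    """Intra-layer pruning sketch: spectrally cluster a layer's conv kernels
    and keep one representative per cluster (the kernel nearest its cluster
    centre), discarding the rest as redundant.
    kernels: array of shape (n_kernels, in_ch, kH, kW)."""
    flat = kernels.reshape(kernels.shape[0], -1)
    # Affinity: cosine similarity, clipped to [0, 1], zero diagonal.
    unit = flat / (np.linalg.norm(flat, axis=1, keepdims=True) + 1e-12)
    affinity = np.clip(unit @ unit.T, 0.0, 1.0)
    np.fill_diagonal(affinity, 0.0)
    # Symmetric normalised Laplacian L = I - D^{-1/2} A D^{-1/2}.
    d_inv_sqrt = 1.0 / np.sqrt(affinity.sum(axis=1) + 1e-12)
    lap = np.eye(len(flat)) - d_inv_sqrt[:, None] * affinity * d_inv_sqrt[None, :]
    # Spectral embedding: eigenvectors of the n_clusters smallest eigenvalues.
    _, eigvecs = np.linalg.eigh(lap)
    emb = eigvecs[:, :n_clusters]
    # Plain Lloyd's k-means on the embedding (stand-in for any k-means impl).
    rng = np.random.default_rng(seed)
    centres = emb[rng.choice(len(emb), n_clusters, replace=False)]
    for _ in range(50):
        labels = np.argmin(((emb[:, None] - centres[None]) ** 2).sum(-1), axis=1)
        for k in range(n_clusters):
            if np.any(labels == k):
                centres[k] = emb[labels == k].mean(axis=0)
    # Keep, per cluster, the kernel closest to the cluster centre.
    keep = []
    for k in range(n_clusters):
        members = np.flatnonzero(labels == k)
        if members.size:
            keep.append(members[np.argmin(((emb[members] - centres[k]) ** 2).sum(-1))])
    kept = np.sort(np.array(keep))
    return kernels[kept], kept

rng = np.random.default_rng(1)
layer = rng.normal(size=(16, 3, 3, 3))
pruned, kept = spectral_cluster_prune(layer, n_clusters=8)
print(pruned.shape)  # at most 8 kernels survive
```

Kernels that fall into the same cluster are nearly interchangeable feature detectors, so removing the non-centre members loses little representational power before fine-tuning.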



Abstract

The invention relates to a pruning method for a deep convolutional neural network, computer equipment and an application method. The method comprises the steps: (1) taking a layer as the unit, executing a convolution kernel spectral clustering algorithm; (2) carrying out intra-layer convolution kernel pruning: dynamically calculating the spectral clustering pruning rate of the current round, cutting off the convolution kernels that are not cluster centers, and fine-tuning for a certain number of rounds to recover precision; (3) calculating the importance of each convolution kernel globally; (4) carrying out global convolution kernel pruning: dynamically calculating the global pruning rate of the current round, cutting off convolution kernels with low importance, and fine-tuning for a certain number of rounds to recover precision; (5) cyclically executing steps (1) to (4) until the target pruning rate is reached; and (6) keeping the weights of the retained convolution kernels as initialization, and continuing to fine-tune for a certain number of rounds to obtain the final pruned network. The method addresses the problem that existing deep convolutional neural networks contain a large number of redundant parameters, and achieves compression and acceleration of the network on a general hardware platform.
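The six steps of the abstract can be sketched as a single loop. The pruning criteria, per-round rates, and fine-tuning below are stubbed or simplified (assumptions for illustration), and a real implementation would also trim the matching input channels of the following layer whenever a layer's filters are pruned, which this sketch omits.

```python
import numpy as np

def drop_filters(layers, frac, score):
    """Drop the `frac` lowest-scoring filters of each layer (keep >= 1)."""
    out = []
    for w in layers:
        n_keep = max(1, w.shape[0] - int(w.shape[0] * frac))
        order = np.argsort(score(w))[::-1]            # best filters first
        out.append(w[np.sort(order[:n_keep])])
    return out

def fine_tune(layers):
    # Stub: a real implementation retrains for a few epochs here to
    # recover precision after each pruning step.
    return layers

# Illustrative stand-ins for the patent's two criteria.
l1_score  = lambda w: np.abs(w).reshape(w.shape[0], -1).sum(1)   # for (1)-(2)
max_score = lambda w: np.abs(w).reshape(w.shape[0], -1).max(1)   # for (3)-(4)

rng = np.random.default_rng(0)
layers = [rng.normal(size=(64, 3, 3, 3)), rng.normal(size=(128, 64, 3, 3))]
original = sum(w.size for w in layers)
target, per_round = 0.5, 0.1

rate = 0.0
while rate < target:                                  # step (5): loop to target
    layers = fine_tune(drop_filters(layers, per_round, l1_score))   # (1)-(2)
    layers = fine_tune(drop_filters(layers, per_round, max_score))  # (3)-(4)
    rate = 1.0 - sum(w.size for w in layers) / original
layers = fine_tune(layers)                            # step (6): final fine-tune
print(f"final pruning rate: {rate:.2f}")
```

Alternating gentle intra-layer and global rounds, each followed by fine-tuning, is what lets the loop reach an aggressive total rate without the accuracy collapse a single large cut would cause.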

Description

Technical field

[0001] The invention relates to the technical field of deep convolutional neural networks, in particular to a pruning method for deep convolutional neural networks.

Background technique

[0002] In recent years, with the rapid development of deep learning, significant progress has been made in computer vision-related fields such as image classification, object detection, and semantic segmentation, as well as in natural language processing-related fields such as speech recognition, machine translation, and semantic sentiment analysis. The core technology in deep learning is the deep convolutional neural network. However, while the performance of deep convolutional neural networks has continuously improved, this has been accompanied by ever-increasing network width and depth. For example, from the 61 million parameters and 729 million floating-point operations of the 8-layer AlexNet to the 138 million network parameters and 15.5 billion floating-point oper...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06N3/04; G06N3/08; G06K9/62
CPC: G06N3/082; G06N3/045; G06F18/23; G06F18/241
Inventor: 丁麒, 李飞飞, 叶翔, 胡若云, 侯素颖, 叶盛, 沈然, 郭兰兰, 陈彤, 谢裕清, 邓建丽
Owner: STATE GRID ZHEJIANG ELECTRIC POWER