
Convolutional neural network channel pruning method based on model fine tuning

A convolutional neural network and model technology, applied in the field of channel pruning based on model fine-tuning, which addresses the problems that model accuracy is degraded, a pre-trained model cannot be used to initialize the network parameters, and the available dataset is small.

Pending Publication Date: 2020-11-13
BEIJING INST OF COMP TECH & APPL

AI Technical Summary

Problems solved by technology

With this method, a pre-trained model trained on a large dataset cannot be used to initialize the network parameters when training the model, because the parameters in the deeper layers of a trained model are usually small, so global pruning can end up cutting all channels of a given convolutional layer.
In engineering applications, the datasets used are usually small. If a pre-trained model cannot be used, the effect of the model is greatly reduced.




Embodiment Construction

[0020] In order to make the purpose, content, and advantages of the present invention clearer, the specific embodiments of the present invention are described in further detail below with reference to the accompanying drawings.

[0021] Figure 1 is a flow chart of the convolutional neural network channel pruning method based on model fine-tuning provided by the present invention. As shown in Figure 1, the method includes:

[0022] Step 1: construct a convolutional neural network classification model and train it on the ImageNet image classification dataset to obtain model C;
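A minimal sketch of what Step 1 could look like in PyTorch is shown below. The VGG-style layer layout, the channel widths, and the names ConvBNReLU and ClassifierC are illustrative assumptions rather than the architecture specified by the patent; the ImageNet training loop itself is standard supervised training and is omitted.

```python
# Hypothetical sketch of Step 1: a small CNN classifier whose feature extractor
# consists of convolution + BatchNorm + ReLU blocks and pooling layers, followed
# by a fully connected classifier. The BatchNorm scaling factors introduced here
# are what the later sparsity-training and pruning steps operate on.
import torch
import torch.nn as nn

class ConvBNReLU(nn.Sequential):
    def __init__(self, in_ch, out_ch):
        super().__init__(
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

class ClassifierC(nn.Module):
    """Feature extractor (conv + pooling) followed by a fully connected classifier."""
    def __init__(self, num_classes=1000):
        super().__init__()
        self.features = nn.Sequential(
            ConvBNReLU(3, 64), ConvBNReLU(64, 64), nn.MaxPool2d(2),
            ConvBNReLU(64, 128), ConvBNReLU(128, 128), nn.MaxPool2d(2),
            ConvBNReLU(128, 256), ConvBNReLU(256, 256), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(256, num_classes)

    def forward(self, x):
        x = self.features(x).flatten(1)
        return self.classifier(x)

# Model C: in practice this would be trained on ImageNet (or initialized from
# weights already trained on ImageNet) before the following steps.
model_c = ClassifierC(num_classes=1000)
```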

[0023] Step 2: modify model C by setting its number of output categories to the target number of categories, and perform sparsity training on the target dataset to obtain the converged model C';
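The details of the sparsity training are not given in this excerpt. A hedged sketch in the style of the Network Slimming method cited in the background, which drives the BatchNorm scaling factors toward zero with an L1 penalty, is shown below; the penalty weight l1_lambda, the optimizer, and the helper names are illustrative assumptions.

```python
# Hypothetical sketch of Step 2: swap the classifier head to the target number of
# categories, then train with an extra L1 term on every BatchNorm scaling factor
# (Network Slimming style) so that unimportant channels get small scaling factors.
import torch
import torch.nn as nn

def make_target_model(model_c, num_target_classes):
    # Keep the pre-trained feature extractor, replace only the fully connected head.
    model_c.classifier = nn.Linear(model_c.classifier.in_features, num_target_classes)
    return model_c

def sparsity_train_step(model, images, labels, optimizer, l1_lambda=1e-4):
    criterion = nn.CrossEntropyLoss()
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    # L1 sparsity penalty on the BatchNorm scaling factors (gamma).
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            loss = loss + l1_lambda * m.weight.abs().sum()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Repeating this step over the target dataset until convergence would yield the model C' referred to above.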

[0024] Step 3: prune the sparsity-trained model C' according to the channel pruning strategy to obtain the pruned model...
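One way the channel pruning strategy described in the abstract could be realised is sketched below: the absolute BatchNorm scaling factors of all prunable layers are sorted globally to pick a single threshold, and a per-layer retention-rate floor prevents any layer from losing all of its channels (the failure mode discussed under "Problems solved by technology"). The values of prune_ratio and min_keep_ratio are illustrative assumptions.

```python
# Hypothetical sketch of Step 3: build a keep/prune mask for each BatchNorm layer
# by globally sorting |gamma|, while enforcing a minimum channel retention rate.
import torch
import torch.nn as nn

def build_channel_masks(model, prune_ratio=0.5, min_keep_ratio=0.1):
    bn_layers = [m for m in model.modules() if isinstance(m, nn.BatchNorm2d)]

    # Global sort of all scaling factors across layers to pick one threshold.
    all_gammas = torch.cat([bn.weight.detach().abs().flatten() for bn in bn_layers])
    sorted_gammas, _ = torch.sort(all_gammas)
    n_prune = int(sorted_gammas.numel() * prune_ratio)
    threshold = (sorted_gammas[n_prune - 1] if n_prune > 0
                 else sorted_gammas.new_tensor(float("-inf")))

    masks = []
    for bn in bn_layers:
        gamma = bn.weight.detach().abs()
        keep = gamma > threshold
        # Limit the retention rate: every layer keeps at least min_keep_ratio of
        # its channels, chosen as those with the largest scaling factors.
        min_keep = max(1, int(gamma.numel() * min_keep_ratio))
        if keep.sum() < min_keep:
            keep = torch.zeros_like(keep)
            keep[torch.topk(gamma, min_keep).indices] = True
        masks.append(keep)
    return masks
```

The masks would then be used to rebuild the network with only the surviving channels, giving the pruned model referred to above.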


Abstract

The invention relates to a convolutional neural network channel pruning method and system based on model fine-tuning. The method comprises the steps of: constructing a convolutional neural network classification model, composed of a feature extractor (convolution and pooling layers) and a classifier, and training it on the ImageNet image classification dataset to obtain a pre-trained model C; modifying the classifier of the pre-trained model C by setting the number of categories output by its fully connected layer to the target number of categories, and performing sparsity training on a target dataset to obtain a converged model C'; pruning the sparsity-trained model C' according to a channel pruning strategy to obtain a pruned model C''; and fine-tuning the pruned model C'' on the target dataset to improve its performance. In the method and system provided by the invention, the batch normalization layer scaling parameters are sorted globally, the channel retention rate of each network layer to be pruned is limited, and the channels to be pruned are selected within the network.
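The final step in the abstract, fine-tuning the pruned model C'' on the target dataset, might look like the short sketch below; the optimizer, learning rate, epoch count, and the target_loader data loader are illustrative assumptions rather than values given by the patent.

```python
# Hypothetical sketch of the fine-tuning step: ordinary supervised training of the
# pruned model C'' on the target dataset, typically with a modest learning rate,
# to recover accuracy lost during pruning.
import torch
import torch.nn as nn

def fine_tune(pruned_model, target_loader, epochs=20, lr=1e-3, device="cpu"):
    pruned_model.to(device).train()
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(pruned_model.parameters(), lr=lr, momentum=0.9)
    for _ in range(epochs):
        for images, labels in target_loader:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = criterion(pruned_model(images), labels)
            loss.backward()
            optimizer.step()
    return pruned_model
```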

Description

Technical field

[0001] The invention relates to the technical field of deep learning convolutional neural network pruning, and in particular to a channel pruning method based on model fine-tuning.

Background technique

[0002] Convolutional neural network channel pruning is a technique for reducing the time and space complexity of a network model. It improves model inference speed and reduces model size by adjusting the structure of a trained model and removing redundant channels. The key point of channel pruning is how to select the channels to be cut so that the performance of the model does not drop significantly after pruning. At present, a commonly used method is the Network Slimming method proposed by Zhuang Liu et al. in 2017, which performs sparsity training on the network, global pruning, and fine-tuning, and can keep the accuracy of the classification model basically unchanged while...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N 3/04, G06N 3/08
CPC: G06N 3/082, G06N 3/045
Inventor: 刘洪宇, 杨林
Owner: BEIJING INST OF COMP TECH & APPL