Neural network pruning method and device, equipment and storage medium

A neural network pruning technology, applied in the field of deep neural network compression and acceleration. It addresses the problem that existing schemes leave the input and output channels of a multi-branch structure uncompressed, which limits the achievable compression ratio; its effects are a higher compression ratio, a reduced amount of computation, and faster task processing.

Pending Publication Date: 2021-11-26
LANGCHAO ELECTRONIC INFORMATION IND CO LTD

Problems solved by technology

[0003] Convolutional neural networks have evolved a variety of multi-branch structures, but existing pruning schemes for multi-branch structures only cut the input and output channels of the middle layers of the bottleneck structure; the input and output channels of the multi-branch structure as a whole are not compressed. Since the number of channels in the middle layers of a bottleneck is already smaller than the number of input and output channels of the whole module, compressing only the middle layers does little to improve the compression ratio of the multi-branch structure.
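As a hypothetical illustration (the channel counts below are my own example, not taken from the patent), consider the parameter counts of a ResNet-style bottleneck block. Halving only the middle channels caps the achievable compression, while also halving the block's input and output channels compresses substantially further:

```python
# Illustrative only: parameter counts for a bottleneck block
# (1x1 reduce -> 3x3 conv -> 1x1 expand, biases ignored).
def bottleneck_params(c_in, c_mid, c_out, k=3):
    return c_in * c_mid + c_mid * c_mid * k * k + c_mid * c_out

full = bottleneck_params(256, 64, 256)        # unpruned block: 69632 params
mid_only = bottleneck_params(256, 32, 256)    # prune only middle channels 50%
all_layers = bottleneck_params(128, 32, 128)  # also prune in/out channels 50%

print("mid-only compression ratio:", full / mid_only)     # ~2.7x
print("whole-block compression ratio:", full / all_layers)  # 4.0x
```

The gap widens in real networks, where the 1x1 projection layers tied to the block's input/output channels hold a large share of the parameters.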




Embodiment Construction

[0060] The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art from these embodiments without creative effort fall within the protection scope of the present invention.

[0061] The present invention proposes an asynchronous pruning method for convolutional neural networks based on coreset theory. During the forward inference of the neural network, a coreset-based screening criterion is applied layer by layer to prune channels; the new weights of the compressed convolution kernels are obtained directly by optimizing the feature-map reconstruction error; and an asynchronous compression process is designed for multi-branch structures.
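A minimal sketch of the weight-reconstruction step, under my own assumptions (the patent does not disclose its exact formulation here): after a channel subset is selected, the new weights can be obtained in closed form by least squares, minimizing the feature-map reconstruction error between the original and pruned layers. The norm-based channel score below is a simple stand-in for the patent's coreset-based screening criterion:

```python
import numpy as np

rng = np.random.default_rng(0)
N, C, F = 512, 64, 128            # samples, input channels, output channels
X = rng.standard_normal((N, C))   # flattened input activations
W = rng.standard_normal((C, F))   # original 1x1-conv weights

# Channel screening: keep the 32 channels with the largest weight norm
# (stand-in criterion; the patent's coreset-based criterion is not shown).
keep = np.argsort(np.linalg.norm(W, axis=1))[-32:]

# Reconstruct new weights W' minimizing ||X W - X[:, keep] W'||_F^2.
Y = X @ W                                        # original output feature map
W_new, *_ = np.linalg.lstsq(X[:, keep], Y, rcond=None)

err = np.linalg.norm(Y - X[:, keep] @ W_new) / np.linalg.norm(Y)
print(f"relative reconstruction error: {err:.3f}")
```

Because the solve uses only sampled activations and weights, the compressed kernel is obtained directly, without retraining on labeled data.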



Abstract

The invention discloses a neural network pruning method and device, equipment and a storage medium. When the neural network is pruned, any network layer to be pruned can serve as a target network layer for channel pruning and convolution-kernel reconstruction. Compression of a multi-branch structure is therefore not limited to its intermediate layers: network layers such as the input layer, the output layer and the down-sampling layer can also be compressed, which greatly improves the compression ratio of the neural network, reduces the computation the model needs to execute tasks, and speeds up task processing. Moreover, the scheme is a data-independent asynchronous channel pruning method, which helps preserve the robustness of the compressed neural network and allows pruning at different sparsity granularities in different network layers, improving compression flexibility.
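One structural constraint behind pruning a multi-branch block's output and down-sampling layers, sketched under my own assumptions (not the patent's exact procedure): the main branch and the down-sampling shortcut must keep the same output-channel subset, or the elementwise addition that merges the branches becomes invalid.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal((8, 64))         # input activations (N, C_in)
W_main = rng.standard_normal((64, 256))  # main-branch final 1x1 conv
W_down = rng.standard_normal((64, 256))  # down-sampling shortcut 1x1 conv

# Pick ONE shared subset of output channels for both branches
# (here scored by combined weight norm, an illustrative criterion).
score = np.linalg.norm(W_main, axis=0) + np.linalg.norm(W_down, axis=0)
keep = np.sort(np.argsort(score)[-128:])

# Pruned block output equals the kept slice of the original block output.
y_pruned = x @ W_main[:, keep] + x @ W_down[:, keep]
y_full = (x @ W_main + x @ W_down)[:, keep]
print(np.allclose(y_pruned, y_full))  # True
```

Pruning middle-layer channels carries no such coupling, which is one reason earlier schemes stopped there; handling the coupled layers is what lets this scheme compress the whole block.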

Description

Technical field

[0001] The present invention relates to the technical field of deep neural network compression and acceleration, and more specifically to a neural network pruning method, device, equipment and storage medium.

Background

[0002] As the depth and width of neural networks increase, they show excellent performance in various AI (Artificial Intelligence) application scenarios, such as machine-vision tasks like image recognition and object detection. However, as deep-learning-based machine-vision software spreads to embedded and mobile devices, deep neural networks with huge numbers of parameters are difficult to deploy on devices with limited computing and storage resources. Deep neural network compression and acceleration technology provides a solution for the long-term, real-time application of deep learning on these resource-constrained devices. The deep neural network compression technology achie...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06N3/04; G06N3/08
CPC: G06N3/082; G06N3/045
Inventors: 尹文枫, 董刚, 赵雅倩
Owner: LANGCHAO ELECTRONIC INFORMATION IND CO LTD