Efficient deep convolutional neural network pruning method

A pruning technology for deep convolutional neural networks, applied in neural learning methods, biological neural network models, neural architectures, etc. It addresses the problems that existing methods localize redundant components inaccurately, achieve a low network compression rate, and strongly degrade network performance, and achieves the effect of high pruning efficiency and reduced accuracy loss.

Pending Publication Date: 2021-11-05
GUANGDONG ARTIFICIAL INTELLIGENCE & DIGITAL ECONOMY LAB (GUANGZHOU) +1
Cites: 0 · Cited by: 5

Problems solved by technology

However, this structured pruning method localizes the redundant components of the neural network inaccurately, achieves a low compression rate, and has a large impact on network performance.
[0006] In addition, most neural network models pruned by unstructured or structured methods need to be retrained, which consumes a great deal of time and is inefficient.




Embodiment Construction

[0034] The embodiments and effects of the present invention will be further described in detail below in conjunction with the accompanying drawings.

[0035] Referring to figure 1, the implementation steps of this example are as follows:

[0036] In step 1, a deep convolutional neural network is trained using the ADMM-based sparse learning method.
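The ADMM-based sparse learning of step 1 can be illustrated with a toy sketch. Everything below is illustrative rather than the patent's actual procedure: the network's task loss is replaced by a quadratic pull toward the initial scaling factors, and `soft_threshold` / `admm_sparsify` are hypothetical helper names. In the real method, the variable update would be SGD steps on the network loss; the ADMM split and the soft-thresholding z-update are the standard ingredients.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: shrinks toward zero, clips small values to exactly 0.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_sparsify(gamma, lam=0.1, rho=1.0, lr=0.05, steps=200):
    """Toy ADMM loop driving channel scaling factors toward sparsity.

    Solves min_g 0.5*||g - gamma||^2 + lam*||z||_1  s.t.  g = z,
    where the quadratic term is a hypothetical surrogate for the task loss.
    """
    z = gamma.copy()          # auxiliary variable carrying the sparsity
    u = np.zeros_like(gamma)  # scaled dual variable
    g = gamma.copy()
    target = gamma.copy()
    for _ in range(steps):
        # g-update: one gradient step on surrogate loss + augmented Lagrangian term
        grad = (g - target) + rho * (g - z + u)
        g -= lr * grad
        # z-update: closed-form soft-thresholding
        z = soft_threshold(g + u, lam / rho)
        # dual ascent on the constraint g = z
        u += g - z
    return z  # factors near zero are pruned to exactly 0

factors = np.array([1.2, 0.03, 0.8, -0.02, 0.5, 0.01])
print(admm_sparsify(factors, lam=0.3))
```

Channels whose scaling factor lands exactly at zero are the candidates later removed by pruning; the larger factors survive, shrunk by the l1 penalty.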

[0037] The deep convolutional neural network is an existing network comprising N convolutional layers, where the input of the l-th layer is denoted x_l. Each layer input x_l undergoes convolution and normalization operations, denoted collectively as f(·), and the output of each channel of the network is expressed as:

[0038] y_{l,i} = f(x_l, w_{l,i}, b_{l,i}),  l = 1, 2, ..., N;  i = 1, 2, ..., n

[0039] Here l indexes the network layers, i indexes the channels within a layer, and w_{l,i} and b_{l,i} are respectively the weight and bias sets of channel i, ...
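The per-channel formula above can be made concrete with a minimal NumPy sketch. Here f is taken as plain cross-correlation plus bias; the patent's f also includes a normalization operation, omitted for brevity, and `channel_output` is a hypothetical helper name.

```python
import numpy as np

def channel_output(x, w, b):
    """One output channel: y_{l,i} = f(x_l, w_{l,i}, b_{l,i}).

    x: (C, H, W) input feature map x_l
    w: (C, k, k) weights of channel i
    b: scalar bias of channel i
    """
    C, H, W = x.shape
    _, k, _ = w.shape
    out = np.zeros((H - k + 1, W - k + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            # valid cross-correlation over all input channels, plus bias
            out[r, c] = np.sum(x[:, r:r + k, c:c + k] * w) + b
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((3, 8, 8))                      # x_l: 3-channel input
w = [rng.standard_normal((3, 3, 3)) for _ in range(4)]  # w_{l,i}, i = 1..4
b = rng.standard_normal(4)                              # b_{l,i}
y = np.stack([channel_output(x, w[i], b[i]) for i in range(4)])
print(y.shape)  # (4, 6, 6): n = 4 output channels
```

Pruning a channel i of layer l then simply drops w_{l,i} and b_{l,i}, removing one slice of the stacked output.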



Abstract

The invention discloses an efficient deep convolutional neural network pruning method, which mainly addresses the problem that existing deep convolutional neural networks consume large amounts of storage and computing resources. The implementation scheme is as follows: a scaling factor is optimized through a sparse learning method based on the ADMM algorithm, the deep convolutional neural network is trained, and the network structure is made sparse; a genetic algorithm then searches for the pruning rate suited to each layer of the trained network, automatically finding the optimal pruning rate under the guidance of a fitness function; finally, each layer of the sparsely trained network is pruned at its optimal pruning rate to obtain a convolutional neural network of optimal efficiency. The method greatly reduces the accuracy loss of the pruned convolutional neural network, greatly cuts the network's consumption of storage and computing resources by reducing its parameter count, and can be used for compression of deep convolutional neural networks.
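The genetic-algorithm search over per-layer pruning rates described in the abstract can be sketched as follows. The fitness function here is a hypothetical surrogate (the real method would evaluate the pruned network under its fitness function), and the rate grid, population size, and genetic operators are all illustrative choices.

```python
import random

random.seed(0)
N_LAYERS = 4
POP, GENS = 20, 30
RATES = [i / 10 for i in range(10)]  # candidate per-layer pruning rates 0.0 .. 0.9

def fitness(chrom):
    # Hypothetical surrogate: reward overall pruning, penalize heavy
    # pruning of early layers (which usually hurts accuracy more).
    return sum(chrom) - 0.5 * sum(r * r * (N_LAYERS - i) for i, r in enumerate(chrom))

def crossover(a, b):
    # single-point crossover of two rate vectors
    cut = random.randrange(1, N_LAYERS)
    return a[:cut] + b[cut:]

def mutate(chrom, p=0.2):
    # each gene resampled from the rate grid with probability p
    return [random.choice(RATES) if random.random() < p else r for r in chrom]

pop = [[random.choice(RATES) for _ in range(N_LAYERS)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    elite = pop[: POP // 2]  # truncation selection keeps the best half
    children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                for _ in range(POP - len(elite))]
    pop = elite + children

best = max(pop, key=fitness)
print("per-layer pruning rates:", best)
```

The best chromosome is then applied directly: each layer l is pruned at rate best[l], removing the channels whose learned scaling factors are smallest.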

Description

Technical field

[0001] The invention belongs to the field of computer technology and mainly relates to an efficient pruning method for deep convolutional neural networks, which can be used for their compression.

Background technique

[0002] In recent years, neural network technology has achieved good results in scientific research and practical applications. Compared with traditional algorithms, however, the computation of a neural network consumes large amounts of storage and computing resources, making applications expensive to deploy; the high power consumption and cost limit the use of neural networks on power-constrained mobile devices. As a neural network compression method, pruning reduces the storage and computing consumption of a neural network by removing its redundant components, so as to achieve the purpose of reducing the power consump...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/08; G06N3/04; G06N3/12
CPC: G06N3/082; G06N3/086; G06N3/04; G06N3/126
Inventor: 谢雪梅; 石光明; 杨建朋; 汪振宇
Owner: GUANGDONG ARTIFICIAL INTELLIGENCE & DIGITAL ECONOMY LAB (GUANGZHOU)