
Structured pruning method and device for lightweight neural network, medium and equipment

A technology relating to neural networks and structured pruning, applied in neural learning methods, biological neural network models, instruments, and the like. It addresses problems such as coarse pruning granularity, over-pruning, and under-pruning, and achieves the effects of requiring less storage space, reducing video memory usage, and reducing network redundancy.

Status: Inactive
Publication Date: 2021-01-19
广州云从凯风科技有限公司

AI Technical Summary

Problems solved by technology

Structured pruning methods (such as that shown in Figure 1) operate at the Kernel level (2-dimensional tensors) or the Filter level (3-dimensional tensors) of a neural network model. Because the pruning granularity is large, this may lead to wrong pruning, over-pruning, or under-pruning.
[0004] Existing structured pruning methods generally rely on an importance criterion to discover redundancy in the neural network. However, because of the unexplainable black-box nature of the neural network, an importance criterion for the structural parameters is in fact difficult to obtain, which greatly limits pruning performance.
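For contrast with the clustering-based method disclosed here, the sketch below illustrates the conventional importance-criterion approach referred to above. It assumes PyTorch and uses the L1 norm of each three-dimensional Filter as the importance score; the function name and the choice of criterion are illustrative assumptions, not taken from the patent.

```python
# Conventional importance-criterion filter ranking (illustrative sketch, not the
# patent's method): score each 3-D output Filter of a conv layer by its L1 norm.
import torch
import torch.nn as nn

def rank_filters_by_l1(conv: nn.Conv2d) -> torch.Tensor:
    """Return output-filter indices ordered from most to least important."""
    with torch.no_grad():
        # weight shape: (out_channels, in_channels, kH, kW); each Filter is a 3-D tensor.
        scores = conv.weight.abs().sum(dim=(1, 2, 3))
    return torch.argsort(scores, descending=True)

conv = nn.Conv2d(16, 32, kernel_size=3)
order = rank_filters_by_l1(conv)
print("least important filters:", order[-4:].tolist())  # candidates for pruning
```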

Method used




Embodiment Construction

[0074] Embodiments of the present invention are described below through specific examples, and those skilled in the art can readily understand other advantages and effects of the present invention from the content disclosed in this specification. The present invention can also be implemented or applied through other specific embodiments, and the details in this specification can be modified or changed from different viewpoints and for different applications without departing from the spirit of the present invention. It should be noted that, where there is no conflict, the following embodiments and the features in the embodiments may be combined with one another.

[0075] It should be noted that the diagrams provided in the following embodiments only schematically illustrate the basic idea of the present invention; the diagrams show only the components related to the present invention rather than the number, shape and size of the compo...



Abstract

The invention discloses a structured pruning method for a lightweight neural network. The method comprises the following steps: constructing a similarity matrix from the three-dimensional tensor Filters in a target layer of the neural network that is to be pruned; clustering the three-dimensional tensor Filters corresponding to the similarity matrix by spectral clustering to obtain a plurality of clusters; determining the centroid of each cluster and the distance between each three-dimensional tensor Filter in the same cluster and that centroid; and deleting the three-dimensional tensor Filters whose distance to the centroid exceeds a set threshold to obtain the target neural network model. Because it is a structured pruning method, the weight matrices of the pruned neural network exhibit no unstructured sparsity, so existing software and hardware can be used directly for acceleration, and the method can be naturally combined with other lightweight-neural-network techniques such as knowledge distillation and weight quantization to further reduce network redundancy.
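The abstract describes a concrete pipeline: flatten each three-dimensional tensor Filter of the target layer, build a similarity matrix over the Filters, spectrally cluster them, and prune the Filters that lie far from their cluster centroid. The following is a minimal sketch of that pipeline, assuming PyTorch for the layer and scikit-learn for spectral clustering; the RBF similarity measure, the function name prune_layer_by_spectral_clustering, and the default parameter values are illustrative assumptions, since the text shown here does not specify them.

```python
# Illustrative sketch of the abstract's pipeline (assumed details are marked below).
import numpy as np
import torch.nn as nn
from sklearn.cluster import SpectralClustering

def prune_layer_by_spectral_clustering(conv: nn.Conv2d, n_clusters: int = 4,
                                       distance_threshold: float = 1.0):
    """Return indices of output Filters to keep in the target layer `conv`."""
    # Each 3-D tensor Filter of shape (in_channels, kH, kW) is flattened to a vector.
    filters = conv.weight.detach().cpu().numpy()      # (out_channels, in, kH, kW)
    vectors = filters.reshape(filters.shape[0], -1)    # (out_channels, D)

    # Similarity matrix between Filters (assumption: RBF kernel on pairwise distances).
    dists = np.linalg.norm(vectors[:, None, :] - vectors[None, :, :], axis=-1)
    sigma = dists.mean() + 1e-8
    similarity = np.exp(-(dists ** 2) / (2 * sigma ** 2))

    # Spectral clustering on the precomputed similarity (affinity) matrix.
    labels = SpectralClustering(n_clusters=n_clusters,
                                affinity="precomputed",
                                random_state=0).fit_predict(similarity)

    keep = []
    for c in range(n_clusters):
        members = np.where(labels == c)[0]
        centroid = vectors[members].mean(axis=0)        # centroid of this cluster
        d = np.linalg.norm(vectors[members] - centroid, axis=1)
        # Filters whose distance to the centroid exceeds the threshold are deleted;
        # the remaining Filters are kept.
        keep.extend(members[d <= distance_threshold].tolist())
    return sorted(keep)

# Usage on a single convolutional layer:
layer = nn.Conv2d(64, 128, kernel_size=3)
kept = prune_layer_by_spectral_clustering(layer, n_clusters=8, distance_threshold=2.0)
print(f"keeping {len(kept)} of {layer.out_channels} filters")
```

In practice the returned indices would then be used to build a slimmer convolution layer and copy over the retained weights, which is why the result stays structured and needs no sparse-matrix support for acceleration.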

Description

Technical field
[0001] The invention relates to the technical field of network compression, in particular to a structured pruning method, device, medium and equipment for a lightweight neural network.
Background technique
[0002] As the technical carrier of deep learning, neural network models have greatly surpassed the accuracy of traditional signal-processing methods in many fields. As deep learning tasks become more complex, the number of layers in neural network models is gradually increasing, and their width is also growing. Neural network models with increasingly complex topologies may run efficiently on servers equipped with expensive Graphics Processing Units (GPUs), but because of their huge parameter and computational complexity it is difficult for them to run stably on mobile terminal devices. However, with the advent of the mobile intelligent society, mobile terminal applications such as mobile phones, wearable ...

Claims


Application Information

IPC(8): G06N 3/08; G06K 9/62
CPC: G06N 3/082; G06F 18/23
Inventor: 姚志强, 周曦, 李连强, 梁俊文
Owner: 广州云从凯风科技有限公司