
CNN model compression method and device based on DS structure, and storage medium

A model compression method in the field of neural networks. It addresses problems such as uneven compression effects and loss of model accuracy, and achieves the effects of reducing video memory occupation, reducing the number of parameters, and reducing the amount of floating-point operations.

Pending Publication Date: 2020-06-05
PING AN TECH (SHENZHEN) CO LTD

AI Technical Summary

Problems solved by technology

[0005] 1. The compression effect is uneven;
[0006] 2. There is a serious loss of accuracy in the compressed model.




Embodiment Construction

[0032] It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit the present invention.

[0033] When an existing neural network is compressed, a large and complex pre-trained model is usually converted into a streamlined small model to achieve compression. However, this approach not only yields uneven compression effects, but also causes a serious loss of accuracy in the compressed model.

[0034] The present invention is based on depthwise separable convolution (Depthwise-Pointwise) and a lightweight attention structure (Squeeze-Excitation), used in combination with ordinary convolution and BN (BatchNorm) operations, to design a compact CNN component, the DS convolution block. Stacking DS convolution blocks in order forms a model with far fewer parameters but essentially no loss of accuracy, thereby greatly reducing the memory usage and floating-point operations of the model, and f...
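As an illustration, the component described above maps naturally onto a small PyTorch module. This is only a sketch under assumptions: the patent publishes no reference code, and the channel widths, kernel size, activation choice and SE reduction ratio below are placeholders; only the layer order (Conv-1, BN, activation, DW convolution, BN, activation, SE Module, Conv-2, BN) is taken from the abstract.

import torch
import torch.nn as nn

class SEModule(nn.Module):
    # Squeeze-Excitation: global average pooling followed by two small FC layers
    # produces per-channel weights that rescale the feature map.
    def __init__(self, channels, reduction=4):   # reduction ratio is an assumption
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w

class DSBlock(nn.Module):
    # Layer order from the abstract: Conv-1, BN, act, DW conv, BN, act, SE Module, Conv-2, BN.
    def __init__(self, in_ch, hidden_ch, out_ch, stride=1):
        super().__init__()
        self.conv1 = nn.Conv2d(in_ch, hidden_ch, 1, bias=False)          # Conv-1 (pointwise)
        self.bn1 = nn.BatchNorm2d(hidden_ch)
        self.dw = nn.Conv2d(hidden_ch, hidden_ch, 3, stride, 1,
                            groups=hidden_ch, bias=False)                # DW (depthwise) convolution
        self.bn2 = nn.BatchNorm2d(hidden_ch)
        self.se = SEModule(hidden_ch)
        self.conv2 = nn.Conv2d(hidden_ch, out_ch, 1, bias=False)         # Conv-2 (pointwise)
        self.bn3 = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        x = self.act(self.bn1(self.conv1(x)))
        x = self.act(self.bn2(self.dw(x)))
        x = self.se(x)
        return self.bn3(self.conv2(x))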



Abstract

The invention relates to the technical field of neural networks, and provides a CNN model compression method and device based on a DS structure, and a storage medium. The method comprises the steps: S110, forming DS convolution blocks from DW convolution and an SE Module together with ordinary convolution and batch normalization (BN) operations, wherein a DS convolution block comprises a convolution Conv-1, BN, an activation function, a DW convolution, BN, an activation function, an SE Module, a convolution Conv-2 and BN; S120, stacking the DS convolution blocks in order to form a neural network structure; and S130, adding an input layer, a pooling layer, a fully connected layer and a classification layer to the neural network structure to form a neural network model. By applying the DS convolution block structure to the neural network, the number of parameters is greatly reduced while the image feature extraction capability is preserved.
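A minimal sketch of steps S120 and S130, stacking the DSBlock modules sketched earlier into a complete classification model. The number of blocks, channel widths and class count are illustrative assumptions; the abstract only specifies that ordered DS blocks are combined with input, pooling, fully connected and classification layers.

class DSNet(nn.Module):
    # S120: stack DS convolution blocks in order; S130: add input, pooling,
    # fully connected and classification layers around the stack.
    def __init__(self, num_classes=10):              # class count is an assumption
        super().__init__()
        self.stem = nn.Sequential(                    # input layer: ordinary conv + BN + activation
            nn.Conv2d(3, 16, 3, stride=2, padding=1, bias=False),
            nn.BatchNorm2d(16),
            nn.ReLU(inplace=True),
        )
        self.blocks = nn.Sequential(                  # ordered stack of DS convolution blocks
            DSBlock(16, 64, 24, stride=2),
            DSBlock(24, 96, 24),
            DSBlock(24, 144, 48, stride=2),
            DSBlock(48, 192, 48),
        )
        self.pool = nn.AdaptiveAvgPool2d(1)           # pooling layer
        self.fc = nn.Linear(48, num_classes)          # fully connected / classification layer

    def forward(self, x):
        x = self.blocks(self.stem(x))
        x = self.pool(x).flatten(1)
        return self.fc(x)                             # class logits

A forward pass such as DSNet()(torch.randn(1, 3, 224, 224)) returns a (1, num_classes) tensor of logits, and sum(p.numel() for p in DSNet().parameters()) makes the parameter count easy to compare against an ordinary-convolution counterpart.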

Description

Technical Field

[0001] The invention relates to the technical field of neural networks, and in particular to a CNN model compression method, device and storage medium based on a DS structure.

Background Technique

[0002] Although the Convolutional Neural Network (CNN) has achieved outstanding results in fields such as computer vision and natural language processing, parameter counts exceeding 100 million make many practical applications (especially those based on embedded devices) prohibitive. Taking the classic VGG16 network as an example, it has more than 130 million parameters, occupies more than 500 MB of storage space, and requires 30.9 billion floating-point operations to recognize a single image. Such huge storage and computing overheads severely restrict the application of deep networks on small devices such as mobile terminals.

[0003] In view of this, the compression of neural networks has gradually become a hot researc...
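To make the parameter argument concrete, the standard comparison between an ordinary convolution layer and its depthwise-separable (DW + pointwise) replacement can be computed directly. The layer sizes below are illustrative and not taken from the patent.

k, c_in, c_out = 3, 256, 256                       # a hypothetical 3x3 conv layer, 256 -> 256 channels

standard  = k * k * c_in * c_out                   # ordinary convolution: 589,824 weights
depthwise = k * k * c_in                           # depthwise: one 3x3 filter per input channel
pointwise = c_in * c_out                           # 1x1 pointwise convolution mixes channels
separable = depthwise + pointwise                  # 67,840 weights in total

print(standard / separable)                        # about 8.7x fewer parameters for this layer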


Application Information

IPC(8): G06N3/08, G06N3/04, G06K9/00
CPC: G06N3/082, G06V40/168, G06N3/045
Inventors: 朱锦祥, 单以磊, 臧磊
Owner: PING AN TECH (SHENZHEN) CO LTD