
Image recognition method based on weight pruning quantization

An image recognition and weight-pruning technology, applied to neural learning methods, neural architectures, biological neural network models, etc., which addresses problems such as the large size of convolutional neural networks, their limited deployment and use, and the poor application experience on terminal devices, and achieves the effects of low complexity, shortened inference time, and a simple structure.

Pending Publication Date: 2022-02-11
MEISHAN POWER SUPPLY CO STATE GRID SICHUAN ELECTRIC POWER CO

AI Technical Summary

Problems solved by technology

Because convolutional neural networks are large, have many parameters, and require a large amount of computation, their deployment and use on mobile or low-power devices is limited.
At the same time, a complex convolutional neural network occupies more memory bandwidth and consumes more energy during inference, which degrades the application experience on terminal devices.

Method used

the structure of the environmentally friendly knitted fabric provided by the present invention; figure 2 Flow chart of the yarn wrapping machine for environmentally friendly knitted fabrics and storage devices; image 3 Is the parameter map of the yarn covering machine
View more



Embodiment Construction

[0037] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0038] As shown in figure 1, the present invention provides an image recognition method based on weight pruning and quantization, whose steps include:

[0039] S1. Remove a portion of the weights with the smallest absolute values from the convolutional neural network model so that the network becomes sparse for the first time, and retrain the convolutional neural network model until the model converges (a sketch of this pruning pass is given after these steps);

[0040] S2. Remove the weights with sma...
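The pruning pass in step S1 amounts to magnitude-based weight pruning: weights whose absolute values fall below a percentile threshold are zeroed, and the sparsified network is retrained. Below is a minimal sketch of such a pass, assuming a PyTorch model; the function name, the 10% ratio, and the mask re-application during retraining are illustrative assumptions, not details taken from the patent.

# Hedged sketch of the magnitude pruning pass in step S1, assuming a PyTorch
# convolutional model. Function name, prune ratio, and binary mask are
# illustrative assumptions, not details stated in the patent.
import torch
import torch.nn as nn

def prune_smallest_weights(layer: nn.Conv2d, prune_ratio: float = 0.10) -> torch.Tensor:
    """Zero the fraction `prune_ratio` of weights with the smallest absolute
    values and return the binary mask that was applied."""
    with torch.no_grad():
        w = layer.weight.data
        threshold = torch.quantile(w.abs(), prune_ratio)  # magnitude cut-off
        mask = (w.abs() > threshold).float()
        layer.weight.data = w * mask                      # sparsify in place
    return mask

# Usage: prune, then retrain until the model converges again, re-applying the
# mask after each optimizer step so the pruned weights stay at zero.
conv = nn.Conv2d(3, 16, kernel_size=3)
mask = prune_smallest_weights(conv, prune_ratio=0.10)
print(f"kept {int(mask.sum())} of {mask.numel()} weights")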


Abstract

The invention provides an image recognition method based on weight pruning and quantization, and relates to the field of computer vision. The method comprises the following steps: removing a portion of the weights with the smallest absolute values from a convolutional neural network model so that the network becomes sparse; retraining the convolutional neural network model until the model converges, and repeating the above two steps, pruning 10% of the parameters each time and retraining the pruned model, until 80% of the parameters have been pruned; quantizing 50% of the weights of the convolutional neural network model to 8-bit fixed point and retraining the model until it converges, during which the quantized weights are not updated, the unquantized weights are updated, and the pruned weights also participate in the updates so that important pruned weights can be reactivated; and repeating the previous steps, quantizing a further 10% of the parameters each time. By pruning redundant neurons or weights, the method effectively reduces the number of model parameters and thus the model's computation and storage requirements.
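As a rough illustration of the quantization stage described above, the sketch below (again assuming PyTorch) quantizes a chosen fraction of a layer's weights to 8-bit fixed point and freezes them during retraining by masking their gradients, while all remaining weights, including previously pruned ones, stay trainable and can be reactivated. The fixed-point format, the rule for selecting which weights to quantize, and all helper names are assumptions made for illustration; the patent only specifies the 8-bit fixed-point format and the growing quantized fraction.

# Hedged sketch of the partial 8-bit fixed-point quantization stage.
import torch
import torch.nn as nn

def quantize_fraction(weight: torch.Tensor, frac: float, frac_bits: int = 6):
    """Quantize the `frac` largest-magnitude weights to 8-bit fixed point and
    return the new weights plus a mask marking the frozen (quantized) entries."""
    scale = 2 ** frac_bits                        # assumed fixed-point format
    k = int(frac * weight.numel())
    idx = weight.abs().flatten().topk(k).indices  # assumed selection rule
    frozen = torch.zeros(weight.numel(), dtype=torch.bool)
    frozen[idx] = True
    frozen = frozen.view_as(weight)
    q_w = torch.round(weight * scale).clamp(-128, 127) / scale
    return torch.where(frozen, q_w, weight), frozen

# Retraining: gradients of quantized weights are zeroed so they stay fixed,
# while unquantized weights (including previously pruned ones) keep updating.
conv = nn.Conv2d(3, 16, kernel_size=3)
new_w, frozen = quantize_fraction(conv.weight.data, frac=0.5)
with torch.no_grad():
    conv.weight.copy_(new_w)
conv.weight.register_hook(lambda g: g * (~frozen).float())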

Description

Technical field

[0001] The invention belongs to the technical field of computer vision and artificial intelligence processing, and in particular relates to an image recognition method based on weight pruning and quantization.

Background technique

[0002] With the continuous development and upgrading of computer hardware, and especially the rapid progress of graphics processors in parallel computing, researchers have obtained far greater computing power than before for the study of convolutional neural networks. However, because convolutional neural networks are large, have many parameters, and require a large amount of computation, their deployment and use on mobile or low-power devices is limited. At the same time, a complex convolutional neural network occupies more memory bandwidth and consumes more energy during inference, which degrades the application experience on the terminal.

Contents of the invention

[0003] The purpose of the present invention is to ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/08, G06N3/04
CPC: G06N3/082, G06N3/045
Inventor: 王茗禾刘垚宏曹刚杨琳徐彤喻婷陈亮杨斯旭
Owner: MEISHAN POWER SUPPLY CO STATE GRID SICHUAN ELECTRIC POWER CO