
Deep convolutional neural network quantization method, system and device, and storage medium

A neural-network and deep-convolution technology applied in the field of deep learning. It addresses the problems that existing solutions do not support channel-by-channel quantization, have low quantization accuracy, and process deep convolutional neural networks inefficiently, with the effect of improving the accuracy and the computing efficiency of quantized inference.

Pending Publication Date: 2022-04-29
山东云海国创云计算装备产业创新中心有限公司
Cites: 0 · Cited by: 2

AI Technical Summary

Problems solved by technology

Most existing quantization solutions support only layer-by-layer quantization of deep convolutional neural networks and do not support channel-by-channel quantization. They therefore cannot properly quantize network models such as MobileNet, and their quantization accuracy is low.
Most existing quantization solutions perform quantized inference on traditional general-purpose processors such as the Central Processing Unit (CPU). The CPU, however, devotes most of its area to control logic and has only a small number of arithmetic units, so its efficiency in processing deep convolutional neural networks is very low.
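The per-channel issue above can be made concrete with a short sketch (hypothetical NumPy code, not the patent's implementation): a single layer-wide scaling factor is dominated by the largest-magnitude channel, while per-channel factors adapt to each output channel's own range.

```python
import numpy as np

def per_layer_scale(weights):
    # One scaling factor for the whole layer (layer-by-layer quantization).
    return np.abs(weights).max() / 127.0

def per_channel_scales(weights):
    # One scaling factor per output channel (channel-by-channel quantization).
    # Assumes weights are shaped (out_channels, in_channels, kH, kW).
    flat = np.abs(weights).reshape(weights.shape[0], -1)
    return flat.max(axis=1) / 127.0

# Depthwise layers (e.g. in MobileNet) often have channel weight ranges that
# differ by orders of magnitude; a single layer-wide scale then wastes most of
# the int8 range on the small-magnitude channels.
np.random.seed(0)
w = np.random.randn(8, 1, 3, 3).astype(np.float32)
w[0] *= 100.0  # one outlier channel inflates the layer-wide scale
print("layer scale:", per_layer_scale(w))
print("channel scales:", per_channel_scales(w))
```

With the outlier channel present, the layer-wide scale is roughly 100x larger than most per-channel scales, so layer-by-layer quantization collapses the small channels onto only a handful of int8 values.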

Method used




Detailed Description of the Embodiments

[0021] To make the objects, technical solutions and advantages of the present invention clearer, the embodiments of the present invention are described in further detail below with reference to specific embodiments and the accompanying drawings.

[0022] It should be noted that the expressions "first" and "second" in the embodiments of the present invention are used only to distinguish two entities or parameters that share the same name. "First" and "second" are used merely for convenience of expression and should not be construed as limiting the embodiments of the present invention; this will not be repeated in subsequent embodiments.

[0023] In a first aspect of the embodiments of the present invention, an embodiment of a method for quantizing a deep convolutional neural network is proposed. Figure 1 is a schematic diagram of an embodiment of a method for quantizi...
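As a rough illustration of what a pseudo-quantization ("fake quant") operator does at such an insertion point (a hypothetical NumPy sketch, not the patented implementation): it quantizes a tensor and immediately dequantizes it, so inference stays in floating point while exhibiting the rounding and clipping error of the quantized model.

```python
import numpy as np

def fake_quantize(x, scale):
    # Quantize to int8 and immediately dequantize: the dtype and value
    # range stay float, but the int8 rounding/clipping error appears.
    q = np.clip(np.round(x / scale), -128, 127)
    return q * scale

x = np.array([0.26, -0.26, 100.0], dtype=np.float32)
print(fake_quantize(x, 0.1))  # clipping caps 100.0 at 127 * 0.1 = 12.7
```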



Abstract

The invention provides a deep convolutional neural network quantization method, system, device and storage medium. The method comprises the steps of: parsing a deep-learning neural network model and inserting a pseudo-quantization operator at each operator to be quantized; performing floating-point inference on a constructed test set and calculating a feature quantization scaling factor according to the relative-entropy (KL) divergence; quantizing the weight parameters of the convolutional or fully connected layers of the model to obtain weight quantization scaling factors; and performing quantized inference according to the feature and weight quantization scaling factors. The method applies stochastic rounding to the quantization of the deep neural network model, optimizes the channel quantization algorithm, and performs quantized inference on purpose-built hardware, improving both the accuracy and the efficiency of quantized inference for the deep neural network model.
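The relative-entropy step can be sketched as follows. This is a simplified, hypothetical NumPy version of entropy-based calibration in the spirit of the abstract's description; the bin counts, the threshold search strategy, and the function names are assumptions, not the patent's actual algorithm.

```python
import numpy as np

def kl_divergence(p, q):
    # KL(p || q) over histogram bins, guarding against zero bins in q.
    p = p / p.sum()
    q = q / q.sum()
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / np.maximum(q[mask], 1e-12))))

def calibrate_feature_scale(activations, num_bins=512, num_levels=128):
    # Search for the clipping threshold whose quantized histogram has the
    # smallest relative-entropy divergence from the float histogram, then
    # derive the feature quantization scaling factor from that threshold.
    hist, edges = np.histogram(np.abs(activations), bins=num_bins)
    best_kl, best_t = np.inf, edges[-1]
    for i in range(num_levels, num_bins + 1):
        p = hist[:i].astype(np.float64)
        p[-1] += hist[i:].sum()                  # fold the clipped tail in
        q = np.zeros(i)
        for idx in np.array_split(np.arange(i), num_levels):
            total = hist[idx].sum()
            nz = hist[idx] > 0
            if nz.any():                         # spread each coarse bin's mass
                q[idx[nz]] = total / nz.sum()    # over its non-empty fine bins
        kl = kl_divergence(p, q)
        if kl < best_kl:
            best_kl, best_t = kl, edges[i]
    return best_t / (num_levels - 1)             # maps [0, t] onto 0..127

np.random.seed(0)
acts = np.random.randn(20000).astype(np.float32)
print("feature scale:", calibrate_feature_scale(acts))
```

The key design point is that the threshold is chosen by distribution similarity rather than by the raw maximum, so a few outlier activations do not blow up the scaling factor.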

Description

Technical Field

[0001] The present invention relates to the field of deep learning, and more specifically to a method, system, device and storage medium for quantizing deep convolutional neural networks.

Background

[0002] In recent years, with the rapid development of deep learning, the size and depth of deep neural network structures and their parameter counts have continued to grow in order to improve accuracy across AI application scenarios, resulting in larger storage requirements and lower inference efficiency for deep learning models. Model quantization, one of the general deep-learning optimization methods, converts a deep convolutional neural network into a fixed-point model that occupies less storage and runs inference faster while keeping the accuracy loss within a certain range, and it is applicable to most models and usage scenarios. Model quantization refers to the linear ma...
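To make the linear-mapping idea concrete, here is a hedged sketch (hypothetical NumPy code, not the patent's implementation) of linear quantization to int8, including the stochastic-rounding variant the abstract mentions: a value is rounded up with probability equal to its fractional part, so the rounding error is zero in expectation.

```python
import numpy as np

def quantize_stochastic(x, scale, rng=None):
    # Linear mapping to int8 with stochastic rounding: round up with
    # probability equal to the fractional part, so E[round(y)] == y.
    rng = np.random.default_rng(0) if rng is None else rng
    y = x / scale
    floor = np.floor(y)
    frac = y - floor
    q = floor + (rng.random(y.shape) < frac)
    return np.clip(q, -128, 127).astype(np.int8)

def dequantize(q, scale):
    # Inverse linear mapping back to floating point.
    return q.astype(np.float32) * scale

# Rounding 0.3 repeatedly yields 0 about 70% of the time and 1 about 30%,
# so the mean of many stochastic roundings approaches the true value.
x = np.full(10000, 0.3, dtype=np.float32)
q = quantize_stochastic(x, 1.0)
print("mean of stochastic roundings:", q.mean())
```

Compared with round-to-nearest, this keeps the quantization error unbiased across a tensor, which is why the abstract highlights it as an accuracy improvement.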

Claims


Application Information

IPC(8): G06N3/04, G06N3/08
CPC: G06N3/08, G06N3/045
Inventor: 贾敬崧
Owner: 山东云海国创云计算装备产业创新中心有限公司