
Neural network quantization compression method and system based on run-length all-zero coding

A neural network quantization compression technology in the field of neural network computing. It addresses the problem that retraining flattens the symbol distribution and thereby increases the average code length of Huffman coding, achieving the effects of increasing input randomness and balancing coding rate against compression speed.

Active Publication Date: 2022-07-01
INST OF COMPUTING TECH CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

[0016] (1) Compared with the original normal distribution, the frequencies of the individual values after retraining are more even, which can increase the average code length of Huffman coding. The data must therefore be pre-coded before entropy coding, so as to restore the skew in the frequency distribution of the input characters on which Huffman coding depends.
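The sensitivity of Huffman coding to the shape of the symbol distribution can be illustrated with a small experiment (a hedged sketch; the two distributions below are illustrative and not taken from the patent). A sharply peaked distribution, like quantized weights concentrated around zero, yields a much shorter average code length than a flattened one, which approaches the uniform-distribution bound of log2 of the alphabet size:

```python
import heapq
from collections import Counter

def huffman_lengths(freqs):
    """Build a Huffman code and return {symbol: code_length_in_bits}."""
    if len(freqs) == 1:
        return {next(iter(freqs)): 1}
    # Heap items: (weight, tiebreak, {symbol: depth}); the tiebreak keeps
    # tuple comparison from ever reaching the (unorderable) dict.
    heap = [(w, i, {s: 0}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        w1, _, d1 = heapq.heappop(heap)
        w2, _, d2 = heapq.heappop(heap)
        # Merging two subtrees pushes every contained symbol one level deeper.
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (w1 + w2, counter, merged))
        counter += 1
    return heap[0][2]

def avg_code_length(freqs):
    """Frequency-weighted average Huffman code length in bits per symbol."""
    total = sum(freqs.values())
    lengths = huffman_lengths(freqs)
    return sum(freqs[s] * lengths[s] for s in freqs) / total

# Sharply peaked distribution (most symbols are zero, as after pruning)
peaked = Counter({0: 80, 1: 10, -1: 6, 2: 3, -2: 1})
# Flattened distribution (as after retraining evens out the frequencies)
flat = Counter({0: 24, 1: 20, -1: 20, 2: 18, -2: 18})

print(avg_code_length(peaked))  # 1.34 bits/symbol
print(avg_code_length(flat))    # 2.36 bits/symbol, near log2(5) ≈ 2.32
```

The flattened distribution more than doubles the average code length here, which is exactly why a pre-coding stage that re-concentrates the distribution helps the subsequent entropy coder.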




Embodiment Construction

[0067] To make the above features and effects of the present invention clearer and more comprehensible, specific embodiments are described in detail below in conjunction with the accompanying drawings.

[0068] To optimize the neural network compression method, this work analyzes the distribution characteristics of the data in a neural network model after pruning and quantization, and proposes a lossless compression algorithm that combines entropy coding, run-length coding, and all-zero coding. Forms of hardware deployment are fully explored, and the NNcodec neural network encoding and decoding simulator is designed and implemented. The results demonstrate the optimization effect of the hybrid coding on the neural network compression method, and an easy-to-implement hardware design scheme is also given.



Abstract

The invention provides a neural network quantization compression method and system based on run-length all-zero coding. The method comprises: performing run-length coding of the zero data in the neural network data to obtain first intermediate data; replacing each coding segment of the first intermediate data whose run length is 3 with a ZeroLiteral character to obtain second intermediate data; and judging whether each character in the second intermediate data that is the same as the ZeroLiteral character is an original character of the neural network data. If so, that character is replaced with a ZeroExtra character, and a flag bit indicating that the character is an original character is added after the ZeroExtra character; otherwise, that character is replaced with a ZeroExtra character, and a flag bit indicating that the character is a replacement character is added after the ZeroExtra character. With the proposed run-length all-zero coding, neural network data can be compressed efficiently and losslessly; the second-order character replacement it includes reduces the number of zeros appearing in the data and reserves more compression space for the subsequent Huffman coding.
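As a hedged illustration of the coding steps in the abstract, the following sketch implements the run-length all-zero coding over a byte-like alphabet. Several details are illustrative assumptions not fixed by this excerpt: the concrete values chosen for the ZeroLiteral and ZeroExtra characters (255 and 254), the folding of the first two steps (run-length coding of zeros, then substituting run-length-3 segments) into one pass, and the modelling of the flag bit as a full symbol rather than a packed bit. Collisions of original data with the ZeroExtra value are left out of scope here.

```python
ZERO_LITERAL = 255    # assumed value for the ZeroLiteral character
ZERO_EXTRA = 254      # assumed value for the ZeroExtra escape character
FLAG_ORIGINAL = 0     # flag: the escaped character was original data
FLAG_REPLACEMENT = 1  # flag: the escaped character was a ZeroLiteral marker

def rlaz_encode(data):
    """Run-length all-zero encode a list of symbols (illustrative sketch)."""
    # Step 1: replace each coding segment with run length 3 (a run of
    # exactly three zeros) by a ZeroLiteral marker.
    first = []
    i = 0
    while i < len(data):
        if data[i : i + 3] == [0, 0, 0]:
            first.append(("ZL",))  # marker, kept distinct from raw symbols
            i += 3
        else:
            first.append(data[i])
            i += 1
    # Step 2 (second-order replacement): every symbol equal to the
    # ZeroLiteral character becomes ZeroExtra plus a flag bit that tells
    # the decoder whether it was an original character or a marker.
    second = []
    for sym in first:
        if sym == ("ZL",):
            second.extend([ZERO_EXTRA, FLAG_REPLACEMENT])
        elif sym == ZERO_LITERAL:
            second.extend([ZERO_EXTRA, FLAG_ORIGINAL])
        else:
            second.append(sym)
    return second

def rlaz_decode(coded):
    """Invert rlaz_encode."""
    out = []
    i = 0
    while i < len(coded):
        if coded[i] == ZERO_EXTRA:
            if coded[i + 1] == FLAG_REPLACEMENT:
                out.extend([0, 0, 0])     # a replaced run of three zeros
            else:
                out.append(ZERO_LITERAL)  # an escaped original character
            i += 2
        else:
            out.append(coded[i])
            i += 1
    return out

sample = [5, 0, 0, 0, 0, 0, 0, ZERO_LITERAL, 7, 0, 0, 3]
coded = rlaz_encode(sample)
assert rlaz_decode(coded) == sample  # lossless round trip
```

The output alphabet of this stage still feeds an entropy coder: long zero runs collapse to a few ZeroExtra/flag pairs, so far fewer literal zeros reach the subsequent Huffman stage, which is the compression-space argument made in the abstract.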

Description

technical field

[0001] The invention relates to the field of neural network operations, and in particular to a method and system for quantization compression of neural networks based on run-length all-zero coding.

Background technique

[0002] In recent years, artificial intelligence has developed rapidly against the background of explosive growth in both data volume and hardware computing power, and has increasingly become the main driving force of productivity development and technological innovation. As the main branch of artificial intelligence technology, neural network algorithms, in pursuit of further improvements in model accuracy, have run into the technical bottlenecks of complex structure, a huge number of parameters, and a large amount of computation, which limit the application of neural network models in settings that demand high throughput and energy-efficiency ratio. Computational efficiency has therefore become the main research goal of the next stage. The most effecti...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04N19/42 H04N19/13 H04N19/124 H04N19/122 G06N3/08
CPC: H04N19/42 H04N19/13 H04N19/124 H04N19/122 G06N3/082 Y02D10/00
Inventor: 何皓源, 王秉睿, 支天, 郭崎
Owner INST OF COMPUTING TECH CHINESE ACAD OF SCI