
Convolutional neural network quantification method for automatically sensing weight distribution in reinforcement learning

A convolutional neural network and reinforcement learning technology, applied in the field of convolutional neural network quantization with reinforcement-learning-based automatic perception of weight distribution, which can solve problems such as loss of precision.

Pending Publication Date: 2021-04-30
XI AN JIAOTONG UNIV

AI Technical Summary

Problems solved by technology

[0006] Aiming at the large precision loss caused by traditional model quantization in the prior art, and at the dependence on calibration data sets and training data sets during quantization, the present invention proposes a convolutional neural network quantization method that automatically perceives weight distribution through reinforcement learning. The invention can be used to compress and quantize models when deploying convolutional neural networks on various hardware platforms.
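The listing does not describe the reinforcement-learning agent itself. Purely as an illustration of what searching a per-layer weight scaling coefficient can look like, the sketch below uses a greedy grid search over clipping ranges that minimizes INT8 round-trip error; it is a hypothetical stand-in for the patented RL procedure, and all names in it are assumptions.

```python
# Hypothetical stand-in for the per-layer scale search: a greedy grid search
# over clipping fractions that minimizes INT8 reconstruction error. The patent
# replaces such a heuristic with a reinforcement-learning agent that senses
# the layer's weight distribution.
import numpy as np

def search_weight_scale(w_fused, num_candidates=100):
    """Return the per-layer scaling coefficient with the lowest INT8
    round-trip mean-squared error for this layer's fused weights."""
    max_abs = float(np.abs(w_fused).max())
    best_scale, best_err = max_abs / 127.0, float("inf")
    for frac in np.linspace(0.5, 1.0, num_candidates):    # candidate clip fractions
        scale = frac * max_abs / 127.0
        q = np.clip(np.round(w_fused / scale), -128, 127)
        err = float(np.mean((q * scale - w_fused) ** 2))  # reconstruction MSE
        if err < best_err:
            best_scale, best_err = scale, err
    return best_scale
```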


Embodiment Construction

[0028] Specific examples of the present invention will be described in more detail below with reference to the attached Figure 1 to Figure 3. Although specific embodiments of the invention are shown in the drawings, it should be understood that the invention may be embodied in various forms and is not limited to the embodiments set forth herein. Rather, these embodiments are provided for a more thorough understanding of the present invention and to fully convey the scope of the present invention to those skilled in the art.

[0029] It should be noted that certain terms are used in the specification and claims to refer to specific components. Those skilled in the art should understand that different terms may be used to refer to the same component. The specification and claims do not distinguish components by differences in name, but by differences in function. "Includes" or "comprises" mentioned throughout ...

Abstract

The invention discloses a convolutional neural network quantization method that automatically senses weight distribution through reinforcement learning. The method comprises the steps of: fusing the parameters of each layer's batch normalization operation with the weights of the convolution operation to obtain fused weights and biases, and obtaining the distribution information of each layer's fused weights in the floating-point convolutional neural network model; automatically searching, by reinforcement learning, the optimal weight scaling coefficient of each layer according to the distribution information of each layer's weights, and quantizing the floating-point weights into INT8 type data based on the weight scaling coefficient of each layer; inputting a calibration data set, recording each layer's output feature map as each group of data is input, selecting the mode as the scaling coefficient of each layer's output feature map, and calculating the scaling coefficient of each layer's bias from the scaling coefficient of each layer's weights and the scaling coefficient of each layer's output feature map, so as to quantize the floating-point biases into INT32 type biases; and constructing a forward inference process based on the INT8 type data, the INT32 type biases and the total scaling coefficients to complete the quantization.
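As a rough illustration of the pipeline summarized in the abstract, the sketch below shows batch-normalization folding, per-layer INT8 weight quantization, a mode-based feature-map scaling coefficient taken from calibration statistics, and INT32 bias quantization. All function names, and the exact way the scales are combined, are assumptions of this sketch rather than details taken from the patent.

```python
# Minimal NumPy sketch (not the patented implementation) of the quantization
# steps in the abstract: BN folding, per-layer INT8 weight quantization,
# mode-based feature-map scale selection, and INT32 bias quantization.
# All names here are illustrative assumptions.
from collections import Counter
import numpy as np

def fold_batchnorm(w, b, gamma, beta, mean, var, eps=1e-5):
    """Fuse per-channel batch-normalization parameters into the preceding
    convolution, returning the fused weight and bias."""
    s = gamma / np.sqrt(var + eps)           # per-output-channel factor
    w_fused = w * s.reshape(-1, 1, 1, 1)     # w has shape (out_c, in_c, kh, kw)
    b_fused = (b - mean) * s + beta
    return w_fused, b_fused

def quantize_weights_int8(w_fused, weight_scale):
    """Quantize fused floating-point weights to INT8 using the per-layer
    scaling coefficient found by the search (an RL agent in the patent)."""
    return np.clip(np.round(w_fused / weight_scale), -128, 127).astype(np.int8)

def feature_map_scale(per_input_max, decimals=2):
    """Pick the mode of the recorded per-input feature-map maxima as the
    layer's feature-map scaling coefficient (rounding makes the mode of
    floating-point values well defined; an assumption of this sketch)."""
    mode_val = Counter(round(float(m), decimals)
                       for m in per_input_max).most_common(1)[0][0]
    return mode_val / 127.0

def quantize_bias_int32(b_fused, weight_scale, fmap_scale):
    """Quantize the fused bias to INT32; the bias scale is derived from the
    weight scale and the feature-map scale (the product form is an assumption)."""
    bias_scale = weight_scale * fmap_scale
    q = np.round(b_fused / bias_scale)
    return np.clip(q, -2**31, 2**31 - 1).astype(np.int32)
```

After folding, each layer's fused weights would be quantized with its searched scaling coefficient, the biases converted to INT32, and the INT8 forward pass rescaled by the total scaling coefficients, mirroring the final step in the abstract.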

Description

Technical field

[0001] The invention belongs to the technical field of artificial intelligence, and in particular relates to a convolutional neural network quantization method for automatic perception of weight distribution through reinforcement learning.

Background technique

[0002] In recent years, with the development of artificial intelligence technology dominated by convolutional neural networks, more and more computer vision tasks have been well solved, such as image classification, object detection and semantic segmentation. A current development trend is to deploy high-performance neural network models on end-side platforms, such as mobile / embedded devices, and run them in real time (greater than 30 frames per second) in real scenarios. These platforms are characterized by low memory resources, low processor performance, and limited power consumption. This makes it impossible for the current models with the highest accuracy to be deployed on them and meet real-time requirements due to ...

Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K 9/62; G06N 3/04; G06N 3/063; G06N 3/08; G06N 5/04
CPC: G06N 3/084; G06N 5/04; G06N 3/063; G06N 3/045; G06F 18/214
Inventors: 任鹏举, 涂志俊, 马建, 夏天, 赵文哲, 陈飞, 郑南宁
Owner: XI AN JIAOTONG UNIV