Method for quantizing PRELU activation function

A method for quantizing activation functions and data, applied in the field of neural network acceleration, with the effect of reducing inference time

Pending Publication Date: 2021-12-07
Hefei Ingenic Technology Co., Ltd. (合肥君正科技有限公司)

AI Technical Summary

Problems solved by technology

[0011] In order to solve the above technical problems, this application proposes a method for quantizing the PRELU activation function, which aims to overcome the defects of the above-mentioned prior art.




Example Embodiment

[0042] In order to understand the technical content and advantages of the present invention more clearly, the present invention will now be further described in detail with reference to the accompanying drawings.

[0043] As shown in Figure 1, the present invention provides a method for quantizing the PRELU activation function, which comprises the following steps:

[0044] S1, data quantization: quantize the data to be quantized according to the following formula (1) to obtain low-bit data,

[0045] Formula (1): $W_q = \mathrm{round}\!\left(\dfrac{W_f - \min_w}{\max_w - \min_w} \cdot \left(2^b - 1\right)\right)$

[0046] Variable description: $W_f$ is the full-precision data (an array), $W_q$ is the quantized data, $\max_w$ is the maximum value in the full-precision data $W_f$, $\min_w$ is the minimum value in the full-precision data $W_f$, and $b$ is the bit width after quantization;
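Read literally from the variable description, formula (1) is the standard min-max mapping of the full-precision range onto a b-bit integer grid. The following is a minimal NumPy sketch of step S1 under that reading; the function name quantize_min_max is illustrative and not from the patent.

```python
import numpy as np

def quantize_min_max(w_f: np.ndarray, b: int = 8) -> np.ndarray:
    """Min-max quantization of full-precision data, per formula (1).

    The range [min_w, max_w] of the full-precision array W_f is mapped
    linearly onto the b-bit integer grid [0, 2**b - 1].
    """
    max_w = w_f.max()  # maximum value in the full-precision data W_f
    min_w = w_f.min()  # minimum value in the full-precision data W_f
    scale = (2 ** b - 1) / (max_w - min_w)
    w_q = np.round((w_f - min_w) * scale).astype(np.int32)  # quantized data W_q
    return w_q
```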

[0047] S2, quantize the PRELU activation function, and the quantization formula is shown in formula (2):

[0048] Formula (2): $\mathrm{PRELU}_q(x_i) = \begin{cases} q_1 \cdot x_i, & x_i > 0 \\ a_c \cdot x_i, & x_i < 0 \end{cases}$

Variable description: when the value of $x_i$ is greater than 0, $x_i$ is multiplied by the parameter $q_1$; when the value of $x_i$ is less than 0, $x_i$ is multiplied by the parameter $a_c$, where $c$ is the channel in which $x_i$ is located. Specifically, $x$ is a three-dimensional array $\{h, w, c\}$, where $h$, $w$ and $c$ are respectively the height, width and number of channels of the array; the parameter $a$ is a one-dimensional array $\{c\}$ whose length equals the number of channels $c$ of $x$; $q_1$ is the quantization of 1.0; $a_c$ is the value of the $c$-th channel of the parameter $a$.
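As a sketch of step S2, the per-channel branch of formula (2) can be expressed with broadcasting. The function name quantized_prelu is hypothetical, and the data layout is assumed to be {h, w, c} as the patent describes.

```python
import numpy as np

def quantized_prelu(x: np.ndarray, a: np.ndarray, q1: int) -> np.ndarray:
    """Channel-wise quantized PRELU, per formula (2).

    x  : three-dimensional array {h, w, c} (height, width, channels)
    a  : one-dimensional array {c} of quantized per-channel slopes
    q1 : the quantized representation of 1.0
    """
    a_c = a.reshape(1, 1, -1)  # align the per-channel slopes with the c axis
    # x_i > 0 -> q1 * x_i ; otherwise -> a_c * x_i, c being x_i's channel
    return np.where(x > 0, q1 * x, a_c * x)
```

In a real integer pipeline the products q1 * x and a_c * x would typically be accumulated in a wider type and rescaled afterwards; the sketch keeps NumPy's default type promotion for brevity.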



Abstract

The invention provides a method for quantizing the PRELU activation function, which comprises the following steps. S1, data quantization: the data to be quantized is quantized according to formula (1) to obtain low-bit data, where Wf is the full-precision data (an array), Wq is the quantized data, maxw is the maximum value in the full-precision data Wf, minw is the minimum value in the full-precision data Wf, and b is the quantized bit width. S2, the PRELU activation function is quantized according to formula (2): when the value of xi is greater than 0, xi is multiplied by the parameter q1; when the value of xi is less than 0, xi is multiplied by the parameter ac, where c is the channel in which xi is located. Specifically, x is a three-dimensional array {h, w, c}, where h, w and c are respectively the height, width and number of channels of the array; the parameter a is a one-dimensional array {c} whose length equals the number of channels c of x; q1 is the quantization of 1.0; and ac is the value of the c-th channel of the parameter a.

Description

Technical field

[0001] The present invention relates to the technical field of neural network acceleration, and in particular to a method for quantizing PRELU activation functions.

Background technique

[0002] In recent years, with the rapid development of science and technology, the era of big data has arrived. Deep learning uses the deep neural network (DNN) as a model and has achieved remarkable results in many key areas of artificial intelligence, such as image recognition, reinforcement learning, and semantic analysis. As a typical DNN structure, the convolutional neural network (CNN) can effectively extract the hidden-layer features of images and classify images accurately, and it has been widely used in the field of image recognition and detection in recent years.

[0003] In particular, real-time quantization of the feature map: the result of the convolution operation is dequantized into a full-precision number, and the quantization of the feature map is then completed according to the maxi...
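Paragraph [0003] is cut off, but the flow it describes (dequantize the convolution result to full precision, then quantize the feature map from its observed range) can be sketched as follows. The per-tensor scales in_scale and w_scale and the function name requantize_feature_map are assumptions made for illustration, not details from the patent.

```python
import numpy as np

def requantize_feature_map(acc: np.ndarray, in_scale: float, w_scale: float,
                           b: int = 8) -> np.ndarray:
    """Dequantize a convolution accumulator, then re-quantize the feature map.

    acc is the integer result of a quantized convolution; multiplying by the
    (assumed per-tensor) input and weight scales recovers a full-precision
    feature map, which is then quantized again from its min/max range.
    """
    fm = acc.astype(np.float32) * in_scale * w_scale  # back to full precision
    min_w, max_w = fm.min(), fm.max()
    scale = (2 ** b - 1) / (max_w - min_w)
    return np.round((fm - min_w) * scale).astype(np.int32)
```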


Application Information

IPC(8): G06N 3/04; G06N 3/08
CPC: G06N 3/08; G06N 3/048; G06N 3/045
Inventors: Zhang Dong (张东), Yu Kanglong (于康龙)
Owner: Hefei Ingenic Technology Co., Ltd. (合肥君正科技有限公司)