Method for quantizing the PReLU activation function
A technique relating to activation functions and data quantization, applied in the field of neural network acceleration, achieving the effect of reduced inference time.
Example Embodiment
[0042] In order that the technical content and advantages of the present invention may be understood more clearly, the invention will now be described in further detail with reference to the accompanying drawings.
[0043] As shown in Figure 1, the present invention provides a method for quantizing the PReLU activation function, comprising the following steps:
[0044] S1: data quantization. Quantize the data to be quantized according to the following formula (1) to obtain low-bit data:
[0045] $$W_q = \operatorname{round}\!\left(\frac{W_f - \min_w}{\max_w - \min_w}\,\bigl(2^b - 1\bigr)\right) \tag{1}$$
[0046] Variable description: $W_f$ is the full-precision data, an array; $W_q$ is the quantized data; $\max_w$ is the maximum value of the full-precision data $W_f$; $\min_w$ is the minimum value of $W_f$; and $b$ is the bit width after quantization;
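To make step S1 concrete, here is a minimal sketch of the min-max quantization of formula (1) in Python. The function name `quantize_minmax` and the use of NumPy are illustrative assumptions; the patent does not prescribe an implementation.

```python
import numpy as np

def quantize_minmax(w_f: np.ndarray, b: int) -> np.ndarray:
    """Quantize full-precision data W_f to b-bit integers per formula (1).

    Maps [min_w, max_w] linearly onto the integer range [0, 2**b - 1].
    """
    min_w, max_w = w_f.min(), w_f.max()  # assumes max_w > min_w
    w_q = np.round((w_f - min_w) / (max_w - min_w) * (2 ** b - 1))
    return w_q.astype(np.int32)

# Example: 8-bit quantization maps the array onto [0, 255].
w_f = np.array([-1.5, 0.0, 0.75, 2.5])
print(quantize_minmax(w_f, b=8))  # [  0  96 143 255]
```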
[0047] S2: quantize the PReLU activation function; the quantization formula is shown in formula (2):
[0048] Formula (2). Variable description: when the value of $x_i$ is greater than 0, you need to set $x$...
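Formula (2) and its variable description are truncated in the source, so the following sketch shows only one plausible reading: the standard PReLU, $f(x_i) = x_i$ for $x_i > 0$ and $f(x_i) = a_i x_i$ otherwise, evaluated with a slope quantized and then dequantized per formula (1). The function name and the dequantization step are assumptions, not the patent's actual formula (2).

```python
import numpy as np

def prelu_with_quantized_slope(x: np.ndarray, a_f: np.ndarray, b: int = 8) -> np.ndarray:
    """Standard PReLU with the learnable slope quantized per formula (1).

    Illustrative assumption only: the patent's actual formula (2) is
    truncated in the source text.
    """
    min_a, max_a = a_f.min(), a_f.max()  # assumes max_a > min_a
    levels = 2 ** b - 1
    a_q = np.round((a_f - min_a) / (max_a - min_a) * levels)  # formula (1)
    a_hat = a_q / levels * (max_a - min_a) + min_a            # dequantize
    return np.where(x > 0, x, a_hat * x)

# Example: positive inputs pass through unchanged; negative inputs are
# scaled by the (de)quantized slope.
x = np.array([-2.0, -0.5, 1.0])
a = np.array([0.10, 0.25, 0.25])
print(prelu_with_quantized_slope(x, a))  # [-0.2   -0.125  1.   ]
```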