
Precision dynamic adaptive accumulation module for bit width incremental addition tree

A dynamic adaptive addition-tree technology, applied in the field of computation, that achieves low power consumption.

Pending Publication Date: 2021-01-12
NANJING PROCHIP ELECTRONIC TECH CO LTD

AI Technical Summary

Problems solved by technology

It maintains a high accuracy rate and solves the technical problem that existing convolutional neural network application systems are difficult to deploy on mobile terminals and portable devices.



Examples


Embodiment 1

[0032] Example 1. The present invention proposes a precision dynamic self-adaptive accumulation module for a bit-width incremental addition tree; its overall structure is shown in Figure 1. The precision dynamic adaptive accumulation module includes a data pre-analysis sub-module, a calculation-precision dynamic-configuration sub-module, and a tree-shaped accumulation sub-module with increasing bit width. The tree-shaped accumulation sub-module with increasing bit width adopts an N-layer addition tree structure, in which each layer contains M multi-mode precision-configurable addition units, with M = 2^(N-1), where N is the level (layer index) of the addition tree.
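As a rough illustration of this composition (not taken from the patent itself; the class and function names below are assumptions), a minimal Python sketch of the three sub-modules and the per-layer adder count M = 2^(N-1) could look like this:

```python
# Minimal structural sketch, assuming the layer-index convention of Embodiment 2
# (layer 1 holds a single adder, layer N holds 2**(N-1) adders). Names such as
# PrecisionAdaptiveAccumulator are illustrative, not from the patent.

def adders_in_layer(layer_index: int) -> int:
    """Number of multi-mode precision-configurable addition units in one layer: M = 2^(N-1)."""
    return 2 ** (layer_index - 1)

class PrecisionAdaptiveAccumulator:
    """The three sub-modules of the precision dynamic adaptive accumulation module."""
    def __init__(self, num_layers: int):
        # Tree-shaped accumulation sub-module with increasing bit width.
        self.adders_per_layer = [adders_in_layer(n) for n in range(1, num_layers + 1)]

    def pre_analyze(self, inputs):
        """Data pre-analysis sub-module (placeholder): returns an approximate bit width."""
        raise NotImplementedError

    def configure_precision(self, approx_bit_width):
        """Calculation-precision dynamic-configuration sub-module (placeholder)."""
        raise NotImplementedError
```

Here the adders_per_layer list only records how many addition units each layer would hold; the two placeholder methods mark where the pre-analysis and precision-configuration logic described below would plug in.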

[0033] The input feature vector of the weight-binarized convolutional neural network is fed into the data pre-analysis sub-module; the data pre-analysis sub-module provides the approximate bit width obtained from pre-analysis to the calculation-precision dynamic-configuration sub-module, and the calcula...

Embodiment 2

[0036] Example 2. In this preferred embodiment, the tree-shaped accumulation sub-module with increasing bit width adopts a 7-layer addition tree structure, shown in Figure 2. The specific parameters and data scale of the weight-binarized convolutional neural network also match the 7-layer tree-shaped accumulation sub-module with increasing bit width.

[0037] As can be seen from Figure 2, the first layer of the tree-shaped accumulation sub-module with increasing bit width contains 2^0 = 1 multi-mode precision-configurable addition unit, whose hardware number is 1#1; the i-th layer contains 2^(i-1) multi-mode precision-configurable addition units, whose hardware numbers are i#1, i#2, ..., i#(2^(i-1) - 1), i#(2^(i-1)); and so on, until the seventh layer contains 2^6 = 64 multi-mode precision-configurable addition units, whose hardware numbers are 7#1, 7#2, ..., 7#63, 7#64.
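To make the numbering scheme concrete, here is a short, hedged Python sketch (the helper name hardware_numbers is an assumption) that enumerates the hardware numbers i#1 ... i#2^(i-1) for the 7-layer case:

```python
# Illustrative sketch of the 7-layer numbering in Embodiment 2: layer i holds
# 2**(i-1) multi-mode precision-configurable addition units numbered i#1 .. i#2^(i-1).

def hardware_numbers(num_layers: int = 7) -> dict:
    layers = {}
    for i in range(1, num_layers + 1):
        count = 2 ** (i - 1)                        # layer 1 -> 1 unit, layer 7 -> 64 units
        layers[i] = [f"{i}#{j}" for j in range(1, count + 1)]
    return layers

tree = hardware_numbers()
assert tree[1] == ["1#1"]
assert len(tree[7]) == 64 and tree[7][-1] == "7#64"
print(sum(len(units) for units in tree.values()))   # 127 adders across the 7 layers
```

A 7-layer tree of this shape therefore contains 1 + 2 + ... + 64 = 127 addition units in total, i.e. a binary reduction of up to 128 operands to a single sum.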

[0038] Therefore, in the calculation process, no more t...

Embodiment 3

[0042] Example 3. The approximate bit width used by the data pre-analysis sub-module is determined by the specific distribution of the input data. The selection of the approximate bit width at a given level is made by judging whether the sum of the input data's fractional bits at that level is greater than 1: the 1-bits at the 2^(-n) position are counted, and if the count exceeds 2^n, then the contribution to the final sum is greater than 2^(-n) · 2^n = 1. For example, if more than 2^3 = 8 input values carry a 1 at the 2^(-3) position, that position alone already contributes more than 8 · 2^(-3) = 1 to the sum.

[0043] The approximate bit width of the data pre-analysis sub-module is determined by the specific distribution of the input data. The specific implementation uses a counter to count the number of 1-bits at each bit position, and a counter threshold is set to determine the bit widths of the precise calculation component and the approximate calculation component. In scenarios with high accuracy requirements, the number of approximate calculation components is reduced; in scenarios with low accuracy requirements, i...
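As a non-authoritative sketch of this counter-based pre-analysis (the function name, fixed-point encoding and loop structure below are assumptions, not the patent's implementation), one software model might be:

```python
# Hedged model of the per-bit counting in [0042]-[0043]: for each fractional bit
# position 2**(-n), count how many inputs have a 1 there; once the count exceeds
# 2**n, that position's contribution alone exceeds 2**(-n) * 2**n = 1, so it (and
# the more significant positions) belong to the precise calculation component.

def approximate_bit_width(inputs, frac_bits):
    """Return how many least-significant fractional bit positions can be approximated.

    `inputs` are non-negative fixed-point values with `frac_bits` fractional bits,
    represented as integers already scaled by 2**frac_bits.
    """
    approx_positions = 0
    for n in range(frac_bits, 0, -1):               # n = frac_bits .. 1, i.e. from 2^-frac_bits upward
        bit = frac_bits - n                         # integer bit index of the 2**(-n) position
        ones = sum((x >> bit) & 1 for x in inputs)  # per-bit counter from the pre-analysis
        if ones > 2 ** n:                           # threshold check described in [0042]
            break                                   # this and more significant bits stay precise
        approx_positions += 1
    return approx_positions
```

In the patent the counter threshold itself is the configuration knob; the sketch hard-codes it at 2^n, but adjusting it trades approximate components against precise ones in the scene-dependent way described above.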


Abstract

The invention discloses a precision dynamic adaptive accumulation module for a bit width incremental addition tree. The precision dynamic adaptive accumulation module comprises a data pre-analysis sub-module, a calculation precision dynamic configuration sub-module and a bit width incremental tree accumulation sub-module; the bit width incremental tree accumulation sub-module uses an addition tree structure, and each layer comprises a plurality of multi-mode precision configurable addition units. The input feature vector of a neural network is input into the data pre-analysis sub-module, and the decimal approximate bit width is determined based on the requirement of the calculation scene for the calculation precision; and the calculation precision dynamic configuration sub-module configures the multi-mode precision configurable addition units in the bit width incremental tree accumulation sub-module, so that an optimal calculation mode is selected. The accuracy of the neural network is not influenced by approximate calculation of an approximate adder, the requirement of deploying the neural network system on a mobile terminal and portable equipment is met, and the task is completed with low power and high accuracy.

Description

Technical field

[0001] The invention belongs to the technical field of computation, and in particular relates to a precision dynamic self-adaptive accumulation module for a bit-width incremental addition tree.

Background technique

[0002] Since the weight-binarization network system performs a large number of operand-accumulation calculations, it incurs large power consumption and errors. At present, much research has been carried out at the algorithm, software, hardware, circuit and transistor levels. However, low energy consumption faces new challenges from emerging applications such as digital signal processing, computer vision and machine learning, which place higher computational requirements on weight-binarization networks.

[0003] Weight-binarization networks are somewhat tolerant of limited or unimportant errors. Errors introduce bias. Error tolerance arises for a number of reasons, including imperfect human sensory perception, nois...


Application Information

Patent Type & Authority: Applications (China)
IPC(8): G06F7/509; G06N3/063
CPC: G06F7/509; G06N3/063
Inventor: 王镇
Owner: NANJING PROCHIP ELECTRONIC TECH CO LTD