Approximate-computation-based binary weight convolution neural network hardware accelerator calculating module

A binary weight convolution and hardware accelerator technology, applied to biological neural network models, physical implementation, and related fields; it addresses problems such as limited power consumption and achieves faster computation, small area, and low power consumption.

Active Publication Date: 2017-06-30
南京风兴科技有限公司

AI Technical Summary

Problems solved by technology

[0006] The present invention aims to solve the technical problem of the binary weight convolutional neural network applied to...

Examples

Embodiment Construction

[0034] Embodiments of the invention are described in detail below, with examples illustrated in the accompanying drawings, in which the same name refers throughout to modules with the same or similar functionality. The implementation example described below with reference to the accompanying drawings uses a convolution kernel size of 3×3 and sets the number of parallel input channels to 4; it is intended to explain the present invention and should not be construed as limiting it.
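
As a point of reference for this example configuration, the sketch below models one 3×3 convolution window processed over 4 parallel input channels with binary (±1) weights. The function name and data widths are illustrative assumptions and are not taken from the patent text.

```python
# Illustrative model of one convolution window in the example configuration
# (3x3 kernel, 4 parallel input channels). With binary weights of +1/-1, each
# "multiplication" reduces to keeping or negating the input neuron value, and
# the 3*3*4 = 36 signed terms are summed into one output.
import numpy as np

def binary_weight_window(neurons, weights):
    """neurons: shape (4, 3, 3) integers; weights: shape (4, 3, 3) values in {+1, -1}."""
    assert neurons.shape == (4, 3, 3) and weights.shape == (4, 3, 3)
    assert np.all(np.abs(weights) == 1)
    signed_terms = np.where(weights > 0, neurons, -neurons)  # conditional negation
    return int(signed_terms.sum())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.integers(-128, 128, size=(4, 3, 3))  # example 8-bit neuron data
    w = rng.choice([-1, 1], size=(4, 3, 3))      # binary convolution kernel
    print(binary_weight_window(x, w))
```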

[0035] In addition, the terms "first" and "second" are used for descriptive purposes only and should not be understood as indicating or implying relative importance or the number of the indicated technical features. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more such features. In the description of the present invention, "plurality" means two or more, unless otherwise specifically...

Abstract

The invention discloses an approximate-computation-based calculating module for a binary weight convolutional neural network hardware accelerator. The module receives input neuron data and binary convolution kernel data and performs fast multiply-accumulate computation for convolution. It uses two's-complement data representation and mainly comprises an optimized approximate binary multiplier, a compressor tree, a novel approximate adder, and a register for the serially accumulated partial sum. In addition, two error compensation schemes are proposed for the optimized approximate binary multiplier, which reduce or completely eliminate the errors it introduces while only slightly increasing the hardware resource overhead. With these optimized calculating units, the critical path of a binary weight convolutional neural network hardware accelerator using this calculation module is considerably shortened, and its area and power consumption are reduced, making the module suitable for low-power embedded systems that need to run convolutional neural networks.
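
As a rough illustration of the module's role described above, the following is a plain behavioral sketch under stated assumptions: two's-complement neuron data, binary (±1) weights so that multiplication becomes sign selection, a pairwise tree reduction standing in for the compressor tree, and a register accumulating the serial partial sums. It does not reproduce the patent's optimized approximate multiplier, approximate adder, or error-compensation schemes, and all class and function names are hypothetical.

```python
# Behavioral sketch (exact arithmetic only) of what the calculating module computes.
# Assumptions: two's-complement neuron values, binary weights in {+1, -1},
# a pairwise tree reduction in place of the compressor tree, and a register
# that accumulates serial partial sums across successive input-channel groups.

def sign_select(x: int, w: int) -> int:
    """Binary-weight 'multiplication': keep x when w = +1, negate when w = -1."""
    return x if w == 1 else -x

def tree_sum(terms):
    """Stand-in for the compressor tree: reduce the signed terms pairwise."""
    terms = list(terms)
    while len(terms) > 1:
        terms = [sum(terms[i:i + 2]) for i in range(0, len(terms), 2)]
    return terms[0] if terms else 0

class BinaryWeightMac:
    """Accumulates one group of signed terms per call into a partial-sum register."""
    def __init__(self):
        self.partial_sum = 0  # serial partial-sum register

    def step(self, neurons, weights):
        products = [sign_select(x, w) for x, w in zip(neurons, weights)]
        self.partial_sum += tree_sum(products)
        return self.partial_sum

# Example: two serial groups of 3*3*4 = 36 terms each accumulate into one sum.
mac = BinaryWeightMac()
mac.step(range(36), [1, -1] * 18)
print(mac.step(range(36), [-1, 1] * 18))
```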

Description

technical field

[0001] The present invention relates to the field of computer and electronic information technology, and in particular to a calculation module of a binary weight convolutional neural network hardware accelerator based on approximate computation.

Background technique

[0002] Deep convolutional neural network models have achieved great breakthroughs and successes in many big data analysis tasks such as image classification, action detection, and speech recognition. On the one hand, as convolutional neural networks become more effective, their topologies grow ever deeper and their parameter counts reach 10^6 and above, which greatly increases the computational complexity and causes the required computing power to grow explosively. On the other hand, embedded systems can only provide limited resources, and their power consumption is also constrained within a certain range. Although the existing solutions...

Claims

Application Information

IPC(8): G06N3/063
CPC: G06N3/063
Inventor: 王中风, 王逸致, 林军, 周杨灿
Owner: 南京风兴科技有限公司