
Fixed-point quantitative convolutional neural network accelerator calculation circuit

A convolutional neural network computing circuit technology, applied in the field of integrated circuits.

Pending Publication Date: 2020-10-27
UNIV OF ELECTRONIC SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

[0004] Although fixed-point quantization reduces the complexity of convolution calculations, deep neural networks still involve a very large amount of computation.




Embodiment Construction

[0034] The technical solution of the present invention will be described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0035] The present invention proposes a convolutional neural network accelerator computing circuit based on fixed-point quantization. A convolutional neural network with INT8 integer quantization, using the TensorFlow quantization specification to select the quantization scheme, is taken as an example for illustration; the specific quantization type and quantization scheme should not be construed as limiting the invention. First, the convolutional neural network is quantized to the INT8 data type; on this basis, the accelerator computing circuit proposed by the present invention is applied. After integer quantization, not only does the memory space occupied by the weights shrink to 1/4 of the original, but more data can also be transmitted under the same bandwidth, and the calculation process is ...
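To make the quantization step above concrete, the following is a minimal sketch of affine INT8 quantization in the style of the TensorFlow quantization specification (real value ≈ scale × (q − zero_point)). The function names and the way scale and zero point are chosen here are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def quantize_int8(x, scale, zero_point):
    """Affine (asymmetric) INT8 quantization: q = round(x / scale) + zero_point,
    clipped to the signed 8-bit range. Names and scheme are illustrative."""
    q = np.round(x / scale) + zero_point
    return np.clip(q, -128, 127).astype(np.int8)

def dequantize_int8(q, scale, zero_point):
    """Recover an approximate real value: x ~= scale * (q - zero_point)."""
    return scale * (q.astype(np.int32) - zero_point)

# Example: float32 weights shrink to one quarter of their size as INT8.
w = np.random.randn(64).astype(np.float32)
scale = (w.max() - w.min()) / 255.0                   # map the value range onto 256 levels
zero_point = int(np.round(-128 - w.min() / scale))    # align the minimum with q = -128
w_q = quantize_int8(w, scale, zero_point)
print(w.nbytes, "bytes as float32 ->", w_q.nbytes, "bytes as int8")
```

With this representation each 32-bit floating-point weight is stored in a single signed byte, which corresponds to the 1/4 memory footprint and the higher effective bandwidth mentioned above.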



Abstract

The invention discloses a fixed-point quantized convolutional neural network accelerator calculation circuit, and belongs to the technical field of integrated circuits. The circuit operates as follows: N input channel processing units process the input data of N input channels, multiplying the input feature maps of each input channel with their corresponding weights and performing fast addition to obtain the convolution results of the N input channels; a partial accumulation unit accumulates all convolution results of the N input channels and outputs the sum to a quantization activation unit; and the quantization activation unit sequentially performs bias accumulation, approximate-multiplier multiplication, right shift, function activation, zero-point addition, and output amplitude limiting to obtain the output result of the convolutional neural network accelerator calculation circuit. The invention increases the calculation speed of a fixed-point quantized convolutional neural network without introducing obvious precision loss, has low power consumption and a relatively small circuit area, and is suitable for convolutional neural network systems that require integer quantization.
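The output stage listed in the abstract (bias accumulation, multiplication, right shift, activation, zero-point addition, amplitude limiting) follows the common integer re-quantization pattern. The sketch below illustrates that sequence under stated assumptions: the function and parameter names are hypothetical, ReLU stands in for the unspecified activation function, and an exact integer multiply stands in for the approximate multiplier used in the hardware circuit.

```python
import numpy as np

def quantized_output_stage(acc, bias, multiplier, shift, zero_point,
                           qmin=-128, qmax=127):
    """Hypothetical sketch of the quantization activation sequence from the abstract."""
    acc = acc.astype(np.int64) + bias                 # bias accumulation on the wide accumulator
    acc = (acc * multiplier) >> shift                 # rescale: integer multiply then right shift
    acc = np.maximum(acc, 0)                          # function activation (ReLU assumed)
    acc = acc + zero_point                            # add the output zero point
    return np.clip(acc, qmin, qmax).astype(np.int8)   # output amplitude limiting to INT8

# Example: rescale three accumulator values with an output scale of roughly 2^-10.
acc = np.array([12345, -2048, 67890], dtype=np.int32)
print(quantized_output_stage(acc, bias=100, multiplier=1, shift=10, zero_point=-5))
```

Replacing a floating-point rescale with a multiply followed by a right shift keeps the entire output path in integer arithmetic, which is what allows the circuit to avoid floating-point hardware.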

Description

Technical Field
[0001] The invention belongs to the field of integrated circuits and relates to a computing circuit for a fixed-point quantized convolutional neural network accelerator.
Background Technique
[0002] Convolutional Neural Networks (CNNs) have achieved great success in the field of image recognition due to their excellent predictive performance. Nonetheless, modern CNNs with high inference accuracy usually have large model sizes and high computational complexity, which complicates their deployment in data centers or on edge devices, especially in application scenarios that require low resource consumption or low response latency. To facilitate the application of complex CNNs, an emerging field of model compression research focuses on reducing the model size and execution time of CNNs with minimal loss of accuracy.
[0003] The precision of network parameters can be quantized to as low as 1 bit; XNOR-Net and related network variants can achieve a 32...


Application Information

IPC(8): G06N3/063, G06N3/04
CPC: G06N3/063, G06N3/048, G06N3/045
Inventors: 贺雅娟, 周航, 蔡卢麟, 朱飞宇, 候博文, 张波
Owner: UNIV OF ELECTRONIC SCI & TECH OF CHINA