
A flexibly configurable neural network computing unit, computing array and construction method thereof

A computing unit and neural network technology, applied in the field of neural network hardware architecture, addressing the problems that existing hardware cannot fully exploit the data reusability of convolutional layers and cannot support the computation of different types of convolutional layers.

Active Publication Date: 2021-02-19
XI AN JIAOTONG UNIV

AI Technical Summary

Problems solved by technology

At present, most hardware implementations of convolutional layer computing units can perform only one type of convolution; they cannot support the computation of the different types of convolutional layers found in a network model and cannot fully exploit the data reusability of the convolutional layer.

Detailed Description of the Embodiments

[0058] The present invention is described in further detail below with reference to the accompanying drawings.

[0059] As shown in Figure 1, a flexibly configurable neural network computing unit of the present invention includes a configurable storage module, a configurable control module, and a time-division-multiplexable multiply-add computing module. The configurable storage module includes a feature map data buffer, a step-size (stride) data buffer, and a weight data buffer; the configurable control module includes a counter module and a state machine module; the multiply-add computing module includes a multiplier and an accumulator.
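
To make the module partition in [0059] concrete, the following is a minimal structural sketch in Python. All class and field names are assumptions introduced here for illustration; the patent describes hardware modules, not software.

    # Structural sketch of the computing unit described in [0059] (assumed names).
    from dataclasses import dataclass, field
    from collections import deque

    @dataclass
    class ConfigurableStorage:
        # The three caches named in [0059]: feature map data, step-size data, weights.
        feature_map_buffer: deque = field(default_factory=deque)
        step_size_buffer: deque = field(default_factory=deque)
        weight_buffer: deque = field(default_factory=deque)

    @dataclass
    class ConfigurableControl:
        # Counter plus a state machine that sequences the unit.
        counter: int = 0
        state: str = "IDLE"  # e.g. IDLE -> LOAD -> COMPUTE -> DONE (assumed states)

    class MultiplyAddUnit:
        """One multiplier and one accumulator, time-division multiplexed."""
        def __init__(self) -> None:
            self.acc = 0.0

        def mac(self, a: float, b: float) -> float:
            self.acc += a * b  # one multiply-accumulate per step
            return self.acc

    class ComputingUnit:
        def __init__(self) -> None:
            self.storage = ConfigurableStorage()
            self.control = ConfigurableControl()
            self.mad = MultiplyAddUnit()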

[0060] The feature map data buffer stores the portion of the feature map data used in the convolution computation and reuses the feature map data that exhibits data sharing. The maximum length of the buffer is L1, and its size is max{K1·A1, K2·A2, ..., Ki·Ai}, where K is the size of the convolution kernel in the convoluti...
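
The sizing rule in [0060] can be sketched as follows. This assumes Ki is the kernel size of the i-th supported convolution configuration and Ai is its associated dimension; the sentence defining these terms is truncated in the source, so the numbers below are purely hypothetical and only illustrate taking the maximum product.

    # Sketch of the feature map buffer length L1 = max{K1*A1, K2*A2, ..., Ki*Ai}.
    def feature_map_buffer_length(configs):
        """configs: iterable of (K_i, A_i) pairs; returns the maximum length L1."""
        return max(k * a for k, a in configs)

    # Hypothetical example with three convolution configurations:
    print(feature_map_buffer_length([(3, 28), (5, 14), (7, 7)]))  # -> 84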

Abstract

The invention discloses a flexibly configurable neural network computing unit, a computing array, and a construction method thereof. The neural network computing unit includes a configurable storage module, a configurable control module, and a time-division-multiplexable multiply-add computing module. The configurable storage module includes a feature map data buffer, a step-size data buffer, and a weight data buffer; the configurable control module includes a counter module and a state machine module; the multiply-add computing module includes a multiplier and an accumulator. The invention can support any type of convolution computation and supports parallel computation of multi-size convolution kernels, fully exploits the flexibility and data reusability of convolutional neural network computing units, greatly reduces the system power consumption caused by data movement, and improves the computational efficiency of the system.

Description

Technical Field

[0001] The invention belongs to the field of neural network hardware architecture, and in particular relates to a flexibly configurable neural network computing unit, a computing array, and a construction method thereof.

Background

[0002] A flexible hardware computing architecture has an important impact on the hardware implementation of convolutional neural networks. As the most important structure in a convolutional neural network, the convolutional layer is characterized by a large amount of computation and strong data reusability. Through weight sharing, the convolutional layer reduces the complexity of the network model, greatly reduces the number of parameters, and avoids the complex feature extraction and data reconstruction processes of traditional recognition algorithms.

[0003] In a convolutional neural network, the main function of the convolutional layer is to convolve the same set of input feature map da...
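
As an illustration of the weight sharing and data reuse described in [0002] and [0003] (not code from the patent), a naive 2D convolution makes both properties visible: the same kernel weights are applied at every output position, and neighbouring positions read overlapping input windows.

    # Naive 2D convolution illustrating weight sharing and overlapping input windows.
    import numpy as np

    def conv2d(feature_map, kernel, stride=1):
        k = kernel.shape[0]
        h, w = feature_map.shape
        out_h = (h - k) // stride + 1
        out_w = (w - k) // stride + 1
        out = np.zeros((out_h, out_w))
        for i in range(out_h):
            for j in range(out_w):
                # The same kernel (shared weights) is applied at every position;
                # adjacent windows overlap, so feature map data can be reused.
                window = feature_map[i*stride:i*stride+k, j*stride:j*stride+k]
                out[i, j] = np.sum(window * kernel)
        return out

    fm = np.arange(36, dtype=float).reshape(6, 6)
    kern = np.ones((3, 3)) / 9.0
    print(conv2d(fm, kern).shape)  # (4, 4)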

Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06N3/063; G06N3/04
CPC: G06N3/063; G06N3/045
Inventors: 任鹏举樊珑赵博然宗鹏陈陈飞郑南宁
Owner: XI AN JIAOTONG UNIV