
Convolutional neural network hardware accelerator based on Winograd algorithm and calculation method

A convolutional neural network hardware accelerator technology, applied in the field of deep convolutional neural network computing. It addresses problems such as increased access time for intermediate results, large hardware resource usage, and reduced computing speed, with the effect of reducing repeated transformations and improving parallelism.

Active Publication Date: 2021-08-13
HEFEI UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0004] The Winograd algorithm contains successive matrix multiplications, which increase the access time of intermediate results and reduce calculation speed. At the same time, the Winograd algorithm requires a long calculation process to obtain its results and occupies substantial hardware resources.
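To make the "successive matrix multiplications" concrete, below is a minimal sketch of the standard Winograd F(2,3) transform, which computes two outputs of a 3-tap filter as Y = Aᵀ[(Gg) ⊙ (Bᵀd)]. The transform matrices are the well-known F(2,3) constants, not values disclosed in this patent; the nested matrix products illustrate the intermediate results whose storage and access the invention seeks to reduce.

```python
import numpy as np

# Winograd F(2,3): 2 outputs of a 3-tap filter from a 4-sample input tile,
# using 4 element-wise multiplies instead of the 6 needed by the direct form.
BT = np.array([[1,  0, -1,  0],
               [0,  1,  1,  0],
               [0, -1,  1,  0],
               [0,  1,  0, -1]], dtype=float)   # input (data) transform
G = np.array([[1.0,  0.0, 0.0],
              [0.5,  0.5, 0.5],
              [0.5, -0.5, 0.5],
              [0.0,  0.0, 1.0]])                # filter transform
AT = np.array([[1, 1,  1,  0],
               [0, 1, -1, -1]], dtype=float)    # output (inverse) transform

def winograd_f23(d, g):
    """y = A^T [ (G g) * (B^T d) ] -- element-wise product in the middle."""
    return AT @ ((G @ g) * (BT @ d))

d = np.array([1., 2., 3., 4.])   # input tile
g = np.array([1., 2., 3.])       # filter taps
print(winograd_f23(d, g))        # matches direct correlation: [14. 20.]
```

Each stage (Bᵀd, Gg, the Hadamard product, and Aᵀ·) produces an intermediate matrix that a hardware pipeline must buffer, which is the access-time cost described above.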




Embodiment Construction

[0066] In this embodiment, a convolutional neural network hardware accelerator based on the Winograd algorithm, as shown in Figure 1, includes: a storage layer, a computing layer, a control layer, a data distributor, an input buffer, and an output buffer;

[0067] The storage layer includes: off-chip DDR memory and on-chip storage;

[0068] The control layer includes: configuration module and control module;

[0069] The computing layer includes: multi-channel PE array and post-processing module;

[0070] The post-processing module includes: activation function module and convolution channel accumulation module;
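The two post-processing sub-modules above can be sketched in software as follows. This is a hypothetical illustration, not the patent's circuit: the accumulation axis layout is assumed, and ReLU is assumed as the activation function since the patent names only an "activation function module".

```python
import numpy as np

def post_process(partial_sums):
    """Hypothetical sketch of the post-processing stage: accumulate the
    per-input-channel partial convolution results (convolution channel
    accumulation module), then apply an activation (ReLU assumed here)."""
    acc = np.sum(partial_sums, axis=0)   # sum over input channels
    return np.maximum(acc, 0.0)          # activation function module

# 2 input channels, each contributing a 2x2 tile of partial results
tiles = np.array([[[ 1., -2.], [ 3., 4.]],
                  [[-4.,  1.], [-1., 2.]]])
print(post_process(tiles))   # [[0. 0.] [2. 6.]]
```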

[0071] After the DDR memory receives the externally sent convolution kernel and input feature map and completes their storage, it triggers the control module, so that the configuration module, under the control of the control module, reads calculation instructions from its own RAM to obtain calculation tasks and decodes them into configuration information;
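The decode step could look something like the sketch below. The patent does not disclose the instruction encoding, so every field, bit width, and name here is an illustrative assumption about how a calculation instruction might be unpacked into configuration information.

```python
from dataclasses import dataclass

@dataclass
class Config:
    layer_id: int
    in_channels: int
    out_channels: int
    tile_mode: str   # e.g. Winograd tiling vs. direct convolution

def decode(instr: int) -> Config:
    """Unpack a hypothetical 32-bit calculation instruction: bits [31:24]
    layer id, [23:14] input channels, [13:4] output channels, bit 0 mode."""
    return Config(
        layer_id=(instr >> 24) & 0xFF,
        in_channels=(instr >> 14) & 0x3FF,
        out_channels=(instr >> 4) & 0x3FF,
        tile_mode="winograd" if instr & 0x1 else "direct",
    )

cfg = decode((3 << 24) | (64 << 14) | (128 << 4) | 1)
print(cfg)  # Config(layer_id=3, in_channels=64, out_channels=128, tile_mode='winograd')
```

The control module would then use such configuration fields to steer the data distributor and reconfigure the PE array's calculation paths.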

[0072] T...



Abstract

The invention discloses a Winograd algorithm-based convolutional neural network hardware accelerator and a calculation method. The accelerator comprises an input buffer module, a PE array, a post-processing module, a data distributor, a control module, a configuration module, an output buffer module and a storage layer; the configuration module decodes the instruction into configuration information; the control module is used for controlling the PE array, the data distributor and other modules to complete different calculations; the data distributor reads the data from the storage layer by adopting different address mapping modes and data distribution modes, and distributes the data to the input buffer; the input buffer module sends the buffered calculation data to the PE array for calculation; the PE array performs calculation path reconstruction according to the configuration information and calculates the data; and the post-processing module performs operations such as multi-channel accumulation and activation function processing on a calculation result. According to the method, the convolution calculation speed can be increased, the data migration loss is reduced, and the performance of the whole accelerator is improved.

Description

technical field

[0001] The invention relates to the field of deep convolutional neural network calculations, in particular to a method and device for parallel accelerated calculation of convolution pipelines based on the Winograd algorithm.

Background technique

[0002] The increasing complexity of current neural network structures has dramatically increased the amount of calculation and data transfer in neural networks. The serial calculation characteristics of the CPU and its small number of cores make its calculation efficiency low, while the GPU's energy consumption is relatively high. It is therefore necessary to design efficient hardware neural network accelerators to improve the computational efficiency of neural networks.

[0003] Convolution is the basic calculation of a neural network. Traditionally, it is performed by sliding-window convolution, a method that brings a huge amount of calculation and data transfer to the entire neur...
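The sliding-window baseline described in [0003] can be sketched as follows; this is a generic direct ("valid") 2-D correlation, written for illustration rather than taken from the patent, showing the r×r multiply-accumulates per output point that Winograd-based methods reduce.

```python
import numpy as np

def sliding_window_conv2d(x, w):
    """Direct 'valid' 2-D correlation: the conventional baseline.
    Each output point costs r*r multiply-accumulates."""
    H, W = x.shape
    r, _ = w.shape                       # square r x r kernel assumed
    out = np.zeros((H - r + 1, W - r + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i+r, j:j+r] * w)   # slide the window
    return out

x = np.arange(16, dtype=float).reshape(4, 4)
w = np.ones((3, 3))
print(sliding_window_conv2d(x, w))   # [[45. 54.] [81. 90.]]
```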

Claims


Application Information

IPC(8): G06N3/04, G06N3/063
CPC: G06N3/063, G06N3/045
Inventor: 倪伟, 袁子昂, 冉敬楠, 宋宇鲲, 张多利
Owner HEFEI UNIV OF TECH