
Convolutional neural network inference accelerator and method

A convolutional neural network accelerator technology, applied in the field of dedicated acceleration architectures, which addresses the problem that accelerator designs based on new materials cannot have their real performance evaluated and therefore cannot yet be applied in actual development, and achieves the effects of improving addition efficiency, improving data loading efficiency, and accelerating the convolutional neural network.

Active Publication Date: 2018-06-19
上海岳芯电子科技有限公司
Cites: 15 · Cited by: 49
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

Song L et al., "A Pipelined ReRAM-Based Accelerator for Deep Learning" (IEEE International Symposium on High Performance Computer Architecture, IEEE, 2017: 541-552), also used the characteristics of memristors to implement the forward propagation and backpropagation of convolutional neural networks, offering later work a new idea for accelerator design. However, accelerator designs based on new materials share a problem: because such materials have not yet reached the market, their real performance cannot be assessed, so these designs cannot yet be applied in actual development.



Description of Embodiments

[0035] The implementation of the present invention is described below through specific embodiments and in conjunction with the accompanying drawings; those skilled in the art can readily understand other advantages and effects of the present invention from the content disclosed in this specification. The present invention can also be implemented or applied through other, different specific embodiments, and various modifications and changes may be made to the details in this specification for different viewpoints and applications without departing from the spirit of the present invention.

[0036] Figure 1 is a schematic diagram of the architecture of an embodiment of the convolutional neural network inference accelerator of the present invention. As shown in Figure 1, the present invention provides a convolutional neural network inference accelerator based on bidirectional systolic dataflow and multi-stage pipelines, including:

[0037] The input image buffer module 101, which includes N buffers ...
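As an illustration only, the following is a minimal Python sketch of such an input image buffer module, assuming (as the abstract states) that it holds N buffers and that each buffer stores one row of the input image. The class name InputImageBuffer and its methods are hypothetical and not part of the patent.

```python
# Minimal sketch of the input image buffer module (101), assuming N row buffers,
# each holding one row of the input image. Names are illustrative only.
class InputImageBuffer:
    def __init__(self, n_buffers: int):
        self.buffers = [[] for _ in range(n_buffers)]   # one buffer per image row

    def load(self, image_rows):
        """Load one row of the image into each of the N buffers."""
        for buf, row in zip(self.buffers, image_rows):
            buf.clear()
            buf.extend(row)

    def read_column(self, t):
        """Supply pixel t of every buffered row to the adjacent column of operation units."""
        return [buf[t] for buf in self.buffers]

module_101 = InputImageBuffer(n_buffers=4)
module_101.load([[r * 10 + c for c in range(8)] for r in range(4)])
print(module_101.read_column(0))   # pixels fed to the array on the first cycle
```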



Abstract

The invention discloses a convolutional neural network inference accelerator and method. The accelerator comprises an input image buffer module containing N buffers for loading input image data, and N*N operation units connected to the input image buffer module for performing convolution operations. Each buffer stores the data of one row of the image. The N*N operation units pass image data between adjacent units in systolic form: the operation units connected to the buffers read image data from the buffers, and the remaining operation units read image data from an adjacent operation unit. By designing a bidirectional systolic array around the data reuse inherent in convolutional neural networks, the accelerator and method improve data loading efficiency and thereby accelerate the convolutional neural network.
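As a rough illustration of the dataflow described in the abstract, the following Python sketch models a simplified, one-directional version of the systolic image-data passing: the column of operation units adjacent to the buffers reads pixels from the buffers, and every other unit reads the value its neighbor held on the previous cycle, so each loaded pixel is reused across the array. The names SystolicArray and step are hypothetical, and the model omits the bidirectional aspect and the convolution arithmetic itself.

```python
# Simplified, one-directional model of systolic image-data passing among N*N units.
# Each of the N row buffers holds one image row; column 0 reads from the buffers,
# every other unit reads the value its left neighbor held on the previous cycle.
N = 4

class SystolicArray:
    def __init__(self, n):
        self.n = n
        self.regs = [[None] * n for _ in range(n)]   # pixel currently held by each unit

    def step(self, buffer_outputs):
        """One systolic cycle: shift pixels one column onward, refill column 0 from buffers."""
        for row in range(self.n):
            for col in range(self.n - 1, 0, -1):
                self.regs[row][col] = self.regs[row][col - 1]   # read from adjacent unit
            self.regs[row][0] = buffer_outputs[row]             # read from the row buffer

image_rows = [[10 * r + c for c in range(8)] for r in range(N)]  # toy image, one list per buffer
array = SystolicArray(N)
for t in range(6):
    array.step([image_rows[r][t] for r in range(N)])
    print(f"cycle {t}:", array.regs)
```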

Description

Technical Field

[0001] The present invention relates to a dedicated acceleration architecture for convolutional neural networks, and in particular to a convolutional neural network inference accelerator and method based on bidirectional systolic dataflow and multi-stage pipelines, used to accelerate the inference operations of a convolutional neural network during its inference stage.

Background Technique

[0002] A convolutional neural network is a kind of feedforward neural network that is often used in image recognition and generally includes convolutional layers, pooling layers and fully connected layers. In the convolution operation of a convolutional layer, each weight in the convolution kernel is multiplied point-to-point with its corresponding input data, and the products are accumulated to obtain one output value; the convolution kernel is then slid according to the stride set for the convolutional layer, and the above operation is repeated.

[0003] At present, the...
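As a plain-software reference for the convolution described in paragraph [0002] (not the patented accelerator itself), the following Python sketch multiplies each kernel weight point-to-point with its corresponding input window, accumulates the products into one output value, and slides the kernel by the stride. The function name conv2d and its parameters are illustrative only.

```python
# Naive single-channel convolution as described in [0002]: point-wise multiply the
# kernel with the input window, accumulate to one output value, slide by the stride.
import numpy as np

def conv2d(image: np.ndarray, kernel: np.ndarray, stride: int = 1) -> np.ndarray:
    kh, kw = kernel.shape
    ih, iw = image.shape
    oh = (ih - kh) // stride + 1
    ow = (iw - kw) // stride + 1
    out = np.zeros((oh, ow))
    for r in range(oh):
        for c in range(ow):
            window = image[r * stride:r * stride + kh, c * stride:c * stride + kw]
            out[r, c] = np.sum(window * kernel)   # point-wise multiply, then accumulate
    return out

print(conv2d(np.arange(16, dtype=float).reshape(4, 4), np.ones((3, 3)), stride=1))
```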

Claims


Application Information

IPC(8): G06N3/04, G06N5/04, G06K9/00
CPC: G06N5/04, G06V10/95, G06N3/045
Inventors: 梁晓峣, 伍骏
Owner: 上海岳芯电子科技有限公司