
Convolution calculating apparatus and method for neural network

A neural network computing technology, applied in the field of neural network convolution computing devices, that solves the problems of high resource consumption and low utilization of read data, requires little storage, reduces data read and write operations, and improves computing efficiency.

Active Publication Date: 2018-09-14
INST OF AUTOMATION CHINESE ACAD OF SCI
Cites: 4 | Cited by: 17

AI Technical Summary

Problems solved by technology

[0006] In order to solve the above-mentioned problems in the prior art, that is, to solve the problems of high resource consumption and low utilization rate of read data in the convolution calculation process, one aspect of the present invention provides a convolution computing device applied to neural networks, comprising: a data input port, a convolution computing unit (CCU), a line buffer space, a pooling unit, and a data output port.
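The paragraph above names the main blocks of the claimed device: a data input port, a convolution computing unit (CCU), a line buffer space, a pooling unit, and a data output port. The Python sketch below is a minimal software model of how such a streaming pipeline could fit together; the class and method names (StreamingConvDevice, push_row, pool_rows) and the 2x2 max-pooling choice are illustrative assumptions, not interfaces taken from the patent.

```python
import numpy as np

class StreamingConvDevice:
    """Minimal software model of the claimed pipeline:
    data input port -> line buffer -> convolution computing unit (CCU)
    -> pooling unit -> data output port.
    Names and pooling size are illustrative, not taken from the patent.
    """

    def __init__(self, kernel, pool=2):
        self.kernel = np.asarray(kernel, dtype=np.float32)  # M2 x N2 weights
        self.m2, self.n2 = self.kernel.shape
        self.pool = pool              # pooling window (assumed 2x2 max pooling)
        self.line_buffer = []         # holds the most recent input rows

    def push_row(self, row):
        """Data input port: accept one row of the input feature map."""
        self.line_buffer.append(np.asarray(row, dtype=np.float32))
        # Keep only as many rows as the kernel height needs.
        if len(self.line_buffer) > self.m2:
            self.line_buffer.pop(0)
        if len(self.line_buffer) == self.m2:
            return self._convolve_buffered_rows()
        return None

    def _convolve_buffered_rows(self):
        """CCU: multiply-accumulate over every horizontal kernel position."""
        rows = np.stack(self.line_buffer)            # M2 x W window of rows
        w = rows.shape[1]
        out = np.empty(w - self.n2 + 1, dtype=np.float32)
        for j in range(w - self.n2 + 1):
            out[j] = np.sum(rows[:, j:j + self.n2] * self.kernel)
        return out

    def pool_rows(self, conv_rows):
        """Pooling unit: 2x2 max pooling over buffered convolution rows,
        producing the values sent to the data output port."""
        r = np.stack(conv_rows)
        h = (r.shape[0] // self.pool) * self.pool
        w = (r.shape[1] // self.pool) * self.pool
        r = r[:h, :w].reshape(h // self.pool, self.pool, w // self.pool, self.pool)
        return r.max(axis=(1, 3))
```

Feeding the input one row at a time means only the most recent M2 rows ever sit on chip, which is the property the line buffer in the claim exists to exploit.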

Embodiment Construction

[0058] Preferred embodiments of the present invention are described below with reference to the accompanying drawings. Those skilled in the art should understand that these embodiments are only used to explain the technical principles of the present invention, and are not intended to limit the protection scope of the present invention.

[0059] When existing parallelization methods are used to accelerate convolutional neural network algorithms, the input and output data bandwidth requirements are high, the read data requires a large buffer space, and the temporary calculation results either occupy a large amount of on-chip storage or must be moved to off-chip memory through repeated data transfers. In order to solve the problems of high resource consumption and low utilization rate of read data in the convolution calculation process, the present invention provides a convolution computing device and method applied to neural networks.
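As a rough illustration of the storage gap this paragraph describes, the snippet below compares buffering a whole input feature map on chip against keeping only the rows a sliding kernel actually needs; the feature-map size, kernel height, and 16-bit data width are assumed numbers for illustration, not figures from the patent.

```python
def buffer_kib(rows_kept, width, bytes_per_elem=2):
    """On-chip storage (KiB) needed to hold rows_kept rows of width elements.
    16-bit activations are an assumption for illustration, not from the patent."""
    return rows_kept * width * bytes_per_elem / 1024

W, H, M2 = 224, 224, 3                        # illustrative feature-map / kernel sizes
print(f"whole feature map buffered : {buffer_kib(H, W):.1f} KiB")   # 98.0 KiB
print(f"kernel-height rows only    : {buffer_kib(M2, W):.2f} KiB")  # 1.31 KiB
```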

[0060] A convolution computing device applied to a neu...

Abstract

The invention belongs to the field of digital signal processing, and specifically relates to a convolution calculating apparatus and method for a neural network. The apparatus and method aim to solve the problem that resource consumption is high and the utilization rate of read-in data is low during convolution calculation. The convolution calculating method includes the following steps: input data matrices are processed in rows, and every two rows of data are input serially, row by row, into multiply-accumulator arrays for multiply-and-accumulate operations; the multiply-accumulator arrays are deployed according to the convolution kernel dimensions (M2, N2) and can process 2*M2*N2 multiplications in parallel; and, using the laws of the convolution operation, the two groups of multiply-accumulator arrays can be shifted and added to accelerate the data operation. The apparatus and method exploit the parallelism in the calculation process and improve the computing efficiency of the system, while reusing the input data and feeding the calculation results directly into a pooling unit, thereby reducing data reads and writes. In addition, only one row of buffer space is required, so the demand for storage resources is small; convolutions of different dimensions can be calculated, giving the design flexibility, universality, high efficiency, and low power consumption.
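The abstract describes two M2 x N2 multiply-accumulator groups working on two rows at once, for 2*M2*N2 multiplications in parallel. The sketch below is one software interpretation of that scheme, assuming the two groups each compute one of a pair of adjacent output rows; the real device is hardware, and its internal shift-and-add wiring is not reproduced here.

```python
import numpy as np

def conv_two_rows_parallel(x, k):
    """Software model of the parallelism described in the abstract:
    two output rows are produced together, each by its own M2 x N2
    multiply-accumulate group, i.e. 2*M2*N2 products per output column.
    This is an interpretation for illustration, not the patent's design."""
    m2, n2 = k.shape
    h, w = x.shape
    out_h, out_w = h - m2 + 1, w - n2 + 1
    y = np.zeros((out_h, out_w), dtype=np.float32)
    for i in range(0, out_h - 1, 2):               # step two output rows at a time
        for j in range(out_w):
            win_a = x[i:i + m2, j:j + n2]          # MAC group A (output row i)
            win_b = x[i + 1:i + 1 + m2, j:j + n2]  # MAC group B (output row i+1)
            # 2*M2*N2 multiplications that the hardware would issue in parallel
            y[i, j] = np.sum(win_a * k)
            y[i + 1, j] = np.sum(win_b * k)
    if out_h % 2:                                  # leftover last row, handled serially
        i = out_h - 1
        for j in range(out_w):
            y[i, j] = np.sum(x[i:i + m2, j:j + n2] * k)
    return y
```

A quick check against an ordinary sliding-window multiply-accumulate confirms the paired-row version produces the same output, only grouped so that twice as many products are issued per step.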

Description

Technical Field

[0001] The invention belongs to the field of digital signal processing, and in particular relates to a convolution computing device and method applied to neural networks.

Background Technique

[0002] Convolution is an important operation in mathematics and is widely used in digital signal processing.

[0003] Convolutions can be computed by time-domain or frequency-domain methods. The time-domain method mainly involves multiplication and addition operations; there is no data dependence or time correlation between data at different points, so it can be accelerated by a parallel computing structure. The frequency-domain method converts the convolution sequences to the frequency domain through the Fourier transform, multiplies the frequency-domain data directly, and finally applies the inverse transform to obtain the convolution result.

[0004] Convolutional Neural Network (CNN for short) is an important algorithm model in deep learning. In recent ...
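Paragraph [0003] contrasts the time-domain and frequency-domain routes to a convolution result. The short sketch below shows both for 1-D sequences: the direct multiply-add form, whose independent output points make it easy to parallelize, and the FFT / pointwise-multiply / inverse-FFT form. Function names here are ours, chosen for illustration.

```python
import numpy as np

def conv_time_domain(x, h):
    """Direct time-domain convolution: independent multiply-add per output
    point, which is what makes the operation easy to parallelize."""
    n = len(x) + len(h) - 1
    y = np.zeros(n)
    for i, xi in enumerate(x):
        for j, hj in enumerate(h):
            y[i + j] += xi * hj
    return y

def conv_freq_domain(x, h):
    """Frequency-domain convolution: FFT both sequences (zero-padded to the
    linear-convolution length), multiply pointwise, inverse FFT back."""
    n = len(x) + len(h) - 1
    return np.real(np.fft.ifft(np.fft.fft(x, n) * np.fft.fft(h, n)))

x, h = np.array([1.0, 2.0, 3.0, 4.0]), np.array([1.0, -1.0, 0.5])
assert np.allclose(conv_time_domain(x, h), conv_freq_domain(x, h))
assert np.allclose(conv_time_domain(x, h), np.convolve(x, h))
```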

Application Information

IPC(8): G06N3/063; G06N3/04
CPC: G06N3/063; G06N3/045
Inventors: 陈亮 (Chen Liang), 刘丽 (Liu Li)
Owner: INST OF AUTOMATION CHINESE ACAD OF SCI