
Neural network acceleration circuit and method

A technology relating to neural networks and acceleration circuits, applied in the field of neural networks, which solves the problem of insufficient signal setup time caused by long routing paths, achieving high computational parallelism and improved computing power.

Active Publication Date: 2020-04-03
SHENZHEN CORERAIN TECH CO LTD

AI Technical Summary

Problems solved by technology

[0004] This kind of neural network acceleration circuit requires that the data output by the input data RAM and the weight RAM reach all computing modules within the same clock cycle. Some computing modules are physically far from the input data RAM and the weight RAM, so the corresponding data takes longer to reach them. As a result, at high clock frequencies these long traces leave insufficient signal setup time, which limits the highest clock frequency at which the circuit can work; conversely, to let the circuit work at a higher clock frequency, the timing requirements limit the computational parallelism of the circuit.
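To make the constraint concrete, the sketch below (not part of the patent; all delay values are hypothetical) computes the highest clock frequency that still satisfies the setup-time requirement for a short trace versus a long one:

```python
# Minimal sketch of the setup-time constraint described in [0004].
# All delay values are hypothetical and only illustrate why long routing
# to distant computing modules limits the usable clock frequency.

T_CLK_TO_Q_NS = 0.3   # delay from RAM output register clock edge to valid data
T_SETUP_NS    = 0.2   # setup time required at the computing module's input register

def max_frequency_mhz(routing_delay_ns: float) -> float:
    """Highest clock frequency (MHz) that still meets setup time for a given trace delay."""
    min_period_ns = T_CLK_TO_Q_NS + routing_delay_ns + T_SETUP_NS
    return 1000.0 / min_period_ns

# A computing module close to the RAMs vs. one at the far end of a long trace.
print(max_frequency_mhz(0.5))   # short trace -> 1000 MHz
print(max_frequency_mhz(3.0))   # long trace  -> ~286 MHz
```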



Examples


Embodiment 1

[0032] Figure 2 is a schematic structural diagram of a neural network acceleration circuit provided in Embodiment 1 of the present invention, which is applicable to neural network calculation. As shown in Figure 2, the neural network acceleration circuit provided by Embodiment 1 of the present invention includes: a data storage module 100, a data cache module 200, a calculation module 300 and a delay processing module 400.

[0033] Specifically, the data storage module 100 is used to store the input data required for neural network calculation. A neural network is a complex network system formed by the extensive interconnection of a large number of simple processing units (also called neurons); it reflects many basic characteristics of human brain function and is a highly complex nonlinear dynamic learning system. Neural network calculation usually requires a large amount of input data, and in the neural network acceleration circuit these input data are...
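The following cycle-level sketch (an illustration only; the class names, the per-unit delays and the multiply operation are assumptions, not the patent's implementation) shows how the four modules of Embodiment 1 could cooperate:

```python
# Behavioural sketch of the Embodiment 1 structure: data storage 100,
# data cache 200, calculation module 300, delay processing 400.
# Class names and per-unit delays are illustrative assumptions.

class DataStorage:                      # module 100: holds the input data
    def __init__(self, data):
        self.data = list(data)

    def read(self, index):
        return self.data[index]

class DataCache:                        # module 200: registers data close to the compute units
    def __init__(self):
        self.reg = None

    def load(self, value):
        self.reg = value                # one register stage per clock cycle

    def output(self):
        return self.reg

class CalculationModule:                # module 300: several parallel calculation units
    def __init__(self, weights):
        self.weights = weights

    def compute(self, x):
        return [w * x for w in self.weights]   # each unit produces one partial result

class DelayProcessing:                  # module 400: re-aligns results that finish early
    def __init__(self, delays):
        self.queues = [[None] * d for d in delays]

    def push(self, results):
        aligned = []
        for q, r in zip(self.queues, results):
            q.append(r)
            aligned.append(q.pop(0))    # a result leaves only after its assigned delay
        return aligned
```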

Embodiment 2

[0039] Figure 3 is a schematic structural diagram of a neural network acceleration circuit provided by Embodiment 2 of the present invention. This embodiment is a further refinement of the foregoing embodiments. As shown in Figure 3, the neural network acceleration circuit provided by Embodiment 2 of the present invention includes: a data storage module 100, a data cache module 200, a calculation module 300, and a delay processing module 400. The data storage module 100 includes a first data storage sub-module 110 and a second data storage sub-module 120; the first data storage sub-module 110 includes a first data storage unit 111 and a first control unit 112, and the second data storage sub-module 120 includes a second data storage unit 121 and a second control unit 122. The data cache module 200 includes a first register unit 210 and a second register unit 220; the first register unit 210 includes n first registers 211_1~211_n, and the second register ...
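The register units 211_1~211_n can be read as a chain of pipeline registers that carries input data toward progressively more distant calculation units, one stage per clock cycle. The sketch below models such a chain; the chain length n and the data values are illustrative assumptions, since the paragraph above is truncated:

```python
# Sketch of a register chain (e.g. 211_1 ~ 211_n) that re-times input data
# toward distant calculation units one clock cycle per stage.  The length n
# and the data values are illustrative assumptions.

def clock_register_chain(chain, new_value):
    """Shift the chain by one stage and insert new_value at the first register."""
    chain.insert(0, new_value)
    return chain.pop()          # value emerging at the far end, n cycles later

n = 4                           # number of first registers 211_1 ~ 211_n (assumed)
chain = [None] * n

for cycle, value in enumerate([10, 20, 30, 40, 50, 60, 70, 80]):
    out = clock_register_chain(chain, value)
    print(f"cycle {cycle}: fed {value}, far calculation unit sees {out}")

# The far calculation unit first sees valid data after n cycles, which is why
# the delay processing module must delay the earlier (nearer) results by the
# same amount before all outputs are emitted simultaneously.
```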

Embodiment 3

[0051] Figure 4 is a schematic flowchart of a neural network acceleration method provided in Embodiment 3 of the present invention, which is applicable to neural network calculation. The neural network acceleration method provided in this embodiment may be implemented by a neural network acceleration circuit provided in any embodiment of the present invention. For content not described in detail in Embodiment 3 of the present invention, reference may be made to the description in any system embodiment of the present invention.

[0052] As shown in Figure 4, the neural network acceleration method provided by Embodiment 3 of the present invention includes:

[0053] S410. Acquire input data required for neural network calculation.

[0054] Specifically, the neural network is a complex network system formed by a large number of simple processing units (also called neurons) that are widely connected to each other. It reflects many basic characteristics of human bra...
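Pieced together from the abstract, the remaining steps presumably cache the acquired data, compute on it with several parallel calculation units, and delay-process the outputs so they are emitted simultaneously. The sketch below walks through that flow end to end; only step S410 appears in this extract, so the later steps and all names here are assumptions:

```python
# Sketch of the method flow of Embodiment 3: acquire input data, cache it,
# compute it with several parallel calculation units, then delay-process the
# outputs so they leave simultaneously (per the abstract).  Function names,
# weights and latencies are illustrative assumptions.

def neural_network_accelerate(input_value, weights, unit_latencies):
    # S410: acquire the input data required for the neural network calculation.
    acquired = input_value

    # Cache the input data output by the data storage module (one register stage).
    cached = acquired

    # Each calculation unit computes on the cached data; units farther from the
    # cache finish later, which is what unit_latencies represents here.
    results = [w * cached for w in weights]

    # Delay processing: hold every result until the slowest unit has finished,
    # so all output data leave the circuit in the same clock cycle.
    release_cycle = max(unit_latencies)
    return results, release_cycle

outputs, cycle = neural_network_accelerate(3, weights=[0.5, 1.0, 2.0],
                                           unit_latencies=[1, 2, 3])
print(f"outputs {outputs} all released at cycle {cycle}")
```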



Abstract

The embodiment of the invention discloses a neural network acceleration circuit and method. The neural network acceleration circuit comprises: a data storage module used for storing the input data needed by the calculation of a neural network; a data caching module used for caching the input data output by the data storage module; a calculation module comprising a plurality of calculation units used for calculating the input data output by the data caching module to obtain output data; and a delay processing module used for carrying out delay processing on the output data and simultaneously outputting the output data after delay processing. The neural network acceleration circuit provided by the embodiment of the invention resolves the contradiction between timing and calculation parallelism in the neural network acceleration circuit, so that the circuit can have a high degree of calculation parallelism while working at a high clock frequency, and the calculation capability of the neural network acceleration circuit is improved.

Description

Technical Field

[0001] Embodiments of the present invention relate to the field of neural networks, and in particular to a neural network acceleration circuit and method.

Background Technique

[0002] In recent years, neural networks have developed rapidly and are widely used in computer vision and natural language processing. Neural network accelerators are characterized by high energy efficiency and massively parallel computing, and have gradually become a hot research topic.

[0003] A neural network acceleration circuit usually uses a high degree of parallelism to quickly complete the massive calculation tasks required by a neural network algorithm. Because of the regularity of the calculation form, the acceleration circuit first designs a basic calculation unit that implements the basic operations of the algorithm, and then makes a large number of copies of this calculation unit to achieve a high degree of computing parallelism. Figure 1 sho...
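As a toy illustration of the replication described in [0003] (a minimal sketch; using a multiply-accumulate unit as the basic calculation unit is an assumption, since the excerpt does not name the basic operation):

```python
# Toy illustration of [0003]: define one basic calculation unit and replicate
# it to obtain parallelism.  The multiply-accumulate (MAC) unit is an assumed
# example of such a basic unit.

def mac_unit(acc, a, b):
    """One basic calculation unit: multiply-accumulate."""
    return acc + a * b

def parallel_layer(inputs, weight_rows):
    """Replicate the basic unit: one accumulation chain per weight row."""
    outputs = []
    for row in weight_rows:                 # each row is handled by its own copy of the unit
        acc = 0
        for a, b in zip(inputs, row):
            acc = mac_unit(acc, a, b)
        outputs.append(acc)
    return outputs

print(parallel_layer([1, 2, 3], [[1, 0, 0], [0.5, 0.5, 0.5]]))   # -> [1, 3.0]
```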


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/063
CPC: G06N3/063; Y02D10/00; G06N3/0464
Inventors: 焦黎, 李远超, 蔡权雄, 牛昕宇
Owner: SHENZHEN CORERAIN TECH CO LTD