
Neural network acceleration method and device, neural network acceleration chip and storage medium

A neural network acceleration technology in the field of artificial intelligence, addressing the problems of long acceleration time and low acceleration efficiency in neural networks.

Inactive Publication Date: 2019-05-21
Assignee: DEEPBLUE TECH (SHANGHAI) CO LTD

AI Technical Summary

Problems solved by technology

[0004] The present invention provides a neural network acceleration method, device, neural network acceleration chip, and storage medium to solve the problems of long neural network acceleration time and low efficiency in the prior art.



Examples


Embodiment 1

[0038] Figure 1 is a schematic diagram of a neural network acceleration process provided by an embodiment of the present invention. The process includes the following steps:

[0039] S101: For the neural network to be accelerated, perform the following steps until it is determined that the acceleration of the neural network is completed.

[0040] The neural network acceleration method provided by the embodiment of the present invention is applied to a neural network acceleration chip. The neural network acceleration chip can be a GPU (Graphics Processing Unit), an AI (Artificial Intelligence) chip, an FPGA (Field-Programmable Gate Array) chip, or another chip capable of accelerating neural networks. Specifically, the method may be applied to a computing unit in a neural network acceleration chip.

[0041] The neural network acceleration chip stores an algorithm for accelerating processing of the neural network, so t...
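The layer-by-layer loop of S101 can be sketched as follows. This is a minimal illustration, not the chip's actual implementation: the `fetch_params` and `accelerate_layer` helpers, the dictionary-based parameter store, and the use of a thread to model the parallel parameter-scheduling path are all assumptions made for the sketch.

```python
import threading

def fetch_params(layer_idx, on_chip_memory):
    # Hypothetical helper: schedule (prefetch) the parameters of a layer
    # from the on-chip parameter store into a working buffer.
    return on_chip_memory[layer_idx]

def accelerate_layer(params, activations):
    # Hypothetical helper: run the accelerated computation for one layer.
    return [params["scale"] * a for a in activations]

def accelerate_network(on_chip_memory, activations):
    """Accelerate each layer while scheduling the next layer's
    parameters in parallel, as described in S101."""
    num_layers = len(on_chip_memory)
    params = fetch_params(0, on_chip_memory)  # first layer's parameters
    for layer in range(num_layers):
        next_layer = (layer + 1) % num_layers  # wrap back to the first layer
        next_params = {}
        # Schedule the next layer's parameters in parallel with compute.
        fetcher = threading.Thread(
            target=lambda: next_params.update(
                fetch_params(next_layer, on_chip_memory)))
        fetcher.start()
        activations = accelerate_layer(params, activations)
        fetcher.join()  # parameter scheduling completes alongside compute
        params = next_params  # the next layer becomes the current layer
    return activations
```

The key point the sketch models is that the parameter fetch for layer N+1 overlaps with the computation of layer N, so the fetch latency is hidden behind the compute time instead of being added to it.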

Embodiment 2

[0063] On the basis of the above embodiments, in the embodiment of the present invention, if the current layer to be accelerated is the last layer, scheduling the parameters of the next layer of the current layer includes:

[0064] scheduling the parameters of the first layer.

[0065] Since each layer of the neural network is accelerated cyclically until the acceleration of the neural network is completed, if the current layer is the last layer, the first layer can be treated as the next layer of the last layer for acceleration processing.

[0066] Therefore, scheduling the parameters of the next layer of the last layer specifically means scheduling the parameters of the first layer.

[0067] In the embodiment of the present invention, when the current layer is the last layer, the parameters of the first layer are scheduled as the parameters of the next layer of the last layer, which ensures that the neural network is cycled layer by layer before the accelera...
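The wrap-around rule of Embodiment 2 (the first layer serves as the "next layer" of the last layer) reduces to modular indexing. A minimal sketch, with the function name and zero-based indexing assumed for illustration:

```python
def next_layer_index(current, num_layers):
    """Return the index of the layer whose parameters should be
    scheduled while `current` is being accelerated. For the last
    layer this wraps around to the first layer (index 0), as in
    Embodiment 2."""
    return (current + 1) % num_layers
```

For a four-layer network, `next_layer_index(3, 4)` returns `0`, so the last layer's parallel fetch loads the first layer's parameters for the next acceleration cycle.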

Embodiment 3

[0069] On the basis of the above embodiments, in the embodiment of the present invention, scheduling the parameters of the next layer of the current layer includes:

[0070] scheduling the parameters of the next layer of the current layer stored in the on-chip memory.

[0071] In order to further improve the acceleration efficiency of the neural network, the parameters of each layer are pre-saved in the internal storage module of the neural network acceleration chip, that is, the on-chip memory, rather than in an external processor, so that the parameters of the next layer can be scheduled more quickly.

[0072] Specifically, the neural network acceleration chip can schedule the parameters of the next layer of the current layer directly from the on-chip memory, or schedule them indirectly through another file, for example a REG (register) file.

[0073] The on-chip memory includes a read-only memory (...
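The two scheduling paths of Embodiment 3 (directly from on-chip memory, or indirectly through a register file) can be sketched as follows. The `OnChipMemory` and `RegFile` classes and their interfaces are illustrative assumptions, not the chip's actual design:

```python
class OnChipMemory:
    """Models the accelerator's internal parameter store, which holds
    every layer's parameters so no off-chip access is needed ([0071])."""
    def __init__(self, layer_params):
        self._layers = list(layer_params)

    def read(self, layer_idx):
        return self._layers[layer_idx]

class RegFile:
    """Models a register (REG) file used for indirect scheduling
    ([0072]): parameters are staged here before being consumed."""
    def __init__(self):
        self._staged = None

    def stage(self, params):
        self._staged = params

    def consume(self):
        params, self._staged = self._staged, None
        return params

def schedule_direct(mem, layer_idx):
    # Direct path: the compute unit reads the next layer's
    # parameters straight from on-chip memory.
    return mem.read(layer_idx)

def schedule_via_regfile(mem, reg, layer_idx):
    # Indirect path: the parameters pass through the register file
    # before the compute unit consumes them.
    reg.stage(mem.read(layer_idx))
    return reg.consume()
```

Either path avoids the off-chip round trip that motivates Embodiment 3; the register-file path adds a staging step that a real design might use to decouple the memory read from the compute unit's timing.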



Abstract

The invention discloses a neural network acceleration method and device, a neural network acceleration chip, and a storage medium. The method comprises the following steps: for a neural network to be accelerated, repeating the following until it is determined that the acceleration of the neural network is completed: performing acceleration processing of the current layer using the parameters of the current layer while scheduling the parameters of the next layer of the current layer; and, when the acceleration processing of the current layer is completed, setting the next layer as the current layer to be accelerated. Because the neural network acceleration chip can schedule the parameters of the next layer in parallel while performing acceleration processing on the current layer, the overall acceleration time of the neural network is shortened and the acceleration efficiency of the neural network is improved.

Description

Technical Field

[0001] The invention relates to the technical field of artificial intelligence, and in particular to a neural network acceleration method and device, a neural network acceleration chip, and a storage medium.

Background Technique

[0002] With the improving accuracy of neural network algorithms represented by deep learning, the overall market for artificial intelligence is gradually expanding, and its huge potential has attracted many chip, algorithm, and application manufacturers. Because artificial intelligence requires a large amount of computation in model training and inference, which traditional computing chips cannot satisfy, chip manufacturers have created chips specialized for neural network algorithms: neural network accelerators.

[0003] When a neural network accelerator is working, it needs to obtain the parameters of the network model layer by l...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/063; G06N3/04
Inventor: 陈海波
Owner: DEEPBLUE TECH (SHANGHAI) CO LTD