
Calculation apparatus and method for accelerator chip accelerating deep neural network algorithm

A computing-apparatus technology for accelerator chips that accelerate deep neural network algorithms, addressing problems such as growth in required main-memory space, an increased number of intermediate-value reads and writes, increased chip power consumption, and non-conformance with low-power accelerator design.

Inactive Publication Date: 2016-04-13
INST OF COMPUTING TECH CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

A major problem with prior approaches is that a large number of intermediate values are generated and must be stored, so the required main-memory space grows.
At the same time, such approaches increase the number of times intermediate values are written to or read from main memory, raising the chip's power consumption, which runs counter to the low-power accelerator-chip design philosophy described above.
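The cost described above can be illustrated with a minimal sketch (not from the patent; the function names and the two-layer operation chain are illustrative assumptions) that counts main-memory accesses for the same computation with and without a local intermediate-value buffer:

```python
# Hedged sketch: count main-memory accesses for a two-layer chain of vector
# operations. Without a local buffer, every intermediate is spilled to and
# fetched back from main memory; with one, it never leaves the module.

def chain_without_buffer(x, accesses):
    h = [v + 1.0 for v in x]          # layer 1 produces an intermediate
    accesses["writes"] += len(h)      # spill intermediate to main memory
    accesses["reads"] += len(h)       # fetch it back for layer 2
    return [v * 2.0 for v in h]       # layer 2

def chain_with_buffer(x, accesses):
    h = [v + 1.0 for v in x]          # intermediate held in local storage area
    return [v * 2.0 for v in h]       # layer 2 reads it locally: no traffic

x = [0.5] * 1024
a1 = {"reads": 0, "writes": 0}
a2 = {"reads": 0, "writes": 0}
y1 = chain_without_buffer(x, a1)
y2 = chain_with_buffer(x, a2)
assert y1 == y2                       # identical result either way
print(a1)  # → {'reads': 1024, 'writes': 1024}
print(a2)  # → {'reads': 0, 'writes': 0}
```

The result is the same either way; only the main-memory traffic (and hence the power draw it models) differs.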

Method used




Embodiment Construction

[0051] To make the object, technical solution, and advantages of the present invention clearer, the computing apparatus and method of the accelerator chip for accelerating deep neural network algorithms are described in further detail below in conjunction with the accompanying drawings.

[0052] Figure 1 is a diagram of the relationship between the constituent modules of the accelerator-chip computing apparatus for accelerating deep neural network algorithms and the main memory. The apparatus includes a main memory 5, a vector addition processor 1, a vector function value calculator 2, and a vector multiplier-adder 3. The vector addition processor 1, the vector function value calculator 2, and the vector multiplier-adder 3 have intermediate-value storage areas 6, 7, and 8, respectively, and can read from and write to the main memory 5 at the same time. The vector addition processor 1 is configured to carry out t...
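The module/memory relationship in Figure 1 can be sketched in a few lines. This is a hedged structural sketch only: the class names (`Module`, `Device`) and the `keep` method are illustrative assumptions, not the patent's terminology.

```python
from dataclasses import dataclass, field

# Sketch of Figure 1: three processing modules, each with its own
# intermediate-value storage area, sharing a single main memory.

@dataclass
class Module:
    name: str
    intermediate_store: list = field(default_factory=list)  # local storage area

    def keep(self, values):
        # Intermediates stay local rather than being spilled to main memory.
        self.intermediate_store = list(values)

@dataclass
class Device:
    main_memory: dict = field(default_factory=dict)
    modules: tuple = ()

dev = Device(modules=(Module("vector_add_processor"),     # item 1 in Figure 1
                      Module("vector_func_calculator"),   # item 2
                      Module("vector_multiplier_adder"))) # item 3
dev.main_memory["input"] = [1.0, 2.0, 3.0]                # read once from main memory
dev.modules[0].keep([x + 1 for x in dev.main_memory["input"]])
```

Each module reads its operands from main memory but retains its own intermediates, which is the arrangement the abstract credits with reducing main-memory traffic.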



Abstract

The invention provides a computing apparatus and method for an accelerator chip that accelerates a deep neural network algorithm. The apparatus comprises a vector addition processor module, a vector function value calculator module, and a vector multiplier-adder module. The vector addition processor module performs vector addition or subtraction and/or the vectorized operation of the pooling-layer algorithm in the deep neural network algorithm; the vector function value calculator module performs the vectorized operation of nonlinear function values in the deep neural network algorithm; and the vector multiplier-adder module performs vector multiply-add operations. The three modules execute programmable instructions and interact to compute the neuron values and network output of the neural network, as well as the synaptic weight changes representing the strength of the effect of input-layer neurons on output-layer neurons. An intermediate-value storage area is arranged in each of the three modules, and each module reads from and writes to the main memory. The apparatus thereby reduces the number of intermediate-value reads and writes to the main memory, lowers the accelerator chip's energy consumption, and avoids data-missing and data-replacement problems during data processing.
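The three module roles named in the abstract can be modeled as plain functions. This is a hedged functional sketch: the signatures, the choice of tanh as the nonlinearity, and the composition into a single layer are illustrative assumptions, not the patent's instruction set.

```python
import math

def vector_add(a, b, subtract=False):
    # Vector addition processor: elementwise add/sub (also usable for pooling sums).
    return [x - y if subtract else x + y for x, y in zip(a, b)]

def vector_func(v, f=math.tanh):
    # Vector function value calculator: elementwise nonlinear function.
    return [f(x) for x in v]

def vector_mac(acc, w, x):
    # Vector multiplier-adder: fused multiply-accumulate, acc + w * x elementwise.
    return [a + wi * xi for a, wi, xi in zip(acc, w, x)]

# Composing the three computes one neuron layer, y = f(W.x + b),
# accumulating one input column at a time.
W = [[0.1, 0.2], [0.3, 0.4]]
x = [1.0, -1.0]
b = [0.05, -0.05]
acc = [0.0, 0.0]
for j in range(len(x)):
    acc = vector_mac(acc, [row[j] for row in W], [x[j]] * len(W))
y = vector_func(vector_add(acc, b))  # neuron values of the output layer
```

The accumulator `acc` is exactly the kind of intermediate value that, per the abstract, stays in a module's local storage area instead of round-tripping through main memory.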

Description

technical field [0001] The invention belongs to the fields of neural network algorithms and computer hardware. More specifically, the present invention relates to a computing apparatus and method of an accelerator chip for accelerating deep neural network algorithms. Background technique [0002] Artificial neural network algorithms have been a research hotspot in the field of artificial intelligence since the 1980s. They abstract the neuron network of the human brain from an information-processing perspective, establish a simple model, and form different networks according to different connection schemes. Such networks have a self-learning function, gradually learning to recognize and predict through training; an associative-storage function, giving the algorithms high robustness; and high parallelism, enabling high-speed search for optimal solutions to complex big-data problems. They have strong plasticity and can fully approach any complex ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/06
CPC: G06N3/063; G06N3/084; G06N3/045; G06F17/16
Inventors: 李震, 刘少礼, 张士锦, 罗韬, 钱诚, 陈云霁, 陈天石
Owner INST OF COMPUTING TECH CHINESE ACAD OF SCI