Computing array based neural network processor

A computing-array-based neural network technology in the field of neural network processors, addressing the problems of large on-chip memory access bandwidth, high memory access power consumption, and high computing-circuit hardware overhead, thereby reducing bandwidth requirements and improving computing efficiency.

Status: Inactive | Publication Date: 2018-04-17
INST OF COMPUTING TECH CHINESE ACAD OF SCI
Cites: 6 | Cited by: 59

AI Technical Summary

Problems solved by technology

[0004] However, a neural network processor is both computation-intensive and memory-intensive. On the one hand, a neural network model involves a large number of multiply-accumulate and other nonlinear operations, so the processor must sustain a high operating load to meet the model's computing requirements. On the other hand, neural network computation involves a large number of parameter iterations, and the computing units must access memory frequently, which greatly raises the processor's bandwidth design requirements and increases memory access power consumption.
[0005] It is therefore necessary to improve existing neural network processors to address the problems of high computing-circuit hardware overhead and large on-chip memory access bandwidth.




Embodiment Construction

[0036] To make the purpose, technical solution, design method, and advantages of the present invention clearer, the present invention is further described in detail below through specific embodiments with reference to the accompanying drawings. It should be understood that the specific embodiments described here are only intended to explain the present invention, not to limit it.

[0037] Figure 1 shows the general topology of a neural network in the prior art. A neural network is a mathematical model built by modeling the structure and behavior of the human brain, and is usually divided into an input layer, one or more hidden layers, and an output layer. Each layer consists of multiple neuron nodes, and the output values of the neuron nodes in one layer (referred to herein as neuron data or node values) are passed as inputs to the neuron nodes of the next layer, connecting the layers one by one. The neur...
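As an illustration of this layer-by-layer dataflow (a minimal sketch, not taken from the patent), the following code propagates neuron data through a small fully connected network; the ReLU activation, the layer sizes, and the function names are arbitrary choices for the example.

```python
import numpy as np

def forward(layers, x):
    """Propagate input x through a list of (weights, bias) pairs;
    each layer's output values (neuron data) become the inputs of
    the next layer."""
    a = x
    for W, b in layers:
        z = W @ a + b            # weighted sum: one multiply-accumulate per connection
        a = np.maximum(z, 0.0)   # nonlinear activation (ReLU chosen for the example)
    return a

# Illustrative 3-layer network with random weights.
rng = np.random.default_rng(0)
layers = [(rng.standard_normal((8, 4)), np.zeros(8)),
          (rng.standard_normal((6, 8)), np.zeros(6)),
          (rng.standard_normal((2, 6)), np.zeros(2))]
print(forward(layers, rng.standard_normal(4)))
```

Each layer is dominated by the matrix-vector product, which is exactly the kind of multiply-accumulate workload the computing array described below is meant to accelerate.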



Abstract

The present invention provides a neural network processor. The processor comprises at least one computing unit consisting of a host processor and a computing array, where the computing array is organized as a two-dimensional row-column matrix of processing units; the host processor controls the loading of neuron data and weight values into the computing array, and each processing unit performs multiply-accumulate operations on the received neuron data and weight values and passes the neuron data and weight values in different directions to the next-level processing units. A control unit controls the computing unit to carry out the relevant computations of the neural network. With the computing device provided by the present invention, neural network computation can be accelerated and the bandwidth requirements of the computation process can be reduced.
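The abstract describes processing units arranged in a row-column matrix that multiply-accumulate and forward neuron data and weights in different directions to neighbouring units. A common way to picture such an array is a systolic matrix multiplication; the sketch below is a simple cycle-by-cycle simulation of that idea, under the assumption that activations move rightward and weights move downward, and is not the specific circuit claimed by the patent. All names and the cycle count are hypothetical.

```python
import numpy as np

def systolic_matmul(A, B):
    """Simulate C = A @ B on an n-by-p grid of processing elements (PEs).
    Each cycle, every PE multiplies the activation it received from its left
    neighbour by the weight it received from its top neighbour, adds the
    product to its local accumulator, then passes the activation right and
    the weight down (one hop per cycle)."""
    n, m = A.shape
    m2, p = B.shape
    assert m == m2
    acc = np.zeros((n, p))       # per-PE partial sums
    a_reg = np.zeros((n, p))     # activation currently held by each PE
    b_reg = np.zeros((n, p))     # weight currently held by each PE
    for t in range(m + n + p - 2):        # enough cycles to drain the array
        # shift: activations move one PE right, weights one PE down
        a_reg = np.hstack([np.zeros((n, 1)), a_reg[:, :-1]])
        b_reg = np.vstack([np.zeros((1, p)), b_reg[:-1, :]])
        # feed skewed inputs at the edges: row i / column j are delayed by
        # i / j cycles so matching operands meet at PE (i, j)
        for i in range(n):
            k = t - i
            if 0 <= k < m:
                a_reg[i, 0] = A[i, k]
        for j in range(p):
            k = t - j
            if 0 <= k < m:
                b_reg[0, j] = B[k, j]
        # every PE performs one multiply-accumulate on the pair it holds
        acc += a_reg * b_reg
    return acc

A = np.arange(6, dtype=float).reshape(2, 3)
B = np.arange(12, dtype=float).reshape(3, 4)
assert np.allclose(systolic_matmul(A, B), A @ B)
```

Because each loaded value is reused as it travels across the array, one memory access feeds many multiply-accumulate operations, which illustrates how an array of this kind can reduce the bandwidth requirements the abstract refers to.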

Description

Technical field
[0001] The invention relates to the technical field of artificial intelligence, and in particular to a neural network processor based on a computing array.

Background technique
[0002] Deep learning is an important branch of machine learning that has made major breakthroughs in recent years. Neural network models trained with deep learning algorithms have achieved remarkable results in application fields such as image recognition, speech processing, and intelligent robotics.

[0003] A deep neural network simulates the neural connection structure of the human brain by building a model, and describes data features hierarchically through multiple transformation stages when processing signals such as images, sounds, and text. As the complexity of neural networks continues to grow, neural network technology faces many problems in practical application, such as occupying large amounts of resources, slow operation speed, and la...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06N3/063; G06N3/08
CPC: G06N3/063; G06N3/08
Inventors: 韩银和, 许浩博, 王颖
Owner: INST OF COMPUTING TECH CHINESE ACAD OF SCI