Dynamic multi-channel neural network SOC (system on a chip) and channel resource distribution method thereof

A neural network and multi-channel technology applied in the field of artificial intelligence equipment, aimed at solving the memory bandwidth problem of artificial intelligence chips.

Active Publication Date: 2018-05-29
FUZHOU ROCKCHIP SEMICON
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0004] The technical problem to be solved by the present invention is to provide a dynamic multi-channel neural network SOC chip and a channel resource allocation method thereof, which relieve the memory bandwidth bottleneck of artificial intelligence chips by dynamically allocating DDR channel resources among the neural network layers.

Method used


Image

  • Dynamic multi-channel neural network SOC (system on a chip) and channel resource distribution method thereof

Examples


Embodiment Construction

[0026] As shown in figure 1, the neural network SOC chip of the present invention includes a neural network circuit and a circuit for dynamically allocating channel resources;

[0027] The neural network circuit includes a plurality of neural network layers. In general, a neural network circuit has hundreds of neural network layers, such as the neuron input layer, convolution layer, pooling layer, activation layer, and fully connected layer shown in the figure; each neural network layer has a data source path;

[0028] The circuit for dynamically allocating channel resources includes a plurality of source statistics units, a DDR access grouping unit, a grouping configuration storage unit, a DDR access path matrix unit, a plurality of terminal statistics units, and a plurality of DDR channels; the source statistics units are respectively connected one-to-one to the data source paths of the neural network layers; the plurality of source ...
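
To visualize how the units listed in paragraph [0028] relate to one another, the following Python model is a minimal sketch of the structure, not the patented hardware: it assumes per-layer byte counters and an index-based layer-to-channel mapping, and all names (SourceStatisticsUnit, GroupingConfigurationStore, DDRAccessPathMatrix, traffic_bytes) are invented for illustration.

```python
# Illustrative software model of the dynamic channel allocation circuit
# (assumed structure, not the patented RTL): each neural network layer has a
# data source path whose traffic is counted by a dedicated source statistics
# unit; a grouping configuration maps source paths onto DDR channels, and the
# DDR access path matrix applies that mapping.
from dataclasses import dataclass, field


@dataclass
class SourceStatisticsUnit:
    layer_name: str          # e.g. "conv1", "pool1", "fc1" (hypothetical names)
    traffic_bytes: int = 0   # data volume observed on this layer's source path

    def count(self, nbytes: int) -> None:
        self.traffic_bytes += nbytes


@dataclass
class GroupingConfigurationStore:
    # layer name -> index of the DDR channel it is currently routed to
    mapping: dict[str, int] = field(default_factory=dict)


class DDRAccessPathMatrix:
    """Crossbar-like connection between data source paths and DDR channels."""

    def __init__(self, num_channels: int) -> None:
        self.num_channels = num_channels

    def connect(self, config: GroupingConfigurationStore) -> None:
        for layer, channel in config.mapping.items():
            assert 0 <= channel < self.num_channels
            # In hardware this would program the switch fabric; here we only report it.
            print(f"route {layer} -> DDR channel {channel}")
```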



Abstract

The invention provides a dynamic multi-channel neural network SOC (system on a chip) and a channel resource distribution method thereof. A dynamic channel resource allocation circuit is additionally arranged on the chip. The circuit comprises source counting units, a DDR (double data rate) access grouping unit, a grouping configuration storage unit, a DDR access path matrix unit, terminal counting units and a plurality of DDR channels. While the dynamic channel resource allocation circuit is working, each source counting unit counts the data traffic of its neural network layer and sends the count to the DDR access grouping unit. The DDR access grouping unit evaluates the input data volume of each neural network layer and the data volume carried by each DDR channel, and adjusts the connection relations among the DDR channels accordingly to form a new DDR channel usage grouping. The DDR access path matrix unit then groups and connects the data source paths and the DDR channels, so that dynamic channel resource allocation is realized and the bandwidth problem of artificial intelligence chips is effectively solved.
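
As a concrete reading of this flow, here is a minimal Python sketch of one regrouping step, assuming a simple greedy load-balancing rule; the function regroup_channels and the heuristic of giving the heaviest layers the least-loaded channels are assumptions for illustration, since the abstract does not fix a specific policy.

```python
# Hypothetical model of the allocation cycle described in the abstract:
# take the per-layer traffic counted by the source counting units, then
# reassign DDR channels so that layer traffic is spread across channels
# (an assumed greedy policy, not necessarily the patented one).
def regroup_channels(traffic_by_layer: dict[str, int], num_channels: int) -> dict[str, int]:
    channel_load = [0] * num_channels
    mapping: dict[str, int] = {}
    # Heaviest layers first, each assigned to the currently least-loaded channel.
    for layer, traffic in sorted(traffic_by_layer.items(), key=lambda kv: -kv[1]):
        channel = min(range(num_channels), key=lambda c: channel_load[c])
        mapping[layer] = channel
        channel_load[channel] += traffic
    return mapping


# Example with made-up traffic figures (bytes per inference pass):
traffic = {"input": 1_000, "conv1": 120_000, "pool1": 30_000, "fc1": 80_000}
print(regroup_channels(traffic, num_channels=2))
# -> {'conv1': 0, 'fc1': 1, 'pool1': 1, 'input': 1}
```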

Description

technical field

[0001] The invention relates to an artificial intelligence device, in particular to a neural network SOC chip and a channel resource allocation method thereof.

Background technique

[0002] With the rapid development of artificial intelligence technology, the performance requirements placed on artificial intelligence equipment keep rising. However, a major obstacle to the rapid development of deep learning neural network equipment is that the structure and operation of neural networks require a large amount of data movement. For example, the reading of neuron, weight, threshold, and convolution kernel data, the intermediate calculation results of each neural network layer, the error calculation and write-back during feedback training, and the write-back of the final result overwhelm the storage structure of existing SOC chips. Memory bandwidth therefore easily becomes a performance bottleneck for deep learning neural networks.

[0003] Therefore, the present i...

Claims


Application Information

IPC(8): G06F15/78, G06N3/04, G06F9/50
CPC: G06F9/5011, G06F15/7807, G06N3/045
Inventor: 廖裕民, 方金木
Owner: FUZHOU ROCKCHIP SEMICON