
Neural-network computing system and methods

A computing system and neural-network technology, applied in the field of artificial-neural-network processing, addressing problems such as performance bottlenecks, high power-consumption overhead, and the lack of dedicated support for multi-layer artificial-neural-network operations, with the effect of improving performance and power efficiency and increasing resource utilization.

Active Publication Date: 2018-08-21
Assignee: CAMBRICON TECH CO LTD

AI Technical Summary

Problems solved by technology

Since the GPU is a device specialized for graphics, image, and scientific computation, it has no dedicated support for multi-layer artificial-neural-network operations; performing such operations still requires a large amount of front-end decoding work, which brings significant additional overhead.
In addition, the GPU has only a small on-chip cache, so the model data (weights) of a multi-layer artificial neural network must be repeatedly moved from off-chip memory. Off-chip bandwidth therefore becomes the main performance bottleneck and incurs huge power-consumption overhead.
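
To make the bandwidth problem concrete, the following back-of-the-envelope sketch (in Python, with hypothetical layer sizes and batch counts that are not taken from the patent) estimates how much off-chip traffic is generated when the weights of a multi-layer network cannot be held in a small on-chip cache and must be re-fetched for every batch.

```python
# Back-of-the-envelope estimate of off-chip weight traffic for a
# multi-layer network whose weights do not fit in on-chip cache.
# Layer sizes and batch count are hypothetical, for illustration only.

layer_dims = [(1024, 4096), (4096, 4096), (4096, 1000)]  # (in, out) per layer
bytes_per_weight = 4          # fp32 weights
batches_per_epoch = 10_000    # assumed workload

weight_bytes = sum(i * o for i, o in layer_dims) * bytes_per_weight
# If the on-chip cache is too small, every batch re-reads all weights:
offchip_traffic = weight_bytes * batches_per_epoch

print(f"model size:       {weight_bytes / 1e6:.1f} MB")
print(f"off-chip traffic: {offchip_traffic / 1e12:.2f} TB per epoch")
```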

Method used

the structure of the environmentally friendly knitted fabric provided by the present invention; figure 2 Flow chart of the yarn wrapping machine for environmentally friendly knitted fabrics and storage devices; image 3 Is the parameter map of the yarn covering machine
View more



Detailed Description of Embodiments

[0049] Embodiments and aspects of the disclosure are described with reference to the details discussed below, and the accompanying drawings illustrate the embodiments. The following description and drawings are illustrative of the present disclosure and should not be construed as limiting it. Numerous specific details are described to provide a thorough understanding of various embodiments of the present disclosure. However, in certain instances, well-known or common details are not described in order to keep the discussion of the embodiments concise.

[0050] "One embodiment" or "an embodiment" mentioned in the specification means that a specific feature, structure or characteristic described in conjunction with the embodiment can be included in at least one embodiment of the present disclosure. The appearances of the phrase "in one embodiment" in various places in this specification are not necessarily all referring to the sa...



Abstract

The disclosure provides a neural-network computing system. The system includes: an I/O interface, used for the input and output of data; a memory, used for temporarily storing the multi-layer artificial-neural-network model and neuron data; an artificial-neural-network chip, used for executing the multi-layer artificial-neural-network operation and its backpropagation training algorithm, which accepts data and a program from a central processing unit (CPU) and executes the above-mentioned operation and training algorithm; and the CPU, used for data transport and start/stop control of the artificial-neural-network chip, which serves as the interface between the artificial-neural-network chip and external control and receives the results after the chip finishes execution. The disclosure also provides a method of applying the above system to artificial-neural-network compression encoding. With this system, the model size of an artificial neural network can be effectively reduced, the data-processing speed of the artificial neural network can be increased, power consumption can be effectively reduced, and resource utilization can be improved.
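
The abstract describes a four-part organization: an I/O interface, a memory for the model and neuron data, an artificial-neural-network chip that runs the forward operation and backpropagation training, and a CPU that transports data and controls the chip. The sketch below mirrors that division of labor in Python purely as a conceptual illustration; every class and method name is an assumption made for this sketch, not an interface defined by the patent.

```python
# Conceptual sketch of the system in the abstract: a CPU that moves data
# and starts/stops an artificial-neural-network chip, a memory that
# temporarily holds the model and neuron data, and an I/O interface.
# All names and signatures are illustrative assumptions.

class IOInterface:
    def read(self): ...
    def write(self, result): ...

class Memory:
    """Temporarily stores the multi-layer ANN model and neuron data."""
    def __init__(self):
        self.model, self.neuron_data = None, None

class ANNChip:
    """Executes the multi-layer ANN operation and its backpropagation training."""
    def run(self, program, model, neuron_data):
        # Forward pass and backpropagation would execute on-chip here.
        return {"output": ..., "updated_model": model}

class CPU:
    """Handles data transport and start/stop control of the ANN chip."""
    def __init__(self, io, memory, chip):
        self.io, self.memory, self.chip = io, memory, chip

    def execute(self, program):
        self.memory.neuron_data = self.io.read()     # data transport
        result = self.chip.run(program,              # start the chip
                               self.memory.model,
                               self.memory.neuron_data)
        self.io.write(result["output"])              # receive chip results
        return result
```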

Description

Technical Field

[0001] The present disclosure relates to the technical field of artificial-neural-network processing, and more particularly to a neural-network computing system and method.

Background

[0002] Multi-layer artificial neural networks are widely used in pattern recognition, image processing, function approximation, and optimization. In recent years, owing to continued progress in backpropagation training algorithms and pre-training algorithms, multi-layer artificial neural networks have attracted widespread attention in academia and industry for their high recognition accuracy and good parallelism.

[0003] As the computation and memory-access requirements of artificial neural networks grow sharply, the prior art usually uses general-purpose processors to handle multi-layer artificial-neural-network operations, training algorithms, and compression coding, by ...
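
For readers unfamiliar with the workload the background refers to, the following minimal NumPy sketch shows a two-layer forward pass and its backpropagation update, the kind of multi-layer artificial-neural-network operation and training step discussed above; the layer sizes, activation function, and learning rate are illustrative assumptions, not values from the patent.

```python
import numpy as np

# Minimal two-layer network: the forward pass and backpropagation update
# that the background identifies as the core multi-layer ANN workload.
# Sizes, activation, and learning rate are illustrative assumptions.

rng = np.random.default_rng(0)
x = rng.standard_normal((32, 64))         # batch of input neurons
t = rng.standard_normal((32, 10))         # training targets
W1 = rng.standard_normal((64, 128)) * 0.1
W2 = rng.standard_normal((128, 10)) * 0.1
lr = 0.01

# Forward pass
h = np.maximum(0.0, x @ W1)               # hidden layer with ReLU
y = h @ W2                                # output layer

# Backpropagation of the squared-error loss
grad_y = (y - t) / len(x)
grad_W2 = h.T @ grad_y
grad_h = (grad_y @ W2.T) * (h > 0)
grad_W1 = x.T @ grad_h

# Gradient-descent weight update
W1 -= lr * grad_W1
W2 -= lr * grad_W2
```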


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06N3/063G06N3/08G06F9/38
CPCG06F9/3887G06N3/063G06N3/084G06F7/4876G06F2207/4824G06N3/00
Inventor 陈天石刘少礼郭崎陈云霁
Owner CAMBRICON TECH CO LTD