
Time-division-multiplexing general neural network processor

A neural-network processor employing time-division multiplexing, applied in the field of accelerator and processor architecture and design. It addresses the problems that existing accelerators are not applicable to large-scale neural networks, have poor versatility, and cannot run different neural network algorithms on the same hardware, while reducing hardware overhead and power consumption and achieving a versatile design.

Active Publication Date: 2015-12-23
INST OF COMPUTING TECH CHINESE ACAD OF SCI
Cites: 3 · Cited by: 92

AI Technical Summary

Problems solved by technology

This method achieves high computational efficiency, but its shortcomings are also prominent: on the one hand, with the popularity of deep learning, the neural networks used in practical applications keep growing in scale, and the topological-mapping approach is difficult to apply to them; on the other hand, accelerators designed by topological mapping suit only neural networks with one specific structure and cannot serve networks with other structures, i.e., different neural network algorithms cannot run on the same hardware accelerator.
[0005] In summary, existing neural network accelerators are not suited to computing large-scale neural networks and have poor versatility.

Method used



Embodiment Construction

[0029] In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below through specific embodiments in conjunction with the accompanying drawings. It should be understood that the specific embodiments described here serve only to explain the present invention, not to limit it.

[0030] According to an embodiment of the present invention, a general neural network processor with time division multiplexing is provided.

[0031] In a nutshell, the general-purpose neural network processor (hereinafter referred to as the processor) provided by the present invention adopts a structure based on storage-control-computation. Specifically:

[0032] The storage part includes: a storage unit for storing instructions and data; a storage unit controller for controlling reading and writing of the storage unit according to the access address; and an input/output interface for exchan...
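To make the storage-control-computation idea concrete, here is a minimal sketch of how time-division multiplexing lets a fixed number of arithmetic units compute a layer of arbitrary width by processing output neurons in successive time slices. All names, the ReLU activation, and the layer sizes are illustrative assumptions, not the patent's actual design.

```python
# Hypothetical sketch: time-division multiplexing a fixed ALU array
# over a fully connected layer wider than the array.
NUM_ALUS = 4  # fixed hardware parallelism (illustrative)

def relu(x):
    return x if x > 0.0 else 0.0

def compute_layer(inputs, weights, biases):
    """Compute one fully connected layer on NUM_ALUS ALUs by
    processing output neurons in time slices of NUM_ALUS each."""
    outputs = [0.0] * len(weights)
    for start in range(0, len(weights), NUM_ALUS):   # one time slice
        for j in range(start, min(start + NUM_ALUS, len(weights))):
            acc = biases[j]
            for i, x in enumerate(inputs):
                acc += weights[j][i] * x             # multiply-accumulate
            outputs[j] = relu(acc)
    return outputs

# A 6-neuron layer runs in 2 time slices on 4 ALUs.
layer_out = compute_layer(
    inputs=[1.0, 2.0],
    weights=[[0.5, 0.25]] * 6,
    biases=[0.1] * 6,
)
```

Because the hardware width is decoupled from the network width, the same unit count serves networks of any scale, which is the versatility the invention claims.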



Abstract

Provided in the invention is a time-division-multiplexing general neural network processor comprising at least one storage unit (100), at least one storage unit controller (101), at least one arithmetic logic unit (103), and a control unit (102). Specifically, the at least one storage unit (100) stores instructions and data. The at least one storage unit controller (101) corresponds to the at least one storage unit (100) and accesses the corresponding storage unit (100). The at least one arithmetic logic unit (103) executes neural network computations. The control unit (102), connected with the at least one storage unit controller (101) and the at least one arithmetic logic unit (103), obtains the instructions stored in the at least one storage unit (100) through the at least one storage unit controller (101) and parses them to control the at least one arithmetic logic unit (103) to execute computation. The general-purpose neural network processor thus provided has high universality and is suitable for computing large-scale neural networks.
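The fetch-parse-dispatch cycle the abstract describes can be sketched in a few lines. The instruction format, the opcodes (LOAD/MAC/HALT), and the dictionary-as-memory are invented purely for illustration; the patent does not disclose an instruction set here.

```python
# Hedged sketch of the control flow in the abstract: the control unit
# fetches instructions from storage via the controller, parses them,
# and drives the ALU. Opcodes and encoding are hypothetical.
memory = {
    0: ("LOAD", [1.0, 2.0, 3.0]),   # load an input vector
    1: ("MAC", [0.5, 0.5, 0.5]),    # multiply-accumulate with weights
    2: ("HALT",),
}

def run(memory):
    buf, acc, pc = [], 0.0, 0
    while True:
        instr = memory[pc]          # storage unit controller: fetch
        op = instr[0]               # control unit: parse
        if op == "LOAD":
            buf = list(instr[1])
        elif op == "MAC":           # ALU executes the computation
            acc += sum(w * x for w, x in zip(instr[1], buf))
        elif op == "HALT":
            return acc
        pc += 1

result = run(memory)  # 0.5*1 + 0.5*2 + 0.5*3 = 3.0
```

Driving generic ALUs from stored instructions, rather than hard-wiring one network topology, is what distinguishes this design from topology-mapped accelerators.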

Description

Technical Field

[0001] The present invention relates to accelerator and processor architecture and design methods, and in particular to hardware acceleration technology for artificial neural networks (Artificial Neural Network, ANN).

Background

[0002] An artificial neural network, or neural network for short, is a computing model consisting of a large number of interconnected nodes (neurons). Each node represents a specific output function, known as an activation function, which may be a linear function, a ramp function, a threshold function, a sigmoid function, a bipolar sigmoid function, or the like. The connection between any two nodes carries a weight for the signal passing through it; the weights are equivalent to the memory of the neural network. The output of the neural network differs according to the network's connection pattern, weights, and activation functions. ...
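The neuron model in paragraph [0002] (a weighted sum of inputs passed through an activation function) can be written directly. The function names and the particular input and weight values below are arbitrary examples, chosen only to illustrate the computation.

```python
import math

def sigmoid(z):
    """Sigmoid activation, one of the activation functions listed above."""
    return 1.0 / (1.0 + math.exp(-z))

def neuron_output(inputs, weights, bias, activation=sigmoid):
    """A node's output: the activation applied to the weighted sum."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return activation(z)

# Weighted sum here is 0.4 - 0.2 - 0.3 + 0.1 = 0.0, so sigmoid gives 0.5.
y = neuron_output([1.0, -2.0, 0.5], [0.4, 0.1, -0.6], bias=0.1)
```

Swapping `activation` for a ramp or threshold function changes the node's behavior without changing its structure, which is why the same hardware multiply-accumulate path can serve many network types.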

Claims


Application Information

IPC(8): G06N3/063
Inventors: 韩银和 (Han Yinhe), 王颖 (Wang Ying)
Owner INST OF COMPUTING TECH CHINESE ACAD OF SCI