Neural network operation device and method

Neural network and computing-device technology, applied in the field of information technology, which addresses problems such as insufficient memory-access bandwidth, tight computing resources, and high power consumption, and achieves the effects of ensuring correctness and efficiency, resolving insufficient computing performance, and reducing data volume.

Active Publication Date: 2019-10-01
CAMBRICON TECH CO LTD

AI Technical Summary

Problems solved by technology

However, current neural network computing platforms need to set up a separate processing module for each type of neural network data, which results in a shortage of computing resources and related problems such as insufficient memory-access bandwidth and high power consumption.



Examples


Example 1

[0044] In a first exemplary embodiment of the present disclosure, a neural network computing device is provided. Referring to Figure 1, the neural network computing device in this embodiment includes a control unit 100, a storage unit 200, a sparse selection unit 300, and a neural network operation unit 400. The storage unit 200 is used to store neural network data. The control unit 100 is configured to generate microinstructions corresponding respectively to the sparse selection unit and the neural network operation unit, and to send the microinstructions to the corresponding units. The sparse selection unit 300 is used to select, from the neural network data stored in the storage unit 200, the neural network data corresponding to the effective weight values to participate in the operation, according to the microinstruction corresponding to the sparse selection unit issued by the control unit and the position information represented by the sparse data therein...
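To make the sparse selection step concrete, the following minimal NumPy sketch gathers only the input data whose weights are effective (non-zero), using position information of the kind the sparse data carries, before handing the result to the operation unit. The names sparse_select, positions, and effective_weights are illustrative assumptions, not taken from the disclosure.

    # Minimal sketch of a sparse selection step (illustrative names, not the patent's API).
    import numpy as np

    def sparse_select(input_neurons: np.ndarray,
                      positions: np.ndarray,
                      effective_weights: np.ndarray) -> np.ndarray:
        """Pick only the inputs whose weights are effective (non-zero), so the
        downstream operation unit never touches zero-weight data."""
        selected_inputs = input_neurons[positions]    # gather by position information
        return selected_inputs * effective_weights    # partial products for the operation unit

    # Example: a layer with 8 inputs but only 3 effective weights.
    inputs = np.arange(8, dtype=np.float32)
    positions = np.array([1, 4, 6])                   # where the non-zero weights live
    weights = np.array([0.5, -1.0, 2.0], dtype=np.float32)
    print(sparse_select(inputs, positions, weights))  # -> [ 0.5 -4. 12.]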

Example 3

[0069] In a third exemplary embodiment of the present disclosure, a neural network computing device is provided. Compared with the second embodiment, the neural network computing device in this embodiment differs in that a dependency processing function is added to the control unit 100.

[0070] Referring to Figure 6, according to an embodiment of the present disclosure, the control unit 100 includes: an instruction cache module 110, configured to store a neural network instruction to be executed, the neural network instruction including address information of the neural network data to be processed; an instruction fetch module 120, configured to obtain the neural network instruction from the instruction cache module; and a decoding module 130, configured to decode the neural network instruction to obtain microinstructions corresponding respectively to the storage unit 200, the sparse selection unit 300, and the neural network operation unit 400, which include the address inf...
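As a rough illustration of the control-unit flow in this embodiment (instruction cache, instruction fetch, decoding into per-unit microinstructions, plus the added dependency check), the Python sketch below uses invented class and field names; the read-after-write test on operand addresses is one plausible reading of the dependency processing function, not the mechanism specified by the patent.

    # Hedged sketch of the control unit: cache -> fetch -> decode -> hazard check.
    from dataclasses import dataclass
    from collections import deque

    @dataclass
    class MicroInstruction:
        target_unit: str                 # "storage", "sparse_select", or "operation"
        reads: frozenset = frozenset()   # addresses this micro-op reads
        writes: frozenset = frozenset()  # addresses this micro-op writes

    class ControlUnit:
        def __init__(self, instructions):
            self.instruction_cache = deque(instructions)  # instruction cache module 110
            self.in_flight = []                           # issued but unfinished micro-ops

        def fetch(self):
            # instruction fetch module 120: take the next instruction from the cache
            return self.instruction_cache.popleft() if self.instruction_cache else None

        @staticmethod
        def decode(instruction):
            # decoding module 130: one neural network instruction becomes three
            # microinstructions, one per downstream unit, carrying its address information
            addr = frozenset(instruction["addresses"])
            return [MicroInstruction(unit, reads=addr)
                    for unit in ("storage", "sparse_select", "operation")]

        def has_dependency(self, micro):
            # dependency processing: stall when this micro-op reads an address that an
            # earlier, still-in-flight micro-op writes (read-after-write hazard)
            return any(micro.reads & earlier.writes for earlier in self.in_flight)

A real control unit would also queue the decoded microinstructions and retire them once each unit reports completion; this sketch only covers the fetch, decode, and hazard-check path.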

Example 5

[0083] Based on the neural network computing device of the third embodiment, the present disclosure further provides a sparse neural network data processing method for performing sparse neural network computation according to computing instructions. As shown in Figure 8, the processing method for sparse neural network data in this embodiment includes:

[0084] Step S801, the instruction fetch module fetches the neural network instruction from the instruction cache module and sends the neural network instruction to the decoding module;

[0085] Step S802, the decoding module decodes the neural network instruction to obtain microinstructions corresponding respectively to the storage unit, the sparse selection unit, and the neural network operation unit, and sends each microinstruction to the instruction queue;

[0086] Step S803, the neural network operation code and the neural network operation operand of the microinstruction are obtained from the scalar register file, ...
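The step sequence above can be read as a single dispatch routine. The sketch below maps steps S801-S803 onto plain Python, modeling the scalar register file as a dict from register index to value; the helper names (dispatch_one_instruction, toy_decode) and the microinstruction layout are assumptions made for illustration only.

    def dispatch_one_instruction(instruction_cache, decode, scalar_register_file):
        """Run steps S801-S803 for a single neural network instruction."""
        # S801: the instruction fetch module takes the next instruction from the cache
        instruction = instruction_cache.pop(0)
        # S802: the decoding module splits it into per-unit microinstructions and queues them
        instruction_queue = list(decode(instruction))
        # S803: resolve the operation code and operand registers through the scalar register file
        for micro in instruction_queue:
            micro["opcode"] = scalar_register_file[micro.pop("opcode_reg")]
            micro["operand"] = scalar_register_file[micro.pop("operand_reg")]
        return instruction_queue

    # Tiny usage example with a toy decoder and two scalar registers.
    def toy_decode(instr):
        return [{"unit": u, "opcode_reg": instr["op"], "operand_reg": instr["data"]}
                for u in ("storage", "sparse_select", "operation")]

    registers = {0: "MATMUL", 1: 0x1000}
    queue = dispatch_one_instruction([{"op": 0, "data": 1}], toy_decode, registers)
    print(queue)  # three microinstructions, each carrying the opcode and operand address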



Abstract

The invention provides a neural network operation device and method. The neural network operation device comprises a control unit, a storage unit, a sparse selection unit, and a neural network operation unit. The control unit is used for generating microinstructions respectively corresponding to the units and sending the microinstructions to the corresponding units. The sparse selection unit is used for selecting, from the neural network data stored in the storage unit, the neural network data corresponding to the effective weight values to participate in operation, according to the microinstruction issued by the control unit for the sparse selection unit and the position information represented by the sparse data in the microinstruction. The neural network operation unit is used for performing the neural network operation on the neural network data selected by the sparse selection unit, according to the microinstruction issued by the control unit for the neural network operation unit, to obtain an operation result. The device improves the capability to process different types of neural network data, increases the speed of neural network operation, and reduces power consumption.

Description

Technical field

[0001] The present disclosure relates to the field of information technology, and in particular to a neural network computing device and method compatible with general neural network data, sparse neural network data, and discrete neural network data.

Background technique

[0002] Artificial Neural Networks (ANNs), referred to as Neural Networks (NNs), are algorithmic mathematical models that imitate the behavioral characteristics of animal neural networks and perform distributed parallel information processing. Such a network, depending on the complexity of the system, achieves the purpose of processing information by adjusting the interconnection relationships among a large number of internal nodes. At present, neural networks have made great progress in many fields such as intelligent control and machine learning. With the continuous development of deep learning technology, the scale of current neural network models is getting larger and large...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/063
CPC: G06N3/063
Inventor: 陈天石, 刘少礼, 陈云霁
Owner: CAMBRICON TECH CO LTD