
Neural network data serial pipeline processing method for artificial intelligence computation

A neural network pipeline-processing technology in the computer field, addressing problems such as power consumption, delay, and large data volume, with the effect of greatly reducing power consumption and delay and improving processing efficiency.

Active Publication Date: 2018-08-24
江苏金羿智芯科技有限公司

AI Technical Summary

Problems solved by technology

[0003] The computation of a neural network model is essentially multiply-add operations on small matrices. The massive parallelism implies a huge data volume, and the continuous reading and writing of data and parameters from external storage is the biggest bottleneck, generating substantial power consumption and delay.
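To make the multiply-add core concrete, here is a minimal sketch (illustrative only, not taken from the patent) of the small-matrix multiply-accumulate that the paragraph above describes; the triple loop shows why every input re-reads the full weight matrix when parameters live in external storage.

```python
# Illustrative sketch: the core layer computation C = A @ B + bias on small
# dense matrices, written with plain nested lists. Every evaluation touches
# all n*k*m weight/activation pairs, which is why repeated external reads of
# parameters dominate power and latency.

def matmul_add(a, b, bias):
    """Return A @ B + bias for small matrices given as nested lists."""
    n, k, m = len(a), len(b), len(b[0])
    out = [[bias[i][j] for j in range(m)] for i in range(n)]
    for i in range(n):
        for j in range(m):
            for p in range(k):           # one multiply-add per weight read
                out[i][j] += a[i][p] * b[p][j]
    return out

a = [[1, 2], [3, 4]]
b = [[5, 6], [7, 8]]
bias = [[1, 1], [1, 1]]
print(matmul_add(a, b, bias))  # [[20, 23], [44, 51]]
```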


[Figure] Neural network data serial pipeline processing method for artificial intelligence computation

Examples


Embodiment Construction

[0028] In order to make the purpose, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0029] As shown in Figure 1, an embodiment of the present invention discloses a neural network data serial pipeline processing method for artificial intelligence computation, which may include the following steps 101 to 104:

[0030] 101. After receiving the initial data, the layer data processing module of the first layer of the neural ...



Abstract

The present invention relates to a serial pipeline processing method for neural network data, oriented to artificial intelligence computation. The method comprises the steps of: after the layer data processing module of the first layer of the neural network receives the initial data, performing an in-layer parallel operation; after the layer data processing module of each middle layer receives the operation result serially output by the preceding layer, performing an in-layer parallel operation; and finally, after the layer data processing module of the final layer receives the operation result serially output by the preceding layer, performing an in-layer parallel operation and serially outputting the operation result. If the initial data is input multiple times, all the layer data processing modules perform pipeline processing on the repeatedly input data. The method allows each layer of the neural network to correspond to a different layer data processing module, and each module performs the in-layer parallel operation on its own layer's data without external interaction, thereby reducing the substantial power consumption and delay caused by continually reading and writing data and parameters from external storage; in addition, repeatedly input data can be pipeline-processed to improve the processing efficiency of neural network data.
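A back-of-envelope sketch (my arithmetic, not a figure from the patent) shows why pipelining repeated inputs improves throughput: if each of L layers takes one time step, processing N inputs one after another costs N*L steps, while a full pipeline overlaps them at a cost of L + N - 1 steps.

```python
# Idealized step counts for N inputs through L single-step layers,
# comparing sequential processing against a filled pipeline.

def steps_sequential(n_inputs, n_layers):
    # Each input must traverse all layers before the next one starts.
    return n_inputs * n_layers

def steps_pipelined(n_inputs, n_layers):
    # After the pipeline fills (n_layers steps), one result emerges per step.
    return n_layers + n_inputs - 1

print(steps_sequential(8, 4))  # 32
print(steps_pipelined(8, 4))   # 11
```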

Description

technical field

[0001] Embodiments of the present invention relate to the field of computer technology, and in particular to a neural network data serial pipeline processing method oriented to artificial intelligence computation.

Background technique

[0002] The neural network is the most widely used tool in the field of artificial intelligence. There are many types of neural networks. Taking the deep convolutional neural network as an example, different models have different numbers of layers, but the calculations involved fall into six main types: full connection, convolution, pooling, nonlinearity, vector operation, and matrix addition. These six methods are mature technologies and are not described further here.

[0003] The calculation method of the neural network model is essentially the multiplication and addition operation of small matrices. The huge parallelism brings a huge amount of data, and the continuous r...
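For reference, the six calculation types named above can each be written as a one-line operation on plain Python lists. These are my own minimal illustrative forms (1-D where possible), not definitions from the patent:

```python
# Minimal illustrative forms of the six calculation types.

def full_connection(x, w):            # dot product of input with one weight row
    return sum(a * b for a, b in zip(x, w))

def convolution(x, k):                # 1-D "valid" convolution (correlation form)
    return [sum(x[i + j] * k[j] for j in range(len(k)))
            for i in range(len(x) - len(k) + 1)]

def pooling(x, size=2):               # non-overlapping max pooling
    return [max(x[i:i + size]) for i in range(0, len(x), size)]

def nonlinearity(x):                  # ReLU as the canonical nonlinearity
    return [max(0, v) for v in x]

def vector_op(x, y):                  # elementwise product as a vector operation
    return [a * b for a, b in zip(x, y)]

def matrix_add(a, b):                 # elementwise matrix addition
    return [[p + q for p, q in zip(r1, r2)] for r1, r2 in zip(a, b)]

print(full_connection([1, 2], [3, 4]))   # 11
print(convolution([1, 2, 3], [1, 1]))    # [3, 5]
print(pooling([1, 4, 2, 3]))             # [4, 3]
print(nonlinearity([-1, 2]))             # [0, 2]
```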

Claims


Application Information

IPC(8): G06N3/04, G06N3/08
CPC: G06N3/08, G06N3/045
Inventor: 陈明书
Owner: 江苏金羿智芯科技有限公司