
Convolutional neural network back propagation mapping method based on cyclic recombination and blocking

A technology combining convolutional neural networks with the backpropagation algorithm, applied to on-chip retraining. It addresses problems such as the large computation and storage demands of training, the scarcity of those resources on edge devices, and the increased difficulty of designing supporting software for differing accelerator architectures.

Pending Publication Date: 2021-05-25
SOUTHEAST UNIV

AI Technical Summary

Problems solved by technology

[0004] Only a few neural network chips support on-chip training. The biggest difficulty is that training requires far more computation and storage than forward inference, and both resources are very scarce on edge devices. In addition, the internal data flow differs between neural network accelerator chips, so when performing backpropagation tasks the supporting software must map the task onto each accelerator architecture separately, which increases the difficulty of designing that software.

Method used




Embodiment Construction

[0029] Backpropagation in a convolutional neural network is usually divided into two steps: error propagation, then weight update. The error-propagation step is essentially similar to forward propagation; only the corresponding input map and weights change, and the scale of the convolution matches that of forward propagation. In the weight-update step, however, the input map becomes the activation tensor of the previous layer, and the weight becomes the error tensor of the current layer. These two tensors may far exceed the scale of forward propagation: the current layer's error tensor, used as the weight, may reach 100 pixels x 100 pixels, whereas forward kernels are usually only 3x3 or 5x5. Accelerators that support only the forward direction are therefore usually poorly suited to such large convolutions.
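To illustrate the scale mismatch described above, the following sketch (a hypothetical single-channel NumPy implementation, not code from the patent; all sizes are assumed for illustration) computes the weight gradient of a convolution layer. The current layer's 100x100 error map plays the role of the kernel, far larger than the 3x3 forward kernel:

```python
import numpy as np

def conv2d_valid(x, k):
    """'Valid' 2-D cross-correlation: slide kernel k over input x."""
    kh, kw = k.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

rng = np.random.default_rng(0)

# Forward pass: a small 3x3 weight kernel over a 102x102 activation map.
act = rng.random((102, 102))     # previous layer's activations
w = rng.random((3, 3))           # forward weights
y = conv2d_valid(act, w)         # output / error-map shape: (100, 100)

# Weight update: the current layer's 100x100 error map becomes the kernel,
# convolved over the activations -- a far larger convolution than forward.
err = rng.random((100, 100))     # gradient w.r.t. y, same shape as y
dW = conv2d_valid(act, err)      # one partial derivative per weight: (3, 3)
```

The output of the weight-update convolution is small (one value per weight), but its "kernel" is the full error map, which is exactly the shape a forward-only engine is not built for.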

[0030] The technical solution provided b...



Abstract

The invention discloses a convolutional neural network backpropagation mapping method based on cyclic recombination and blocking. The method is suitable for on-chip neural-network-training accelerator circuits and belongs to the field of neural network accelerators. Through data scheduling alone, it maps the backpropagation algorithm of neural network training onto an existing neural network accelerator engine designed for forward inference: the large convolution operations in backpropagation are remapped so that their scale and dimensions fit a forward-inference accelerator. With almost no change to the hardware architecture, the specific large convolutions of the backpropagation algorithm are recombined, blocked, and mapped onto the existing forward-inference engine. An accelerator that could originally only perform forward inference can thus easily accommodate the backpropagation algorithm and train the neural network on chip.
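One way to picture the recombination-and-blocking idea in the abstract (an illustrative sketch under assumed sizes, not the patent's exact scheduling scheme) is to split the large error-map "kernel" into small tiles that a forward-inference engine could handle, run each tile as a small convolution, and accumulate the partial sums:

```python
import numpy as np

def conv2d_valid(x, k):
    """'Valid' 2-D cross-correlation, standing in for the accelerator engine."""
    kh, kw = k.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

rng = np.random.default_rng(1)
act = rng.random((102, 102))     # previous layer's activations
err = rng.random((100, 100))     # current layer's error map (the big "kernel")

B = 10                           # assumed tile size the forward engine supports
dW = np.zeros((3, 3))            # weight gradient for a 3x3 forward kernel
for p0 in range(0, 100, B):
    for q0 in range(0, 100, B):
        tile = err[p0:p0 + B, q0:q0 + B]            # small B x B kernel
        window = act[p0:p0 + B + 2, q0:q0 + B + 2]  # tile + 2-pixel halo
        dW += conv2d_valid(window, tile)            # accumulate partial sums

# The blocked result equals the single large convolution.
assert np.allclose(dW, conv2d_valid(act, err))
```

Each inner call is a convolution of forward-inference scale, so the engine's existing datapath can execute it; only the scheduling of tiles and the accumulation of partial sums are new.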

Description

technical field [0001] The invention belongs to the field of neural network accelerators, in particular on-chip retraining. It adapts the backpropagation algorithm to neural network accelerators that can only perform forward inference, so that a neural network can be trained on the chip itself rather than trained on a server and then deployed. Background technique [0002] On-chip training refers to retraining, by a specific method, a neural network already deployed on a neural network accelerator chip. Traditionally, the model is trained with big data on the server side and then deployed to the chip through an interface for forward inference. Neural network accelerator chips for edge devices can therefore generally only perform inference tasks, not training tasks. [0003] In recent years, the problem of user privacy leakage has become increasingly serious. As...

Claims


Application Information

IPC(8): G06N3/08, G06N5/04, G06N3/04
CPC: G06N3/084, G06N5/046, G06N3/04
Inventor 单伟伟
Owner SOUTHEAST UNIV