
Ping-pong storage method and device for sparse neural network

A ping-pong storage and neural network technology, applied to biological neural network models, neural architectures, and the input/output processes of data processing, to achieve the effect of ping-pong storage.

Active Publication Date: 2020-11-13
南京风兴科技有限公司

AI Technical Summary

Problems solved by technology

[0005] In view of the above technical problems in the prior art, the present invention provides a ping-pong storage method and device for sparse neural networks, which solves the problem of determining switching points during ping-pong storage of a sparse neural network and thereby realizes ping-pong storage of sparse neural networks.

Embodiment Construction

[0044] In order to enable those skilled in the art to better understand the technical solutions in the embodiments of the present invention, and to make the above purposes, features and advantages of these embodiments more apparent and understandable, the technical solutions in the embodiments of the present invention are described in further detail below in conjunction with the accompanying drawings.

[0045] Figure 1 is a schematic diagram of a ping-pong storage system for sparse neural networks. As shown in Figure 1, the system includes a sparse processing unit in the general-purpose server and, in the sparse neural network processor, a control unit, two weight storage units M0 and M1, and a calculation unit. The sparse processing unit serves as the processing unit for the weight data; in addition to performing sparse processing on the weight data, it also adds configuration bits to the weight data during processing. ...
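The following is a minimal sketch of the role described above for the sparse processing unit, under the assumption of a hypothetical group format: pruned weights are split into fixed-size groups and each group is prefixed with a small header ("configuration bits") recording its number of non-zero weights, so that downstream hardware can locate group boundaries. The exact configuration-bit format used by the invention is not specified here; names such as sparse_process, group_size and threshold are illustrative only.

from typing import List, Tuple

def sparse_process(weights: List[float], group_size: int = 4,
                   threshold: float = 0.0) -> List[Tuple[int, List[float]]]:
    """Return a list of (config_bits, nonzero_weights) groups (toy format)."""
    groups = []
    for start in range(0, len(weights), group_size):
        group = weights[start:start + group_size]
        nonzero = [w for w in group if abs(w) > threshold]
        config_bits = len(nonzero)   # here: simply the non-zero count per group
        groups.append((config_bits, nonzero))
    return groups

# Example: 8 weights split into 2 groups, each tagged with its non-zero count.
print(sparse_process([0.0, 1.2, 0.0, -0.5, 0.0, 0.0, 0.7, 0.0]))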

Abstract

The invention discloses a ping-pong storage method and device for a sparse neural network, which solve the problem that loading a large amount of weight data at one time during sparse neural network computation consumes excessive on-chip storage resources. The ping-pong storage device comprises a sparse processing unit, a weight storage unit M0, a weight storage unit M1, a calculation unit and a control unit. In the disclosed ping-pong storage method, configuration bits are added to the weight data after sparse processing, M0 and M1 cyclically store the weight data with the added configuration bits in a ping-pong manner, and the calculation unit calculates a switching point for each group of data. Each time the convolution calculation reaches a switching point and the next group of data has been completely loaded, data switching is carried out, and the control unit is responsible for controlling each unit to complete the above work. The method solves the problem of determining the switching point during ping-pong storage of a sparse neural network, thereby realizing ping-pong storage of the sparse neural network.
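As an illustration of the ping-pong scheme summarized above, the following sketch alternates between two buffers standing in for M0 and M1: while the convolution consumes one buffer, the next group of weight data is loaded into the other, and the buffers are swapped once the computation reaches the switching point and the next group has been completely loaded. It assumes, for simplicity, that the switching point coincides with the end of the group currently being consumed; the names group_stream, compute_with and load_group are hypothetical stand-ins, not parts of the patented device.

from collections import deque

def ping_pong_run(group_stream, compute_with, load_group):
    buffers = [None, None]          # stand-ins for M0 and M1
    pending = deque(group_stream)   # weight groups still to be loaded
    active = 0                      # index of the buffer being computed from

    buffers[active] = load_group(pending.popleft())   # prime the first buffer
    while buffers[active] is not None:
        # Load the next group into the idle buffer while computing.
        idle = 1 - active
        buffers[idle] = load_group(pending.popleft()) if pending else None

        compute_with(buffers[active])   # convolve up to the switching point

        # Switching point reached and the next group fully loaded: swap buffers.
        active = idle

# Example usage with trivial stand-ins for loading and computing.
ping_pong_run(
    group_stream=[[1, 2], [3], [4, 5, 6]],
    compute_with=lambda group: print("computing with", group),
    load_group=lambda group: group,
)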

Description

Technical field

[0001] The present invention relates to the field of hardware-accelerated convolutional neural networks, and in particular to a ping-pong storage method and device for sparse neural networks.

Background technique

[0002] Convolutional neural networks (CNNs, or deep convolutional neural networks, DCNNs) are quite different from most other networks. They are primarily used for image processing, but can also be applied to other types of input, such as audio. The sparsity of a neural network actually refers to the sparsity of its weights: the samples are converted into a suitable sparse representation, which simplifies the learning task and reduces the complexity of the model; this is usually called sparse coding. "Sparseness" is defined as having only a few non-zero elements, or only a few elements that are much greater than zero. There is a reason for choosing to represent the input data with sparse components, because the vast majority of sensory data, ...
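To make the notion of weight sparsity concrete, the toy example below prunes small weights from a matrix and stores only the surviving values together with their flat indices, then reports the resulting sparsity. This is a generic, CSR-like illustration of sparse coding, not the representation claimed by the invention; the threshold and function names are assumptions made for the example.

import numpy as np

def to_sparse(weights: np.ndarray, threshold: float = 0.0):
    """Keep only entries whose magnitude exceeds `threshold` and return
    (values, flat_indices, sparsity)."""
    flat = weights.ravel()
    mask = np.abs(flat) > threshold
    values = flat[mask]
    indices = np.nonzero(mask)[0]
    sparsity = 1.0 - values.size / flat.size
    return values, indices, sparsity

if __name__ == "__main__":
    w = np.random.randn(8, 8)
    w[np.abs(w) < 1.0] = 0.0                    # prune small weights
    vals, idx, s = to_sparse(w)
    print(f"kept {vals.size}/{w.size} weights, sparsity = {s:.2f}")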


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06N3/04, G06N3/063, G06F3/06
CPC: G06N3/063, G06F3/061, G06F3/0629, G06F3/0688, G06N3/045
Inventor: 陶为, 王中风, 林军, 王丹阳
Owner: 南京风兴科技有限公司