
Run length coding accelerator and method for sparse CNN neural network model

A run-length coding technology for neural network models, applied in the fields of digital signal processing and hardware-accelerated neural network algorithms. It addresses the problem of high memory-access energy consumption, and achieves a reduced data scale, savings in power consumption and compute, and high energy efficiency.

Pending Publication Date: 2022-01-07
NANJING UNIV

AI Technical Summary

Problems solved by the technology

[0006] To overcome the large number of memory accesses and the high energy consumption incurred when computing a neural network from memory, the present invention proposes a run-length coding accelerator and an associated acceleration method for sparse CNN models.




Embodiment Construction

[0030] The present invention will be described in detail below in conjunction with the accompanying drawings.

[0031] Figure 1 shows a structural block diagram of a run-length coding accelerator for a sparse CNN neural network model. The accelerator includes: a top-level controller, which identifies the input and output data types, performs grouping, and passes inputs such as weight data and excitation data to the run-length encoding module; a run-length encoding module, which compresses the result data output by the computing module and transmits the compression-encoded result to external memory (DRAM); a run-length decoding module, which decompresses the data read from external memory and transmits it to the data gating; and the data gating, which identifies zero values in the decoded input excitation data and weight data, records the positions of those zero values from the data expansion into a vector array, and skips the multiplication and addition operations for them...
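The run-length encoding/decoding pair described above can be sketched as follows. This is a minimal illustration of the general technique, not the patent's bit-level format: the patent does not disclose the encoded layout, so the `(zero_count, value)` pair representation and all names here are assumptions.

```python
# Hypothetical sketch of run-length coding for sparse CNN data:
# each nonzero value is stored together with the count of zeros
# preceding it, so long zero runs collapse into a single pair.

def rle_encode(values):
    """Compress a 1-D sequence into (zeros_before, value) pairs."""
    encoded = []
    zero_run = 0
    for v in values:
        if v == 0:
            zero_run += 1
        else:
            encoded.append((zero_run, v))
            zero_run = 0
    if zero_run:                     # trailing zeros, flagged with value 0
        encoded.append((zero_run, 0))
    return encoded

def rle_decode(encoded):
    """Expand (zeros_before, value) pairs back into the dense sequence."""
    decoded = []
    for zero_run, v in encoded:
        decoded.extend([0] * zero_run)
        if v != 0:
            decoded.append(v)
    return decoded

# A sparse excitation vector shrinks from 8 entries to 3 pairs:
# rle_encode([0, 0, 3, 0, 5, 0, 0, 0]) → [(2, 3), (1, 5), (3, 0)]
```

For the highly sparse weight and excitation streams of a pruned CNN, this is what reduces the data scale moved between the accelerator and DRAM.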



Abstract

The invention discloses a run-length coding accelerator for a sparse CNN neural network model and a method thereof. The accelerator comprises: a top-level controller, which identifies the type of input and output data, carries out grouping, and transmits weight data and excitation data to the run-length encoding module; a run-length encoding module, which compresses the result data output by the calculation module and transmits the compression-encoded result to an external memory; a run-length decoding module, which decompresses the data read from the external memory and transmits it to the data gating; a data gating unit, which identifies zero values in the input excitation data and weight data and skips the multiply-add operations for those zeros; and a calculation module, which executes multiply-add operations on the weight and excitation data transmitted by the data gating and outputs the result. By combining data compression with data gating, the storage space and power consumption required by the CNN model are reduced, giving the method advantages such as high energy efficiency and a greatly reduced number of memory accesses.
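The data-gating step in the abstract can be sketched as a zero-skipping multiply-accumulate. This is an illustrative model of the general idea only; the function and variable names are hypothetical, and the hardware presumably gates the multiplier datapath rather than branching as software does.

```python
# Hedged sketch of data gating: detect zeros in the weight and
# excitation streams and skip the MAC whenever either operand is zero,
# saving the corresponding multiply-add work.

def gated_mac(weights, excitations):
    """Dot product that skips MACs where either operand is zero.

    Returns (accumulated result, number of MACs actually performed).
    """
    acc = 0
    macs_done = 0
    for w, x in zip(weights, excitations):
        if w == 0 or x == 0:      # data gating: no multiply issued
            continue
        acc += w * x
        macs_done += 1
    return acc, macs_done

result, macs = gated_mac([0, 2, 0, 3], [5, 0, 7, 1])
# only the (3, 1) pair is nonzero on both sides → result 3, one MAC
```

In a pruned CNN, where most weights (and, after ReLU, many excitations) are zero, the fraction of skipped MACs tracks the sparsity of the two streams, which is where the claimed compute and power savings come from.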

Description

Technical field

[0001] The invention relates to a run-length coding accelerator for a sparse CNN neural network model and a method thereof, and belongs to the field of digital signal processing and hardware-accelerated neural network algorithms.

Background

[0002] Neural networks are currently a research hotspot in artificial intelligence. LeNet, one of the earliest convolutional neural networks, promoted the development of deep learning; constrained by the computer performance of its time, LeNet achieved good results in image classification but did not attract wide attention. Later CNNs deepened the network structure on the basis of LeNet and learned richer, higher-dimensional image features, using a deeper architecture (convolutional layer + convolutional layer + pooling layer) to extract image features and replacing the earlier sigmoid with ReLU as the activation function. On ImageNet LSVRC-2010 in 2010, CNN achieved...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/04, G06N3/063, G06N3/08
CPC: G06N3/063, G06N3/08, G06N3/045
Inventors: 王宇宣, 苏杭, 潘红兵, 彭成磊
Owner: NANJING UNIV