
A Hardware Architecture of a Model Compression-Based Recurrent Neural Network Accelerator

A recurrent neural network hardware-architecture technology, applied to biological neural network models and their physical implementation, which solves the problem that recurrent neural networks cannot meet the low-power and low-latency requirements of embedded systems, and achieves low power consumption, strong scalability, and high throughput.

Active Publication Date: 2021-02-05
南京风兴科技有限公司
Cites: 7 · Cited by: 0

AI Technical Summary

Problems solved by technology

[0006] The technical problem to be solved by the present invention is to propose a hardware architecture of a model compression-based recurrent neural network accelerator, so that the recurrent neural network can operate at low power consumption, making applications on real-time embedded systems possible.




Embodiment Construction

[0023] Embodiments of the present invention are described in detail below. This embodiment contains definitions and descriptions of the input and output variables of multiple hardware units, as well as specific examples cited to illustrate particular functions; these are intended to explain the present invention and should not be construed as limiting it. Since recurrent neural networks include many variants, this embodiment does not restrict itself to any specific variant type, but discusses only the general case.

[0024] The basic unit of a recurrent neural network with n input and output nodes can be defined as:

[0025] h_t = f(W·x_t + U·h_{t-1} + b),  (1)

[0026] where h_t ∈ R^(n×1) is the hidden state (intermediate state) of the recurrent neural network at time t and also serves as the output at time t; x_t ∈ R^(n×1) is the input vector at time t; and W, U ∈ R^(n×n), b ∈ R^(n×1) are the model parameters of the recurrent neural network...
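The basic unit in equation (1) can be sketched as a minimal NumPy reference model. The nonlinearity f is not fixed by the text, so tanh is assumed here purely for illustration; all names (`rnn_cell`, the toy dimensions) are hypothetical, not from the patent.

```python
import numpy as np

def rnn_cell(x_t, h_prev, W, U, b, f=np.tanh):
    """One step of the basic recurrent unit: h_t = f(W x_t + U h_{t-1} + b).

    f is assumed to be tanh here; the patent text leaves it unspecified.
    """
    return f(W @ x_t + U @ h_prev + b)

# Toy setup with n input/output nodes, matching the shapes in [0026]
n = 4
rng = np.random.default_rng(0)
W = rng.standard_normal((n, n))   # W ∈ R^(n×n)
U = rng.standard_normal((n, n))   # U ∈ R^(n×n)
b = rng.standard_normal((n, 1))   # b ∈ R^(n×1)

h = np.zeros((n, 1))              # initial hidden state h_0
for x in rng.standard_normal((3, n, 1)):  # three time steps of input x_t
    h = rnn_cell(x, h, W, U, b)

print(h.shape)  # -> (4, 1): the hidden state doubles as the output at time t
```

In hardware, the two matrix-vector products `W @ x_t` and `U @ h_prev` are the dominant cost, which is why the accelerator described here centers on a matrix multiply-add unit.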



Abstract

The invention discloses a hardware architecture of a model compression-based recurrent neural network accelerator. The architecture comprises the following parts: a matrix multiply-add unit, which realizes the main matrix-vector multiplication operations in the neural network; it is composed of multiple multiply-add unit clusters, each cluster containing multiple multiply-add unit blocks built from multiply-add units, and the number of these units directly determines the parallelism and throughput of the accelerator. Multiple dual-port on-chip SRAMs: three of them store intermediate results generated during recurrent neural network computation, two of them form a ping-pong storage structure to improve data-access efficiency, and the remaining memory stores the parameters of the neural network. Multiple nonlinear computing units realize the nonlinear functions in the neural network, and a control unit generates the relevant control signals and governs the flow of data. The invention achieves high hardware efficiency and strong scalability, and is a practical scheme for embedded systems in related fields such as intelligent human-computer interaction and robot control.
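The ping-pong storage structure mentioned above can be illustrated with a small software model: two buffers swap roles each time step, so the compute stage reads the result of the previous step from one buffer while the current step writes into the other. This is a hypothetical sketch; the class and method names (`PingPong`, `write`, `read`, `swap`) are illustrative and not taken from the patent.

```python
class PingPong:
    """Software model of a two-buffer ping-pong storage scheme."""

    def __init__(self, size):
        self.bufs = [[0.0] * size, [0.0] * size]
        self.active = 0  # index of the buffer currently being written

    def write(self, data):
        # Producer (e.g. the multiply-add unit) fills the active buffer.
        self.bufs[self.active][:len(data)] = data

    def read(self):
        # Consumer always reads the buffer written during the previous step.
        return self.bufs[1 - self.active]

    def swap(self):
        # Roles exchange at each time-step boundary.
        self.active = 1 - self.active

pp = PingPong(4)
pp.write([1.0, 2.0, 3.0, 4.0])  # step t: results written into buffer 0
pp.swap()
print(pp.read())                 # step t+1: buffer 0 is read -> [1.0, 2.0, 3.0, 4.0]
```

The benefit is that reads and writes never target the same buffer in the same step, so computation and data movement can overlap, which is what improves data-access efficiency in the accelerator.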

Description

Technical field

[0001] The invention relates to the field of computer and electronic information technology, in particular to a hardware architecture of a model compression-based recurrent neural network accelerator.

Background technique

[0002] Recurrent neural networks have powerful nonlinear fitting ability, and their natural recursive structure is well suited to modeling sequence data such as text, speech, and video. At present, recurrent neural network models have achieved accuracy close to, or even surpassing, that of humans in natural language processing, especially in speech recognition and machine translation, and they also have broad application prospects elsewhere. These technologies are necessary for realizing intelligent human-computer interaction, but running recurrent neural network models on embedded devices raises many problems. On the one hand, a recurrent neural network model needs to store a large number of parameters, and the amount of...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06N3/063
Inventors: 王中风, 王智生, 林军
Owner: 南京风兴科技有限公司