
Recurrent neural network sparse connection method based on block tensor decomposition

A technology relating to recurrent neural networks and tensor decomposition, applied in neural learning methods, biological neural network models, etc. It addresses problems such as ignoring the high-dimensional nature of the input data and the redundancy of full connections, and achieves the effect of improved parameter sharing and training speed.

Inactive Publication Date: 2018-03-13
UNIV OF ELECTRONICS SCI & TECH OF CHINA
Cites: 6 · Cited by: 7

AI Technical Summary

Problems solved by technology

[0006] To solve the problem that existing recurrent neural network structures ignore the high-dimensional nature of the input data and the redundancy of the full connections, the present invention proposes a recurrent neural network sparse connection method based on block tensor decomposition, which simultaneously analyzes and optimizes for the high-dimensional nature of the input data and the inherent redundancy of the full connections. Compared with the prior art, the convergence accuracy of the present invention is greatly improved.



Examples


Detailed Description of the Embodiments

[0024] To help those skilled in the art understand the technical content of the present invention, the invention is further explained below with reference to the accompanying drawings.

[0025] Figure 1 shows the flowchart of the scheme of the present invention. The technical scheme of the present invention is a recurrent neural network sparse connection method based on block tensor decomposition, comprising:

[0026] S1. Tensorize the input vector x of the network to obtain a first tensor 𝒳; tensorize the memory vector h to obtain a second tensor ℋ; tensorize the fully connected weight matrix W to obtain a third tensor 𝒲.

[0027] Suppose the input vector x ∈ ℝ^I, the memory vector h ∈ ℝ^J, and the fully connected weight matrix W are tensorized into 𝒳, ℋ, and 𝒲 respectively, where 𝒳 and ℋ are d-dimensional tensors and 𝒲 is a 2d-dimensional tensor, with I = I_1·I_2·…·I_d and J = J_1·J_2·…·J_d. The tensorization operation in the present invention refers to rearranging the elements...
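The tensorization step can be sketched in NumPy. The concrete factorizations (I = 4·4·4, J = 2·2·2, d = 3) and the mode ordering of the weight tensor are illustrative assumptions, not values from the patent:

```python
import numpy as np

# Hypothetical factorizations of the dimensions (not specified here):
# I = 4*4*4 = 64, J = 2*2*2 = 8, so d = 3.
I_dims, J_dims = (4, 4, 4), (2, 2, 2)
I, J = int(np.prod(I_dims)), int(np.prod(J_dims))

rng = np.random.default_rng(0)
x = rng.standard_normal(I)        # input vector
h = rng.standard_normal(J)        # memory (hidden-state) vector
W = rng.standard_normal((J, I))   # fully connected weight matrix

# Tensorization: rearrange the elements into higher-order tensors.
X = x.reshape(I_dims)             # first tensor, d-dimensional
H = h.reshape(J_dims)             # second tensor, d-dimensional
Wt = W.reshape(J_dims + I_dims)   # third tensor, 2d-dimensional

# The reshaped tensors carry exactly the same information: contracting
# Wt with X over the input modes reproduces the dense product W @ x.
y = np.tensordot(Wt, X, axes=([3, 4, 5], [0, 1, 2])).reshape(J)
assert np.allclose(y, W @ x)
```

Tensorization by itself adds no sparsity; it exposes the high-dimensional structure on which the subsequent block tensor decomposition operates.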



Abstract

The invention discloses a recurrent neural network sparse connection method based on block tensor decomposition. The method is applied to the field of deep-learning neural network structure optimization and solves the problem that existing methods cannot analyze and optimize based on the high-dimensional nature of the input data and the inherent redundancy of the full connections, and cannot speed up training while maintaining or improving model accuracy. The method takes the sparsity of the fully connected layers in a deep network into consideration, introduces the idea of tensor decomposition to improve the degree of parameter sharing across the network, uses BPTT (backpropagation through time) for model training, and is suited to most application scenarios of existing deep networks. Compared with the existing fully connected mode, both the training speed and the convergence precision of the method are greatly improved.
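As a back-of-envelope illustration of the parameter sharing the abstract claims, a dense weight matrix can be compared against a block-term (Tucker-style) factorization of its tensorized form. All shapes, block counts, and ranks below are hypothetical choices, not values from the patent:

```python
import numpy as np

# Hypothetical shapes and ranks chosen only for illustration.
I_dims, J_dims = (8, 8, 8, 8), (8, 8, 8, 8)   # I = J = 4096, d = 4
d = len(I_dims)

# Dense fully connected layer: one I x J weight matrix.
dense_params = int(np.prod(I_dims)) * int(np.prod(J_dims))

# Block-term decomposition: N blocks, each a rank-R Tucker term with
# a core of size R^d and d factor matrices of size (I_k * J_k) x R.
N, R = 2, 2
btd_params = N * (R**d + sum(i * j * R for i, j in zip(I_dims, J_dims)))

print(dense_params)   # 16777216
print(btd_params)     # 1056
```

Under these assumed ranks the factored form needs roughly four orders of magnitude fewer parameters than the dense matrix, which is the source of the claimed gains in sharing and training speed.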

Description

Technical Field

[0001] The invention belongs to the field of deep-learning neural network structure optimization, and in particular relates to the design of a recurrent neural network sparse connection method based on block tensor decomposition.

Background

[0002] Recurrent neural networks (RNNs) are widely used in time-series information processing, such as speech recognition, text translation, and video classification. In a traditional neural network model, data flows from the input layer through the hidden layer to the output layer; adjacent layers are fully connected, while nodes within the same layer are not connected. Such a network cannot capture time-series information. In an RNN, preceding information in the time series is memorized and applied to the computation of the current output: the nodes between hidden layers are connected, and the input of the hidden layer includes not onl...
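The recurrence described in the background can be sketched as a minimal plain-RNN cell. The cell form (tanh activation) and all sizes are illustrative assumptions; the fully connected input-to-hidden matrix `W_xh` is the kind of weight the patent proposes to compress:

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b):
    """One step of a plain RNN: the new hidden state mixes the current
    input with the previous hidden state, so sequence memory is kept."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b)

rng = np.random.default_rng(0)
I, J = 8, 4                          # input and hidden sizes (hypothetical)
W_xh = rng.standard_normal((J, I))   # fully connected input-to-hidden weights
W_hh = rng.standard_normal((J, J))   # hidden-to-hidden recurrence
b = np.zeros(J)

h = np.zeros(J)
for x_t in rng.standard_normal((5, I)):   # a length-5 input sequence
    h = rnn_step(x_t, h, W_xh, W_hh, b)
print(h.shape)   # (4,)
```

Because `W_xh` (and its analogues in gated cells) is dense, it dominates the parameter count for large inputs, which motivates the block-tensor-decomposition approach of the invention.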

Claims


Application Information

IPC(8): G06N3/08
CPC: G06N3/084
Inventors: 徐增林, 叶锦棉, 李广西, 陈迪
Owner UNIV OF ELECTRONICS SCI & TECH OF CHINA