
Method for constructing incremental LSTM by utilizing training process compression and memory consolidation

An incremental learning and memory technology, applied to neural learning methods, neural architectures, biological neural network models, etc. It addresses the problem of large space overhead for storing data, with the effects of reducing that overhead, improving training efficiency, and ensuring practicality.

Pending Publication Date: 2020-07-10
JIANGSU UNIV

AI Technical Summary

Problems solved by technology

[0004] In view of the deficiencies in the prior art, this application proposes a method for constructing an incremental LSTM using training process compression and memory consolidation, in order to solve the problem that existing LSTMs cannot effectively perform incremental learning on sequence data, to reduce the space overhead of storing data, to avoid the forgetting of historical information caused by learning new data, and to improve the practicability of the LSTM.

Embodiment Construction

[0021] In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit it.

[0022] As shown in Figure 1, the method for constructing an incremental LSTM using training process compression and memory consolidation includes the following steps:

[0023] Step 1. To adapt to incremental learning, the sequence data is divided into several batches, and the incremental training of the LSTM is completed batch by batch to reduce training overhead. The specific process is: the data preparation module divides the sequence data S into N subsequence data sets {S_1, S_2, S_3, ..., S_N}, where S_i denotes the i-th subsequence data set. For the i-th subsequence data set S_i ...
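
A minimal Python sketch of this batching step (the patent text gives no code): the helper names partition_sequence_data, incremental_training and the callback train_on_batch are assumptions for illustration, and the subsequence data sets are produced by a simple even split of S.

```python
# Sketch of Step 1: split the sequence data S into N subsequence data sets
# {S_1, ..., S_N} and train the LSTM on them one batch at a time.
# All names and the even-split policy are illustrative, not from the patent.

def partition_sequence_data(S, N):
    """Split the full sequence data set S into N roughly equal subsequence data sets."""
    size = (len(S) + N - 1) // N  # ceiling division so no sample is dropped
    return [S[i * size:(i + 1) * size] for i in range(N)]

def incremental_training(lstm, S, N, train_on_batch):
    """Feed the model one subsequence data set at a time instead of the whole corpus."""
    for S_i in partition_sequence_data(S, N):
        # Only the current subsequence data set S_i needs to be held during this step,
        # which is what keeps the storage and training overhead low.
        train_on_batch(lstm, S_i)
```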



Abstract

The invention discloses a method for constructing an incremental LSTM by utilizing training process compression and memory consolidation. The method uses the activity of the LSTM gate units to select the important input moments during training, compresses the training process into a historical memory, effectively fuses the compressed memory with training on new data, and consolidates the network memory with historical information so as to meet the requirement of incrementally processing sequence data. The method avoids the high storage overhead and low training efficiency of existing LSTM systems, improves the training efficiency of the LSTM on dynamically growing sequence data, avoids forgetting historical information, and guarantees the effectiveness and practicability of the LSTM.
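
The abstract gives no formula for "gate unit activity", so the sketch below only illustrates the general idea: time steps are ranked by the mean absolute activation of the input gate, and the most active ones are kept as compressed historical memory. Both the activity measure and the keep_ratio parameter are assumptions, not details taken from the patent.

```python
import numpy as np

def select_important_moments(input_gate_activations, keep_ratio=0.2):
    """Rank time steps by LSTM input-gate activity and keep the most active ones.

    input_gate_activations: array of shape (T, hidden_size) holding the input-gate
    values produced while training on one subsequence data set.
    Returns the (sorted) indices of the time steps retained as historical memory.
    """
    activity = np.abs(input_gate_activations).mean(axis=1)  # one activity score per time step
    k = max(1, int(keep_ratio * len(activity)))
    return np.sort(np.argsort(activity)[-k:])               # indices of the k most active steps
```

The retained moments (and their inputs) would then be replayed together with the next batch of new data, so that training on new data also consolidates the old memory.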

Description

Technical Field

[0001] The invention relates to the field of deep learning in artificial intelligence, and in particular to a method for constructing an efficient incremental LSTM using training process compression and memory consolidation.

Background Technique

[0002] In recent years, with the continuous development of new artificial intelligence technology and the explosive growth of massive data, processing and analyzing these data efficiently, accurately and quickly with the help of new technologies, and tapping the huge value they contain, has become a challenging task. The recurrent neural network is a popular deep learning model suited to the analysis and modeling of sequence data. The long short-term memory network (LSTM) is a variant of the recurrent neural network: by adding a gating mechanism to the memory unit, it overcomes the vanishing-gradient problem of traditional recurrent neural network training, and it is the most widely used recurrent neural network structure. [...
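
For reference, the gating mechanism the background refers to is the standard LSTM cell update; the sketch below is the textbook formulation, not anything specific to this patent, and the parameter dictionaries W, U and b are an assumed layout.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One step of a standard LSTM cell.

    W, U and b each hold the parameters for the four gates, keyed by
    'i' (input), 'f' (forget), 'o' (output) and 'g' (candidate).
    The largely additive update of the cell state c_t is what mitigates
    the vanishing-gradient problem of plain recurrent networks.
    """
    i = sigmoid(W['i'] @ x_t + U['i'] @ h_prev + b['i'])   # input gate
    f = sigmoid(W['f'] @ x_t + U['f'] @ h_prev + b['f'])   # forget gate
    o = sigmoid(W['o'] @ x_t + U['o'] @ h_prev + b['o'])   # output gate
    g = np.tanh(W['g'] @ x_t + U['g'] @ h_prev + b['g'])   # candidate memory
    c_t = f * c_prev + i * g                               # new cell state
    h_t = o * np.tanh(c_t)                                 # new hidden state
    return h_t, c_t
```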


Application Information

IPC(8): G06N 3/04, G06N 3/08
CPC: G06N 3/084, G06N 3/044
Inventors: 牛德姣, 夏政, 蔡涛, 周时颉, 杨乐
Owner: JIANGSU UNIV