
A training method for neural network of speech recognition device

A recurrent neural network and its training method, applied in the field of speech recognition. It addresses the problems that training a deep recurrent network fails to converge, or that the training result is poor, so that the performance of the recurrent network cannot be further improved. The invention improves training effectiveness and performance and allows greater network depth.

Active Publication Date: 2021-04-09
SHANGHAI YITU NETWORK SCI & TECH

AI Technical Summary

Problems solved by technology

[0009] The disadvantage of the existing recurrent neural network composed of LSTM network layers 102 is that only about two recurrent layers can be used effectively; when the number of layers is increased, training fails to converge, or the training result is significantly worse than that of the shallow network, so the performance of the recurrent network cannot be further improved.



Examples


Embodiment Construction

[0049] As shown in Figure 2, which is a model structure diagram of the speech recognition device of an embodiment of the present invention, the recurrent neural network of the embodiment includes:

[0050] A baseline model, formed by connecting two LSTM network layers 2.

[0051] An extended model, which includes multiple residual network layers 3. Each residual network layer 3 is formed by connecting one LSTM network layer 2 and an additive function layer. The input of the residual network layer 3 is connected to the output of the preceding network layer; the two inputs of the additive function layer are connected, respectively, to the output of the LSTM network layer 2 of that residual network layer 3 and to the output of the preceding network layer; and the output of the additive function layer serves as the output of the residual network layer 3.
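The residual network layer described in [0051] can be sketched as follows. This is a minimal NumPy illustration of the idea (one standard LSTM layer whose output is added element-wise to the layer's input), not the patent's actual implementation; the cell formulation, weight initialization, and all names here are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class LSTMCell:
    """Standard single-step LSTM cell. Hidden size equals input size so
    the additive (residual) connection is dimension-compatible."""
    def __init__(self, size, seed=0):
        rng = np.random.default_rng(seed)
        # One stacked weight matrix for the input, forget, cell, output gates.
        self.W = rng.normal(0.0, 0.1, (4 * size, 2 * size))
        self.b = np.zeros(4 * size)
        self.size = size

    def step(self, x, h, c):
        z = self.W @ np.concatenate([x, h]) + self.b
        i, f, g, o = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)
        h = o * np.tanh(c)
        return h, c

def residual_lstm_layer(cell, xs):
    """Run an LSTM over a sequence and add the layer input back to the
    LSTM output -- the 'additive function layer' of the extended model."""
    h = np.zeros(cell.size)
    c = np.zeros(cell.size)
    ys = []
    for x in xs:
        h, c = cell.step(x, h, c)
        ys.append(h + x)  # output = LSTM output + layer input
    return np.stack(ys)

# Toy usage: a sequence of 5 frames with feature size 8.
xs = np.ones((5, 8)) * 0.1
out = residual_lstm_layer(LSTMCell(8), xs)
print(out.shape)  # (5, 8)
```

Because each layer's output is its input plus an LSTM correction, gradients can flow through the identity path, which is the standard rationale for why such residual stacks train at depths where plain stacked LSTMs do not.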

[0052] The depth of the r...



Abstract

The invention discloses a recurrent neural network comprising: a baseline model, formed by connecting two LSTM network layers; and an extended model, which includes multiple residual network layers, each formed by connecting one LSTM network layer and an additive function layer. The input of each residual network layer is connected to the output of the preceding network layer; the two inputs of the additive function layer are connected, respectively, to the output of that residual network layer's LSTM network layer and to the output of the preceding network layer; and the output of the additive function layer serves as the output of the residual network layer. The invention also discloses a training method for the recurrent neural network. The invention can deepen the LSTM-based recurrent neural network and can improve training effectiveness and performance.

Description

Technical Field
[0001] The present invention relates to speech recognition, and in particular to a recurrent neural network. The invention also relates to a training method for the recurrent neural network.
Background Art
[0002] As shown in Figure 1, which is a model structure diagram of an existing speech recognition device, an existing recurrent neural network (RNN) is formed by connecting two long short-term memory (LSTM) network layers 102.
[0003] In Figure 1, the recurrent neural network is used in a speech recognition device.
[0004] The speech recognition device includes: a convolution layer 101, the recurrent neural network, a fully connected layer 103, and a layer 104 based on Connectionist Temporal Classification (CTC).
[0005] The convolution layer 101 receives a sound spectrum signal; the output of the convolution layer 101 is connected to the recurrent neural network, and the cyclic deep...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06N3/04; G06N3/08
CPC: G06N3/08; G06N3/044; G06N3/045
Inventors: 康燕斌, 张志齐
Owner: SHANGHAI YITU NETWORK SCI & TECH