
Tree recurrent neural network algorithm with attention mechanism batch training

A technology relating to recurrent neural networks and attention mechanisms, applied in the field of deep learning in computer science, which achieves the effects of accelerating training, preventing gradient explosion, and improving training accuracy.

Pending Publication Date: 2022-01-14
众微致成(北京)信息服务有限公司

AI Technical Summary

Problems solved by technology

[0002] At present, for virus monitoring in large-scale farms, there is no technical solution that uses drone inspection to patrol the inside and outside of a farm for virus sampling and monitoring; non-contact isolation, collection, and monitoring at farm entrances still need to be further resolved; problems such as how to take the temperature of breeding stock remotely, how to conduct comprehensive non-contact collection and testing, and how to monitor breeding stock remain unresolved; and how to carry out intelligent control based on monitoring results and remove viruses in a timely, safe, and thorough manner still needs to be improved. It is therefore necessary to propose a tree recurrent neural network algorithm with attention mechanism batch training to at least partially solve the problems existing in the prior art.



Examples

Embodiment Construction

[0075] The present invention will be described in further detail below in conjunction with the accompanying drawings and embodiments, so that those skilled in the art can implement it with reference to the description. As shown in Figures 1-6, the present invention provides a tree recurrent neural network algorithm with attention mechanism batch training, comprising:

[0076] S100: selecting sequence batch data of arbitrary length and importing it into a recurrent neural network (RNN);

[0077] S200: based on the recurrent neural network (RNN), establishing a long short-term memory network that depends on the gate vectors and memory units of all child units, to obtain a tree long short-term memory network;

[0078] S300: based on the tree long short-term memory network, adding an attention mechanism to obtain an attention-mechanism long short-term memory network;

[0079] S400: performing batch training on the sequence batch data through the attention-mechanism long short-term memory network, thereby accelerating training.
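
The disclosure itself contains no source code. As a rough, non-authoritative sketch of steps S200 and S300, the following snippet implements a child-sum style tree LSTM node whose forget gates and memory cell depend on all child units, with an additive attention mechanism weighting the child hidden states in place of a plain sum. The class name AttentiveTreeLSTMCell, the attention scoring layer, and the choice of PyTorch are illustrative assumptions, not part of the patent.

```python
# Hypothetical sketch (not from the patent): a child-sum style tree LSTM node
# whose gates and memory cell depend on all child units (step S200), with an
# additive attention mechanism over the child hidden states (step S300).
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentiveTreeLSTMCell(nn.Module):
    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.hidden_size = hidden_size
        # Input-to-gate and aggregated child-to-gate projections for i, o, u.
        self.W_iou = nn.Linear(input_size, 3 * hidden_size)
        self.U_iou = nn.Linear(hidden_size, 3 * hidden_size, bias=False)
        # One forget gate per child, conditioned on that child's hidden state.
        self.W_f = nn.Linear(input_size, hidden_size)
        self.U_f = nn.Linear(hidden_size, hidden_size, bias=False)
        # Additive attention that scores each child against the node input.
        self.attn = nn.Linear(input_size + hidden_size, 1)

    def forward(self, x, child_h, child_c):
        """x: (input_size,) node input; child_h, child_c: (num_children, hidden_size)."""
        if child_h.numel() == 0:            # leaf node: no children to attend over
            h_tilde = x.new_zeros(self.hidden_size)
            c_children = x.new_zeros(self.hidden_size)
        else:
            # Attention weights over children replace the plain child sum.
            scores = self.attn(torch.cat(
                [x.expand(child_h.size(0), -1), child_h], dim=-1)).squeeze(-1)
            alpha = F.softmax(scores, dim=0)                   # (num_children,)
            h_tilde = (alpha.unsqueeze(-1) * child_h).sum(0)   # weighted child state
            # Per-child forget gates feed every child memory cell into c_j.
            f = torch.sigmoid(self.W_f(x) + self.U_f(child_h))
            c_children = (f * child_c).sum(0)
        i, o, u = (self.W_iou(x) + self.U_iou(h_tilde)).chunk(3, dim=-1)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        c = i * u + c_children        # memory cell depends on all child cells
        h = o * torch.tanh(c)
        return h, c
```

In this sketch a leaf is evaluated with empty child_h and child_c tensors (e.g. torch.empty(0, hidden_size)), and an inner node is evaluated after its children, so an entire tree is processed by one bottom-up traversal; the batch training of step S400 would run many such traversals per optimizer step.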

Abstract

The invention discloses a tree recurrent neural network algorithm with attention mechanism batch training. The algorithm comprises the steps of: selecting sequence batch data of any length and importing the data into a recurrent neural network (RNN); based on the RNN, establishing a long short-term memory network that depends on the gate vectors and memory units of all child units, to obtain a tree long short-term memory network; adding an attention mechanism based on the tree long short-term memory network to obtain an attention-mechanism long short-term memory network; and carrying out batch training on the sequence batch data through the attention-mechanism long short-term memory network, thereby accelerating training.
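
The abstract does not tie the batch-training step to any particular framework. The sketch below, again assuming PyTorch, shows one conventional way to batch sequences of arbitrary length by padding and packing them; a plain nn.LSTM stands in for the attention-mechanism tree long short-term memory network described above, and the gradient-clipping call is an illustrative guard against gradient explosion rather than the patent's stated mechanism.

```python
# Hypothetical sketch (not from the patent): batch training on sequence data of
# arbitrary length. A plain nn.LSTM is used as a stand-in encoder.
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pad_sequence, pack_padded_sequence

torch.manual_seed(0)
input_size, hidden_size, num_classes = 16, 32, 2

encoder = nn.LSTM(input_size, hidden_size, batch_first=True)   # stand-in model
classifier = nn.Linear(hidden_size, num_classes)
params = list(encoder.parameters()) + list(classifier.parameters())
optimizer = torch.optim.Adam(params, lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# A toy batch of sequences with arbitrary lengths, as selected in step S100.
sequences = [torch.randn(n, input_size) for n in (5, 9, 3)]
labels = torch.tensor([0, 1, 0])
lengths = torch.tensor([seq.size(0) for seq in sequences])

# Pad to a rectangular tensor, then pack so the recurrence skips the padding.
padded = pad_sequence(sequences, batch_first=True)   # (batch, max_len, input_size)
packed = pack_padded_sequence(padded, lengths, batch_first=True, enforce_sorted=False)

optimizer.zero_grad()
_, (h_n, _) = encoder(packed)                 # h_n: (1, batch, hidden_size)
logits = classifier(h_n.squeeze(0))           # one prediction per sequence
loss = loss_fn(logits, labels)
loss.backward()
# Gradient clipping is one common safeguard against gradient explosion.
torch.nn.utils.clip_grad_norm_(params, max_norm=1.0)
optimizer.step()
print(float(loss))
```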

Description

Technical field

[0001] The invention relates to the field of deep learning in computer science, and in particular to the accelerated training of neural networks in deep learning. More specifically, the invention relates to a tree recurrent neural network algorithm with attention mechanism batch training.

Background technique

[0002] At present, for virus monitoring in large-scale farms, there is no technical solution that uses drone inspection to patrol the inside and outside of a farm for virus sampling and monitoring; non-contact isolation, collection, and monitoring at farm entrances still need to be further resolved; problems such as how to take the temperature of breeding stock remotely, how to conduct comprehensive non-contact collection and testing, and how to monitor breeding stock remain unresolved; and how to carry out intelligent control based on monitoring results and remove viruses in a timely, safe, and thorough manner still needs to be improved. It is therefore necessary to propose a tree recurrent neural network algorithm with attention mechanism batch training to at least partially solve the problems existing in the prior art.


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/04, G06N3/08
CPC: G06N3/08, G06N3/048, G06N3/045, G06N3/044
Inventor: 王然
Owner: 众微致成(北京)信息服务有限公司