GRU-based recurrent neural network multi-label learning method

A GRU-based recurrent neural network multi-label learning method. It solves the problems of complex structure, vanishing gradients, and the inability to effectively learn the basic characteristics of samples in existing neural-network multi-label classification, and achieves a simple structure and improved classification accuracy.

Pending Publication Date: 2018-07-24
HRG INT INST FOR RES & INNOVATION

AI Technical Summary

Problems solved by technology

[0006] Existing neural-network multi-label classification cannot effectively learn the basic characteristics of samples, its implementation structure is complex, and the vanishing-gradient problem is prone to occur during backpropagation.




Embodiment Construction

[0033] To make the technical solution of the present invention clearer, it is further described in detail below with reference to the accompanying drawings. It should be understood that the specific embodiments described here serve only to explain the present invention and are not intended to limit it.

[0034] Figure 4 is a schematic diagram of the GRU-based RNN multi-label classifier of the present invention. The classifier produces the context vector h_T, which is passed through the softmax layer to obtain the predicted output ŷ_i. The multi-label vector y_i is then used to construct the loss function L_i for the i-th sample-label pair. The specific implementation process is as follows:
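As a sketch of the loss construction just described: assuming a cross-entropy loss between the softmax output ŷ_i and the multi-label vector y_i (this excerpt does not spell out the exact form of L_i, so the cross-entropy choice and the averaging into the objective J are assumptions), the per-sample loss and the objective could look like:

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the label dimension."""
    e = np.exp(z - z.max())
    return e / e.sum()

def sample_loss(z_i, y_i, eps=1e-12):
    """Cross-entropy between the softmax output y_hat and label vector y_i.

    The exact loss form is not given in this excerpt; cross-entropy
    against a (possibly multi-hot) label vector is an assumption.
    """
    y_hat = softmax(z_i)
    return -np.sum(y_i * np.log(y_hat + eps))

# Objective J averaged over N samples (toy values, N = 2, C = 3 labels)
Z = [np.array([2.0, 0.5, -1.0]), np.array([0.1, 0.2, 3.0])]
Y = [np.array([1.0, 1.0, 0.0]), np.array([0.0, 0.0, 1.0])]
J = np.mean([sample_loss(z, y) for z, y in zip(Z, Y)])
```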

[0035] Suppose the sample-label pairs {x_i, y_i}_{i=1}^N contain N training samples, where x_i ∈ R^{M×1} and y_i ∈ R^{C×1} is the multi-label vector of sample x_i. Each sample x_i is normalized so that its values lie in [0, 1]: x_i is first zero-meaned and then standardized by its variance...
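The normalization step in paragraph [0035] can be sketched as follows. The description is truncated, so the final min-max rescaling into [0, 1] after the zero-mean/variance standardization is an assumption:

```python
import numpy as np

def normalize_sample(x, eps=1e-8):
    """Zero-mean, unit-variance standardization, then rescale into [0, 1].

    Follows the (truncated) description in [0035]: the sample is first
    zero-meaned and then standardized by its variance; the closing
    min-max rescaling into [0, 1] is an assumption.
    """
    x = np.asarray(x, dtype=np.float64)
    x = x - x.mean()                                # zero mean
    x = x / (x.std() + eps)                         # unit variance
    x = (x - x.min()) / (x.max() - x.min() + eps)   # rescale into [0, 1]
    return x

sample = np.array([3.0, 7.0, 1.0, 9.0])
normalized = normalize_sample(sample)
```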



Abstract

The invention provides a GRU-based recurrent neural network multi-label learning method, which comprises the following steps:
S1. Initialize the system parameters θ = (W, U, B).
S2. Input the samples {x_i, y_i}_{i=1}^N, where x_i ∈ R^{M×1} and y_i ∈ R^{C×1} is the multi-label vector of sample x_i, and compute the hidden state output by the RNN (Recurrent Neural Network) at each time step.
S3. Compute the context vector h_T and the output z_i of the output layer.
S4. Compute the predicted output ŷ_i, compute the loss L_i, and determine the objective function J.
S5. Compute the gradient of the system parameters θ = (W, U, B) by gradient descent and the BPTT (Back-Propagation Through Time) algorithm.
S6. Determine a learning rate η and update each weight by its gradient: W ← W − η·δW.
S7. Judge whether the neural network has converged; if so, execute step S8; if not, return to step S2 and iteratively update the model parameters.
S8. Output the optimized model.
By making full use of the RNN, the method obtains an effective feature representation of the sample and thereby improves the accuracy of multi-label classification. In addition, the vanishing-gradient problem is unlikely to occur during backpropagation.
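Steps S2-S4 of the abstract can be sketched as a forward pass through a GRU cell followed by a softmax output layer. The gate equations below are the standard GRU formulation and the parameter shapes are illustrative assumptions; the excerpt does not reproduce the patent's exact update rules:

```python
import numpy as np

rng = np.random.default_rng(0)
M, H, C = 4, 8, 3          # input size, hidden size, number of labels

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# System parameters theta = (W, U, b), one set per gate:
# update gate z, reset gate r, candidate state h (standard GRU, assumed)
W = {g: rng.normal(0, 0.1, (H, M)) for g in "zrh"}
U = {g: rng.normal(0, 0.1, (H, H)) for g in "zrh"}
b = {g: np.zeros(H) for g in "zrh"}
W_out = rng.normal(0, 0.1, (C, H))   # softmax output layer

def gru_step(x_t, h_prev):
    """One GRU update (standard Cho et al. formulation, assumed here)."""
    z = sigmoid(W["z"] @ x_t + U["z"] @ h_prev + b["z"])        # update gate
    r = sigmoid(W["r"] @ x_t + U["r"] @ h_prev + b["r"])        # reset gate
    h_tilde = np.tanh(W["h"] @ x_t + U["h"] @ (r * h_prev) + b["h"])
    return (1.0 - z) * h_prev + z * h_tilde

def forward(x_seq):
    """Run the sequence; return context vector h_T and softmax output."""
    h = np.zeros(H)
    for x_t in x_seq:            # S2: hidden state at each time step
        h = gru_step(x_t, h)
    z_out = W_out @ h            # S3: output-layer activation z_i
    e = np.exp(z_out - z_out.max())
    return h, e / e.sum()        # S4: predicted output y_hat

h_T, y_hat = forward([rng.normal(size=M) for _ in range(5)])
```

Training (S5-S7) would then backpropagate the loss through time and update each parameter with W ← W − η·δW until convergence.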

Description

Technical field

[0001] The invention relates to the field of recurrent neural network multi-label learning, in particular to a GRU-based recurrent neural network multi-label learning method.

Background technique

[0002] In the field of machine learning, multi-label classification plays an important role among classification problems. The traditional label classification problem learns a single label for each example from a set of labels; such a problem is called a binary classification problem and arises, for example, in text and web data filtering. In a multi-label classification problem, each example has a set of associated labels. Many methods for solving multi-label classification have emerged in recent years, and they basically fall into two types. The first converts the problem into traditional single-label classification, learning each label in the label set separately through multiple binary classifiers. The second is to adapt existing ...
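The first family mentioned above (problem transformation via one binary classifier per label, commonly called binary relevance) can be sketched with independent logistic regressions. The toy data, the bias-free logistic model, and the training schedule are illustrative assumptions, not taken from the patent:

```python
import numpy as np

rng = np.random.default_rng(1)

def train_logistic(X, y, lr=0.5, steps=300):
    """Plain gradient-descent logistic regression for one binary label."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

# Toy multi-label data: 2 features, 3 labels (illustrative)
X = rng.normal(size=(100, 2))
Y = np.stack([X[:, 0] > 0, X[:, 1] > 0, X.sum(axis=1) > 0], axis=1).astype(float)

# Binary relevance: learn each label independently with its own classifier
weights = [train_logistic(X, Y[:, c]) for c in range(Y.shape[1])]

def predict(x):
    """Predict the full label set by querying each binary classifier."""
    return np.array([1.0 / (1.0 + np.exp(-x @ w)) > 0.5 for w in weights])
```

The limitation the patent's approach targets is visible here: each label is learned in isolation, so correlations between labels and shared feature structure are not exploited.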


Application Information

Patent Type & Authority: Applications (China)
IPC(8): G06K9/62, G06N3/08
CPC: G06N3/084, G06F18/214
Inventor: 王磊, 翟荣安, 王毓, 王纯配, 刘晶晶, 王飞, 于振中, 李文兴
Owner: HRG INT INST FOR RES & INNOVATION