
A meta-learning algorithm based on stepwise gradient correction of a meta-learner

A gradient-correction and meta-learning technology, applied in the field of deep neural networks, addressing problems of existing methods such as the need to construct complex meta-learners and the resulting difficulty of understanding and reproducing such meta-learning algorithms.

Publication Date: 2019-06-21 (pending application)
XI AN JIAOTONG UNIV

AI Technical Summary

Problems solved by technology

However, to ensure strong hyperparameter learning ability, current methods generally need to construct very complex meta-learners. This makes these meta-learning algorithms hard to understand and hard to reproduce.




Embodiment Construction

[0063] The implementation of the present invention will be described in detail below in conjunction with the drawings and examples.

[0064] As shown in Figure 1, the present invention is a meta-learning algorithm based on stepwise gradient correction by a meta-learner, used to train a classifier on training data with noisy labels; it is well suited to real-world data scenarios with noisy annotations. First, a training dataset with noisy labels and a small amount of clean, unbiased metadata are obtained. A meta-learner (teacher network) is built on the metadata set, alongside the classifier (student network) built on the training dataset. The student-network and teacher-network parameters are then jointly updated with stochastic gradient descent: a gradient-update function for the student-network parameters is obtained from the student network's gradient-descent format; this is fed back to the teacher network, whose parameters are updated on the metadata to yield a corrected gradient format for the student-network parameters; finally, the student-network parameters are updated with this corrected format.
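To make the joint update concrete, below is a minimal PyTorch sketch of one training step, under the common assumption that the teacher network maps each training sample's loss to a weight on that sample. The toy data, network sizes, and the loss-weighting form are illustrative assumptions, not the patent's exact construction.

```python
# Sketch of the stepwise joint student/teacher update (illustrative assumptions).
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy data: a noisy-labelled training batch and a small clean metadata batch.
x_train, y_train = torch.randn(32, 10), torch.randint(0, 2, (32,))
x_meta,  y_meta  = torch.randn(8, 10),  torch.randint(0, 2, (8,))

student = nn.Linear(10, 2)                                   # classifier (student network)
teacher = nn.Sequential(nn.Linear(1, 16), nn.ReLU(),
                        nn.Linear(16, 1), nn.Sigmoid())      # meta-learner (teacher network)
teacher_opt = torch.optim.SGD(teacher.parameters(), lr=1e-2)
lr = 0.1                                                     # student learning rate

for step in range(5):
    # 1) Student gradient-descent format: per-sample losses weighted by the teacher,
    #    yielding a virtual one-step update that stays differentiable w.r.t. the teacher.
    losses = F.cross_entropy(student(x_train), y_train, reduction='none')
    weights = teacher(losses.detach().unsqueeze(1)).squeeze(1)
    weighted_loss = (weights * losses).mean()
    grads = torch.autograd.grad(weighted_loss, list(student.parameters()),
                                create_graph=True)
    virtual_params = [p - lr * g for p, g in zip(student.parameters(), grads)]

    # 2) Feed back to the teacher: evaluate the virtual student on the clean metadata
    #    and update the teacher parameters by gradient descent on that meta-loss.
    w_v, b_v = virtual_params
    meta_loss = F.cross_entropy(x_meta @ w_v.t() + b_v, y_meta)
    teacher_opt.zero_grad()
    meta_loss.backward()
    teacher_opt.step()

    # 3) Corrected student update: recompute the sample weights with the updated
    #    teacher and take the actual student step along the corrected gradient.
    losses = F.cross_entropy(student(x_train), y_train, reduction='none')
    with torch.no_grad():
        weights = teacher(losses.detach().unsqueeze(1)).squeeze(1)
    corrected_loss = (weights * losses).mean()
    student.zero_grad()
    corrected_loss.backward()
    with torch.no_grad():
        for p in student.parameters():
            p -= lr * p.grad
    print(f"step {step}: meta_loss = {meta_loss.item():.4f}")
```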



Abstract

The invention discloses a meta-learning algorithm based on stepwise gradient correction of a meta-learner. The algorithm comprises the following steps: first, obtaining training data with noisy labels and a small amount of clean, unbiased metadata; establishing a meta-learner (teacher network) on the metadata set, relative to the classifier (student network) established on the training dataset; jointly updating the student-network and teacher-network parameters using stochastic gradient descent; obtaining a gradient-update function for the student-network parameters through the student network's gradient-descent format; feeding this back to the teacher network and updating the teacher-network parameters using the metadata, so as to obtain a corrected gradient format for the student-network parameters; and then updating the student-network parameters using the corrected format. In this way, the student-network parameters learn along the corrected direction, and overfitting to noisy labels is alleviated. The method is easy to understand and implement, is interpretable to users, and is robustly applicable to real-world data scenarios containing noisy labels.
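For reference, a plausible formalization of these three steps is given below, assuming the meta-learner acts as a per-sample loss-weighting function V(·;θ); the summary does not state the patent's exact parametrization, so this form is an assumption.

```latex
% Notation: w = student (classifier) parameters, \theta = teacher (meta-learner) parameters,
% \alpha,\beta = learning rates, L_i^{tr} = loss of training sample i,
% L_j^{meta} = loss of metadata sample j, \mathcal{V}(\cdot;\theta) = assumed weighting function.

% 1) Student gradient-descent format: a virtual one-step update, kept as a function of \theta.
\hat{w}^{(t)}(\theta) = w^{(t)} - \alpha\,\frac{1}{n}\sum_{i=1}^{n}
  \mathcal{V}\!\big(L_i^{tr}(w^{(t)});\theta\big)\,
  \nabla_{w} L_i^{tr}(w)\big|_{w^{(t)}}

% 2) Teacher update on the clean metadata (meta-gradient through the virtual step).
\theta^{(t+1)} = \theta^{(t)} - \beta\,\nabla_{\theta}\,\frac{1}{m}\sum_{j=1}^{m}
  L_j^{meta}\!\big(\hat{w}^{(t)}(\theta)\big)\Big|_{\theta^{(t)}}

% 3) Corrected student update: the same gradient format, re-weighted by the updated teacher.
w^{(t+1)} = w^{(t)} - \alpha\,\frac{1}{n}\sum_{i=1}^{n}
  \mathcal{V}\!\big(L_i^{tr}(w^{(t)});\theta^{(t+1)}\big)\,
  \nabla_{w} L_i^{tr}(w)\big|_{w^{(t)}}
```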

Description

Technical Field

[0001] The invention belongs to the technical field of deep neural networks and relates to a meta-learning algorithm, in particular to a meta-learning algorithm based on stepwise gradient correction of a meta-learner.

Background Art

[0002] Deep neural networks have recently achieved remarkable results in a range of applications owing to their powerful ability to model complex input patterns. Nevertheless, deep neural networks are prone to overfitting training data that contain noisy annotations, which leads to poor generalization at prediction time. In practice, this robust-learning problem with noisy annotations is often unavoidable because high-quality annotations are scarce. Typical examples, such as data collection through crowdsourcing systems or search engines, often produce a large number of wrong labels and hence low-quality training data. Therefore, learning effectively from data containing noisy labels is a very important and challenging problem.


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N 3/04; G06N 3/08
Inventors: 孟德宇, 束俊, 徐宗本
Owner: XI AN JIAOTONG UNIV