Small sample learning method for external memory and meta-learning based on attention guidance

A learning method involving attention techniques, applied in the field of few-shot learning with external memory and meta-learning; it addresses the problem of an insufficient number of training samples

Pending Publication Date: 2020-09-22
GUANGDONG UNIV OF PETROCHEMICAL TECH


Problems solved by technology

[0007] In the field of machine learning, the problem of insufficient samples has become increasingly prominent. Techniques such as attention mechanisms, memory mechanisms, meta-learning, and variational inference already exist, but combining an attention mechanism with an external memory mechanism as a prior-knowledge memory, and then combining it with a meta-learning framework and variational inference to build a recognizer, has not been seen before. This combination effectively uses the prior-knowledge memory, together with attention and external memory, to compensate for the impact of small sample sizes.



Examples


Embodiment

[0097] A few-shot learning method based on attention-guided external memory and meta-learning, including the following steps:

[0098] Input: optimized network parameters Θ = {θ_1, θ_2, θ_3, φ}; the support set S = {(x_j, y_j)} under the new task; and the query set

[0099] Output: the prediction results of the query set

[0100] S1. For the support set and query set data, compute their feature representations and key index values as follows:

[0101]
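The formula image is not reproduced above, but step S1 can be sketched as follows. All names and shapes here are illustrative assumptions: `embed` stands in for the learned embedding (parameters θ), and `key_index` for the key computation (parameters φ) that produces the index used to address the external memory.

```python
import numpy as np

rng = np.random.default_rng(0)

def embed(x, theta):
    """Hypothetical embedding f_theta: maps a raw input to a feature vector."""
    return np.tanh(x @ theta)

def key_index(feature, phi):
    """Hypothetical key function: maps a feature to a normalised key
    used to address the external memory."""
    k = feature @ phi
    return k / (np.linalg.norm(k, axis=-1, keepdims=True) + 1e-8)

theta = rng.normal(size=(16, 8))      # stands in for the embedding parameters
phi = rng.normal(size=(8, 4))         # stands in for the key parameters φ

support_x = rng.normal(size=(5, 16))  # support-set inputs
query_x = rng.normal(size=(3, 16))    # query-set inputs

support_feat = embed(support_x, theta)
query_feat = embed(query_x, theta)
support_keys = key_index(support_feat, phi)
query_keys = key_index(query_feat, phi)
```

Normalising the keys is a design choice made here so that memory addressing in the later steps can use a simple similarity score.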

[0102] S2. For category c in the classifier, perform the following calculations:

[0103]

[0104]

[0105]
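The per-class formulas of step S2 are likewise not reproduced above. As one plausible instantiation, the classifier parameters (w_c, b_c) for category c can be built from the support features of that class; prototype-style averaging is an assumption for illustration, not the patent's stated formula.

```python
import numpy as np

def class_weight(features, labels, c):
    """Build (w_c, b_c) for category c from that class's support features.

    Prototype averaging is an illustrative assumption; the patent's exact
    per-class computation is not reproduced in this text.
    """
    feats_c = features[labels == c]
    w_c = feats_c.mean(axis=0)          # class prototype as the weight vector
    b_c = -0.5 * float(w_c @ w_c)       # bias so the linear score matches a
                                        # nearest-prototype comparison
    return w_c, b_c

# toy support set: 6 samples, 8-dim features, 3 classes
rng = np.random.default_rng(1)
features = rng.normal(size=(6, 8))
labels = np.array([0, 0, 1, 1, 2, 2])

w0, b0 = class_weight(features, labels, 0)
```

With this choice, argmax over w_c·x + b_c is equivalent to picking the nearest class prototype under Euclidean distance, a common construction in few-shot classifiers.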

[0106] S3. Construct the overall classifier as follows:

[0107] W = [w_1, ..., w_c, ..., w_C]

[0108] b = [b_1, ..., b_c, ..., b_C];
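Step S3 stacks the per-class parameters into one linear classifier. A minimal sketch, assuming the (w_c, b_c) pairs come out of step S2 (the values below are random placeholders):

```python
import numpy as np

rng = np.random.default_rng(2)
C, d = 3, 8                                # number of classes, feature dimension

# per-class weights and biases as step S2 might produce (illustrative values)
w_list = [rng.normal(size=d) for _ in range(C)]
b_list = [float(rng.normal()) for _ in range(C)]

W = np.stack(w_list)                       # W = [w_1, ..., w_C], shape (C, d)
b = np.array(b_list)                       # b = [b_1, ..., b_C], shape (C,)

query_feat = rng.normal(size=(4, d))       # 4 query features
scores = query_feat @ W.T + b              # linear classifier scores, shape (4, C)
pred = scores.argmax(axis=1)               # predicted class per query
```

The last two lines show how the assembled classifier would be applied to query features in the later steps.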

[0109] S4. Compute the attention weights ω_{q,i} and the reference value v_q obtained from the memory mechanism according to the attention:

[0110]

[0111]
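The attention formulas of step S4 are not reproduced above. The following sketch shows a standard attention-guided memory read consistent with the quantities named in the text: `omega` plays the role of ω_{q,i}, and `v_q` is the attention-weighted value retrieved from memory. Cosine-similarity scoring is an assumption, not the patent's stated formula.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D score vector."""
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def attend_memory(query_key, mem_keys, mem_values):
    """Attention-guided memory read: ω_{q,i} over slots, then v_q = Σ_i ω_{q,i} M_v[i]."""
    sims = (mem_keys @ query_key) / (
        np.linalg.norm(mem_keys, axis=1) * np.linalg.norm(query_key) + 1e-8)
    omega = softmax(sims)                  # ω_{q,i}, sums to 1 over memory slots
    v_q = omega @ mem_values               # reference value from the memory
    return omega, v_q

rng = np.random.default_rng(3)
mem_keys = rng.normal(size=(10, 4))        # 10 memory slots, 4-dim keys
mem_values = rng.normal(size=(10, 8))      # 8-dim stored values (prior knowledge)
q_key = rng.normal(size=4)                 # key of one query, as from step S1

omega, v_q = attend_memory(q_key, mem_keys, mem_values)
```

Because ω_{q,i} sums to one, v_q is a convex combination of stored values: similar prior knowledge contributes most, which is the mechanism the Abstract describes for compensating for small samples.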

[0112] S5. For the query set, compute its feature representation as follows:

[0113] ...



Abstract

The invention discloses a few-shot learning method based on attention-guided external memory and meta-learning, and belongs to the field of few-shot learning methods. The method combines an inference network, an attention mechanism, and a memory mechanism for the first time: few-shot learning is carried out within a meta-learning framework; an attention mechanism and an external memory mechanism are combined as a prior-knowledge memory; and the meta-learning framework is combined with variational inference to construct a recognizer. The prior-knowledge memory, combining attention and external memory, is used effectively to compensate for the influence of small sample sizes. Moreover, by jointly learning an embedding function, mapping representations into a latent space, storing them as values in the external memory, treating the memory contents as prior knowledge, and selecting similar knowledge from the external memory when a few-shot task arrives, the performance of the classifier under small-sample conditions is improved.

Description

Technical field

[0001] The present invention relates to the field of few-shot learning methods, and more particularly to a few-shot learning method based on attention-guided external memory and meta-learning.

Background technique

[0002] In the field of machine learning, with the emergence of more application scenarios, the problem of insufficient samples has become increasingly prominent. Therefore, how to learn from small samples has become an important research direction. The current mainstream approaches to learning with small samples include:

[0003] The first is the few-shot learning method based on the meta-learning framework: a meta-learning framework gives the classifier better generalization ability, so that it obtains better performance with small samples; for example, Jamal M A et al. proposed a new task-agnostic meta-learning (TAML) algorithm to avoid biased meta-learning and thereby improve generalization.

[0004] The second is t...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62; G06N20/00; G06F16/245; G06F16/22
CPC: G06N20/00; G06F16/2228; G06F16/245; G06F18/214
Inventor: 张磊, 甄先通, 李欣, 左利云, 陈林凯, 陈宏琼, 蔡泽涛
Owner GUANGDONG UNIV OF PETROCHEMICAL TECH