Few-sample learning method based on sample-level attention network

A few-sample learning and attention technique, applied to neural learning methods, biological neural network models, instruments, etc. It addresses problems such as the large amount of labeled data and high cost required by deep network models, and the resulting inability to apply deep learning in many fields.

Active Publication Date: 2020-11-24
FUZHOU UNIV

Problems solved by technology

[0002] In recent years, with the help of deep networks, computers have surpassed humans on image recognition tasks, but deep networks have three problems. First, training a deep network model requires a large amount of labeled data, so most image recognition techniques focus on object recognition with large-scale datasets. Second, in many real-world scenarios there is no large amount of labeled data, or the data must be labeled by domain experts, and the cost of obtaining large-scale labeled datasets is too high, which makes deep learning unusable in many fields. Third, deep learning requires a large amount of data and many training iterations to train a model, and when new categories appear the model must be retrained.
[0008] However, the above two methods share a common drawback: the network still needs to be fine-tuned when facing a new few-sample learning task.




Embodiment Construction

[0039] The present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0040] It should be pointed out that the following detailed description is exemplary and is intended to provide further explanation of the present application. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.

[0041] It should be noted that the terminology used here is only for describing specific implementations and is not intended to limit the exemplary implementations according to the present application. As used herein, unless the context clearly dictates otherwise, the singular forms are intended to include the plural forms, and it should also be understood that when the terms "comprising" and/or "including" are used in this specification, they indicate the presence of features, steps, operations, devices, components, and/or combinations thereof.



Abstract

The invention relates to a few-sample learning method based on a sample-level attention network. The method comprises the steps of: inputting the samples of a support set and the samples of a query set into a feature extraction module to obtain the corresponding support-set feature vectors and query-set feature vectors; inputting the support-set feature vectors of each class into a sample-level attention network module to obtain a class prototype for each class; calculating the distance between a query-set feature vector and each class prototype to obtain a probability distribution over the classes to which the query-set feature vector may belong; and jointly training the feature extraction module and the sample-level attention network module with the cross-entropy loss of the query set and the classification loss of the support set, obtaining the gradients that update the network through back-propagation. With this method, a new target task can be solved after learning from a large number of similar tasks, and no further model updating is needed for the target task itself.
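The pipeline described in the abstract (attention-weighted class prototypes from the support set, then a softmax over distances for the query) can be sketched in NumPy. This is a minimal illustrative sketch, not the patent's implementation: the attention scoring rule used here (dot product of each support feature with the class mean), the feature dimensionality, and the episode sizes are all assumptions.

```python
import numpy as np

def attention_prototype(support_feats):
    """Attention-weighted class prototype from k support feature vectors.

    Hypothetical scoring rule (an assumption, not the patent's exact
    attention): each support sample is scored by its dot product with
    the class mean, and the scores are softmax-normalized into weights.
    """
    mean = support_feats.mean(axis=0)          # (d,)
    scores = support_feats @ mean              # (k,) one score per sample
    weights = np.exp(scores - scores.max())    # stable softmax
    weights /= weights.sum()
    return weights @ support_feats             # (d,) weighted prototype

def query_class_probs(query_feat, prototypes):
    """Softmax over negative squared Euclidean distances to the prototypes."""
    d2 = ((prototypes - query_feat) ** 2).sum(axis=1)   # (n_classes,)
    logits = -d2
    p = np.exp(logits - logits.max())
    return p / p.sum()

# Toy 3-way 5-shot episode with 8-dimensional features (all sizes assumed).
rng = np.random.default_rng(0)
support = {c: rng.normal(loc=c, scale=0.5, size=(5, 8)) for c in range(3)}
protos = np.stack([attention_prototype(s) for s in support.values()])  # (3, 8)
query = rng.normal(loc=1.0, scale=0.5, size=8)   # drawn near class 1
probs = query_class_probs(query, protos)
print(np.round(probs, 3))
```

In a full system the feature vectors would come from a learned extraction module, and the attention weights would be produced by a trained sub-network rather than a fixed similarity to the mean; the distance-to-probability step, however, is the standard prototype-classification form.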

Description

Technical field
[0001] The invention relates to the technical field of sample image classification, and in particular to a few-sample learning method based on a sample-level attention network.


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62, G06K9/46, G06N3/08
CPC: G06N3/084, G06V10/40, G06F18/217, G06F18/214, G06F18/24
Inventor: 于元隆, 赵晓南
Owner: FUZHOU UNIV