Image classification method based on adversarial fusion multi-source transfer learning

A transfer learning and classification technology, applied in the field of image classification and, in particular, image classification in the absence of training data labels. It addresses the problem that extracting source domain features with a single shared network loses effective features of the source domain data and degrades the classification result, thereby improving classification accuracy.

Active Publication Date: 2020-10-02
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

[0007] Although the above method can perform image classification when target domain data labels are absent, it uses the same network to extract features from all source domain data. This loses some effective features of the source domain data and degrades the final classification result.



Examples


Embodiment Construction

[0031] The embodiments and effects of the present invention will be further described in detail below in conjunction with the accompanying drawings.

[0032] Referring to Figure 1, S1, ..., SN denote the N source domains, T denotes the target domain, F and F1, ..., FN denote the domain-shared subnetwork and the N domain-specific subnetworks respectively, D1, ..., DN denote the N domain discriminators, and C1, ..., CN denote the N classifiers. The specific implementation steps are as follows:

[0033] Step 1. Establish the feature extraction network composed of the domain-shared subnetwork F and the domain-specific subnetworks Fj.

[0034] The domain-shared subnetwork F is the residual neural network ResNet50 proposed by Kaiming He et al. The network consists of a convolutional layer followed by 4 residual blocks and aims to extract the underlying features shared by all domains;

[0035] There are N domain-specific subnetworks in total. Each subnetwork is a multi-layer neural network...
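As a rough illustration of this step, the sketch below builds such a feature extraction network in PyTorch: a ResNet50 backbone with its classification head removed plays the role of the domain-shared subnetwork F, and N small fully connected networks play the role of the domain-specific subnetworks Fj. The two-layer design and the 256-dimensional output of the domain-specific subnetworks are illustrative assumptions, not values fixed by the patent text.

```python
# Sketch of the Step 1 feature extraction network: a shared ResNet50 backbone F
# followed by N domain-specific subnetworks F_1..F_N.
# The 2-layer MLP design and feature size are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision import models


class FeatureExtractor(nn.Module):
    def __init__(self, num_domains, feat_dim=256):
        super().__init__()
        backbone = models.resnet50()          # domain-shared subnetwork F (randomly
        backbone.fc = nn.Identity()           # initialized here); drop the ImageNet head
        self.shared = backbone                # outputs 2048-d features shared by all domains
        # one domain-specific subnetwork F_j per source domain (assumed 2-layer MLP)
        self.specific = nn.ModuleList([
            nn.Sequential(nn.Linear(2048, feat_dim), nn.ReLU(inplace=True),
                          nn.Linear(feat_dim, feat_dim))
            for _ in range(num_domains)
        ])

    def forward(self, x, domain_idx):
        h = self.shared(x)                    # underlying features shared by all domains
        return self.specific[domain_idx](h)   # domain-specific features for domain j


# usage: extract features of a batch routed through domain-specific subnetwork j = 0
if __name__ == "__main__":
    net = FeatureExtractor(num_domains=3)
    images = torch.randn(4, 3, 224, 224)
    feats = net(images, domain_idx=0)         # shape (4, 256)
```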



Abstract

The invention discloses an image classification method based on adversarial fusion multi-source transfer learning, and mainly solves the problem of low image classification accuracy in the prior art. The implementation scheme comprises: 1) establishing a feature extraction network and extracting image features from the original image files; 2) inputting the image features into the domain-specific discriminators and classifiers, and computing the domain discrimination loss, the pseudo-labels of the target domain data and the classification loss of the source domain data; 3) computing the sum of the MMD distances of all classes between the source domains and the target domain using the target domain pseudo-labels and the source domain labels; 4) training the feature extraction network, the domain discriminators and the classifiers with the sum of the domain discrimination loss, the classification loss and the MMD distance; 5) inputting the sample to be classified into the trained feature extraction network, domain discriminators and classifiers in sequence, and outputting the class label of the sample. The method effectively improves the classification accuracy of various images, and can be used for image classification when training data labels are missing.
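As a rough illustration of steps 2) to 4), the sketch below computes a class-wise MMD term between one source domain and the target domain from the extracted features, using the source labels and the target pseudo-labels, and adds it to the classification and domain discrimination losses. The helper names (classwise_mmd, total_loss), the linear-kernel (class-mean) MMD estimate and the equal weighting of the three terms are illustrative assumptions; the patent text does not fix the kernel or the loss weights.

```python
# Sketch of steps 2)-4): class-wise MMD between source and target features plus
# the combined training objective. A linear-kernel MMD (distance between class
# means) and equal loss weights are illustrative assumptions.
import torch
import torch.nn.functional as F


def classwise_mmd(src_feat, src_labels, tgt_feat, tgt_pseudo, num_classes):
    """Sum over classes of the squared distance between the source and target
    class-mean embeddings (a linear-kernel MMD estimate)."""
    total = src_feat.new_zeros(())
    for c in range(num_classes):
        s = src_feat[src_labels == c]
        t = tgt_feat[tgt_pseudo == c]
        if len(s) == 0 or len(t) == 0:        # skip classes absent from the batch
            continue
        total = total + (s.mean(0) - t.mean(0)).pow(2).sum()
    return total


def total_loss(cls_logits, src_labels, dom_logits, dom_labels,
               src_feat, tgt_feat, tgt_pseudo, num_classes):
    cls_loss = F.cross_entropy(cls_logits, src_labels)   # source classification loss
    dom_loss = F.cross_entropy(dom_logits, dom_labels)   # domain discrimination loss
    mmd_loss = classwise_mmd(src_feat, src_labels, tgt_feat, tgt_pseudo, num_classes)
    return cls_loss + dom_loss + mmd_loss                 # equal weights assumed
```

In the adversarial setup described here, the domain discrimination term is optimized in a min-max fashion with respect to the feature extractor (for example via a gradient reversal layer or alternating updates), which this sketch leaves out for brevity.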

Description

Technical field

[0001] The invention belongs to the field of image recognition, and in particular relates to an image classification method which can be used for image classification in the absence of training data labels.

Background technique

[0002] Transfer learning "transfers" the knowledge and experience learned in one field to another, different but related, field to improve the learning efficiency of the model without starting the learning from scratch. Generally, the domain to be classified or predicted is called the "target domain", and the auxiliary domain with a large amount of labeled data is called the "source domain"; there are domain differences between the two. Studying image classification problems with transfer learning has achieved remarkable results both in China and abroad. Existing transfer learning methods can be classified into sample-based, feature-based and model-based methods.

[0003] Inspired by the two-person zero-sum game in game theory, some schol...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62; G06K9/46; G06N3/04; G06N3/08
CPC: G06N3/08; G06V10/40; G06N3/045; G06F18/24
Inventors: 方敏, 徐筱, 杜辉, 胡心钰, 李海翔, 郭龙飞
Owner: XIDIAN UNIV