Transfer learning method and device

A transfer learning and sampling technology applied in the field of machine learning. It addresses problems such as degraded system performance, deteriorating sample quality, and declining algorithm performance, achieving the effect of improved sample quality and accuracy.

Active Publication Date: 2015-05-13
HARBIN INST OF TECH SHENZHEN GRADUATE SCHOOL

AI Technical Summary

Problems solved by technology

In the early stage of the learning process, the algorithm's performance index improves because automatically labeled samples expand the training set; however, as misclassified samples accumulate, sample quality deteriorates and the algorithm's performance declines in the middle and late stages of the learning process.
For practical tasks

Method used



Examples


Embodiment 1

[0021] As shown in Figure 1, the transfer learning method of this embodiment includes steps S10 to S40.

[0022] Step S10 is an initialization step, in which the parameters related to transfer learning are set and initialized. For example, the input parameters of transfer learning are set and initialized, including the labeled source-distribution data L, the unlabeled target-distribution data U, the automatically labeled data set of past cycles TS_c = φ, the automatically labeled data set of the current cycle TS_l = φ, the iteration period T for error detection, the total number of transfer-learning iterations (the total iteration count) K, the numbers p and q of positive and negative samples automatically labeled in each iteration, the current iteration count I, the error bound ε_pre estimated in the past cycle, the error bound ε_next estimated in the current cycle, and so on.
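As a concrete illustration of step S10, the following Python sketch initializes the parameters listed above. The container names and default values are illustrative assumptions; the patent describes these parameters only in prose. The labeled source-distribution data L and the unlabeled target-distribution data U are assumed to be kept separately as arrays.

```python
# Hypothetical initialization of the step-S10 parameters (names and defaults
# are illustrative, not taken from the patent).
from dataclasses import dataclass, field

@dataclass
class TransferConfig:
    T: int = 5        # iteration period for error detection (length of one cycle)
    K: int = 50       # total number of transfer-learning iterations
    p: int = 10       # positive samples automatically labeled per iteration
    q: int = 10       # negative samples automatically labeled per iteration

@dataclass
class TransferState:
    TS_c: list = field(default_factory=list)  # auto-labeled data of past cycles (initially empty)
    TS_l: list = field(default_factory=list)  # auto-labeled data of the current cycle (initially empty)
    I: int = 0                                # current iteration count
    eps_pre: float = float("inf")             # error bound estimated in the past cycle
    eps_next: float = float("inf")            # error bound estimated in the current cycle
```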

[0023] Step S20 is a sample acquisition step, that is, the transfer learning iteration is started...
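The text of step S20 is truncated here, so the following is only a hedged sketch of one plausible sample-acquisition iteration: train a classifier on L plus the retained auto-labeled data, then auto-label the p most confident positives and q most confident negatives from U. The classifier choice (logistic regression) and the confidence ranking are assumptions, not the patent's prescribed implementation; cfg and state refer to the containers sketched under step S10.

```python
# Hedged sketch of one sample-acquisition iteration (step S20); details assumed.
import numpy as np
from sklearn.linear_model import LogisticRegression

def acquire_samples(X_L, y_L, X_U, state, cfg):
    """Train on L plus the auto-labeled samples, then auto-label the p most
    confident positives and q most confident negatives from U."""
    X_auto = [x for x, _ in state.TS_c + state.TS_l]
    y_auto = [y for _, y in state.TS_c + state.TS_l]
    if X_auto:
        X_train = np.vstack([X_L, np.array(X_auto)])
        y_train = np.concatenate([y_L, np.array(y_auto)])
    else:
        X_train, y_train = X_L, y_L

    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    proba = clf.predict_proba(X_U)[:, 1]          # confidence of the positive class

    pos_idx = np.argsort(-proba)[:cfg.p]          # most confident positives
    neg_idx = np.argsort(proba)[:cfg.q]           # most confident negatives
    for i in pos_idx:
        state.TS_l.append((X_U[i], 1))
    for i in neg_idx:
        state.TS_l.append((X_U[i], 0))

    # remove the newly auto-labeled points from the unlabeled pool
    keep = np.setdiff1d(np.arange(len(X_U)), np.concatenate([pos_idx, neg_idx]))
    return clf, X_U[keep]
```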

Embodiment 2

[0084] The transfer learning method of this embodiment is basically the same as that of Embodiment 1. The difference is that, in the cycle calculation step, Embodiment 1 calculates the error rate with a statistics-based KNN graph model method, whereas this embodiment uses a cross-validation based classification method. Specifically, the cycle calculation step of this embodiment includes: taking the automatically labeled data obtained after each iteration as samples, and dividing all samples of the current iteration cycle into at least two sets, one of which serves as the test set while the remaining sets serve as the training set; using cross-validation to obtain the classification error probability of each sample, which is equivalent to the error rate of Embodiment 1; and then, according to the calculated classification error probability of each sample in the current iteration cycle, calculating the error bound of the current iteration cy...
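A minimal sketch of the cross-validation check described in [0084], assuming scikit-learn and a logistic-regression base classifier; the number of folds and the way per-sample error probabilities are aggregated into an error bound are assumptions, since the end of the paragraph is truncated.

```python
# Illustrative sketch of Embodiment 2's cross-validation based error check.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict

def cycle_error_bound(cycle_samples, n_splits=2):
    """Estimate the classification error probability of the auto-labeled samples
    in one cycle via cross-validation and aggregate it into an error bound."""
    X = np.array([x for x, _ in cycle_samples])
    y = np.array([lbl for _, lbl in cycle_samples])

    # Split the cycle's samples into folds; each fold is tested once while the
    # remaining folds serve as the training set, as described in [0084].
    y_pred = cross_val_predict(LogisticRegression(max_iter=1000), X, y, cv=n_splits)
    per_sample_error = (y_pred != y).astype(float)   # 0/1 error indicator per sample

    # One simple choice of error bound: the mean misclassification rate.
    return per_sample_error.mean()
```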



Abstract

The invention discloses a transfer learning method and device. The method comprises: setting relevant parameters of transfer learning and performing initialization; starting the transfer learning iteration to obtain automatically annotated data; when the iteration count completes an iteration cycle, performing error detection on the automatically annotated data of that cycle, taken as samples, to determine the relative sample quality of the cycle; according to the relative sample quality, determining whether to delete or retain the samples and whether to continue the transfer learning iteration; and, when the transfer learning iteration is stopped, outputting the retained samples and a transfer classifier. The transfer learning method has the advantage that the learning process is divided by iteration cycle, error detection is performed at the end of every full cycle, and the samples are screened according to the relative sample quality determined through error detection, so that low-quality samples can be eliminated. This improves the sample quality of the automatically annotated data during transfer learning and thereby improves the accuracy of a system applying the transfer learning method.
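To make the cycle-wise control flow of the abstract concrete, the sketch below models the "relative sample quality" test as a comparison of the current cycle's error bound ε_next against the previous cycle's ε_pre (the parameters named in Embodiment 1); this is an assumption about how the decision is made, not a verbatim reproduction of the claims. The acquire_fn and estimate_error_bound callables stand for steps such as those sketched above.

```python
# Hedged sketch of the overall loop: iterate, and after every full cycle of
# cfg.T iterations run error detection to decide whether to keep the cycle's
# auto-labeled samples and whether to keep iterating.
def transfer_learning_loop(X_L, y_L, X_U, cfg, state, acquire_fn, estimate_error_bound):
    clf = None
    while state.I < cfg.K and len(X_U) > 0:
        clf, X_U = acquire_fn(X_L, y_L, X_U, state, cfg)   # sample acquisition (step S20)
        state.I += 1

        if state.I % cfg.T == 0:                           # a full error-detection cycle
            state.eps_next = estimate_error_bound(state.TS_l)
            if state.eps_next <= state.eps_pre:
                # quality did not degrade: retain the cycle's samples and continue
                state.TS_c.extend(state.TS_l)
                state.eps_pre = state.eps_next
            else:
                # quality degraded: discard the cycle's samples and stop iterating
                break
            state.TS_l = []

    # output the retained samples and the transfer classifier
    return state.TS_c, clf
```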

Description

technical field
[0001] The present invention relates to the field of machine learning, and in particular to a transfer learning method and device.
Background technique
[0002] In traditional classification learning, two basic assumptions are made to ensure the accuracy and reliability of the trained classification model: (1) the training samples used for learning and the new test samples satisfy the condition of independent and identical distribution; and (2) there must be enough available training samples to learn a good classification model. In practical applications, however, these two conditions are often not satisfied. Transfer learning, an important branch of machine learning, relaxes these two basic assumptions of traditional machine learning. Transfer learning is mainly aimed at acquiring knowledge and training related models from the resource-rich source domain and the target domain, and then solving the problem of the target domain with ...

Claims


Application Information

IPC(8): G06K 9/66; G06F 17/30
CPC: G06F 18/217; G06F 18/214
Inventor: 桂林, 徐睿峰, 陆勤, 周俞
Owner: HARBIN INST OF TECH SHENZHEN GRADUATE SCHOOL