Training sample annotation cost reduction method for transfer learning
A transfer-learning technology in the field of artificial intelligence, applied to reduce the cost of annotating training samples. It addresses the problems that a poorly constructed common feature space degrades the final model's learning effect, that source samples may not truly represent target task samples, and that this in turn harms target task learning. The method achieves a high-quality labeled sample set at reduced labeling cost.
Examples
Embodiment 1
[0039] In this embodiment, the technical solution provided by the present invention is applied to obtain labeled data for model training in a machine vision task of recognizing pictures of cats and dogs.
[0040] Preparation and definitions before implementation:
[0041] Source task: CIFAR-10 classification.
[0042] Source task labeled sample set: the open dataset CIFAR-10, which contains 10 object categories (including cat and dog), each with 6,000 color images of size 32×32.
[0043] Source task model: a 10-class classification model trained on the CIFAR-10 dataset; the model is a deep neural network with the VGG16 architecture.
[0044] Target task: to accurately determine whether the animal in an input picture from an Internet pet website is a cat, a dog, or something else (3 categories).
[0045] Target task unlabeled sample set: 50,000 color pet pictures of varying sizes downloaded from the pet website, most of w...
Implementation steps
[0051] Start the implementation (as shown in Figure 2):
[0052] (1) Take the cat and dog pictures out of the source task's labeled sample set to form a candidate sample set S. Another feasible approach is to remove the non-animal categories (airplane, automobile, ship, truck) from the source task's labeled sample set and then uniformly relabel the remaining non-cat, non-dog categories (bird, deer, frog, horse) as "Other"; the result again forms the candidate sample set S.
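Step (1) can be sketched as follows. This is a minimal illustration with placeholder (image, label) pairs standing in for the real CIFAR-10 records; the function name and the toy data are assumptions, not part of the patent.

```python
def build_candidate_set(labeled_samples, keep_only_cats_dogs=True):
    """Sketch of step (1): build the candidate sample set S.

    labeled_samples: list of (image, label) pairs from the source task.
    Variant A (keep_only_cats_dogs=True): keep only 'cat' and 'dog' samples.
    Variant B: drop the non-animal classes and relabel the remaining
    non-cat/dog animals uniformly as 'other'.
    """
    non_animal = {"airplane", "automobile", "ship", "truck"}
    S = []
    for image, label in labeled_samples:
        if keep_only_cats_dogs:
            if label in ("cat", "dog"):
                S.append((image, label))
        else:
            if label in non_animal:
                continue  # remove non-animal categories entirely
            S.append((image, label if label in ("cat", "dog") else "other"))
    return S

# Toy usage with placeholder "images":
samples = [("img0", "cat"), ("img1", "truck"), ("img2", "frog"), ("img3", "dog")]
print(build_candidate_set(samples))  # [('img0', 'cat'), ('img3', 'dog')]
print(build_candidate_set(samples, keep_only_cats_dogs=False))
```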
[0053] (2) Use the common feature mapping function to map all samples in the candidate sample set S to the common feature space. That is, feed each sample in S to the source task model and extract the corresponding output of the model's feature-extraction network module; all of these outputs form a new set.
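A minimal sketch of this mapping step. In the embodiment, the feature-extraction module would be the convolutional stack of the VGG16 source model; here a hypothetical stand-in function produces toy feature vectors, so only the structure of the step (sample in, feature vector out, collected into a new set) is illustrated.

```python
def feature_extractor(image):
    """Placeholder for the source model's feature-extraction network.
    Maps one sample to a fixed-length feature vector. These hash-based
    toy values are NOT real CNN features."""
    return [float(hash((image, i)) % 100) / 100.0 for i in range(4)]

def map_to_common_space(candidate_set, extractor):
    """Sketch of step (2): run every sample in S through the extractor;
    the outputs together form the new set in the common feature space."""
    return [extractor(image) for image, _label in candidate_set]

S = [("img_cat_0", "cat"), ("img_dog_0", "dog")]
features = map_to_common_space(S, feature_extractor)
print(len(features), len(features[0]))  # 2 4
```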
[0054] (3) Select N=ρ*T samples from the target task unlabeled sample set to form the target task sample set U to be labeled....
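Step (3) can be sketched as below. The symbols ρ and T come from the excerpt; because the paragraph is truncated, the selection strategy (uniform random here) and the function name are assumptions for illustration only.

```python
import random

def select_to_label(unlabeled_pool, rho, T, seed=0):
    """Sketch of step (3): draw N = rho * T samples from the target task's
    unlabeled pool to form the to-be-labeled set U. Uniform random
    selection without replacement is an assumed stand-in strategy."""
    N = round(rho * T)
    rng = random.Random(seed)  # fixed seed for reproducibility
    return rng.sample(unlabeled_pool, min(N, len(unlabeled_pool)))

pool = [f"pet_{i}.jpg" for i in range(50000)]  # the 50,000 downloaded pet pictures
U = select_to_label(pool, rho=1.2, T=1000)
print(len(U))  # 1200
```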


