
A Collaborative Training Method Based on Consistency Judgment of Unlabeled Samples

A collaborative training method for unlabeled samples, applied in the field of multi-view learning. It addresses the problems that multi-angle remote sensing increases the difficulty of jointly analyzing ground objects in the same area and that the classification accuracy of multi-angle remote sensing images is low, with the effect of improving classification accuracy.

Active Publication Date: 2021-12-31
HARBIN INST OF TECH
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present invention is to solve the problem that existing multi-angle remote sensing increases the difficulty of jointly analyzing ground objects in the same area, especially the difficulty of analyzing changes in ground objects, which results in low classification accuracy of multi-angle remote sensing images. To this end, the invention proposes a collaborative training method based on consistency judgment of unlabeled samples (Co-Training with Unlabeled Sample's Consistency, hereinafter referred to as CO-USC).


Examples


Specific Embodiment 1

[0025] Specific Embodiment 1: The specific process of a collaborative training method based on consistency judgment of unlabeled samples in this embodiment is as follows:

[0026] Consistency judgment means comparing the difference in a classifier's classification performance before and after unlabeled samples are added, and determining the confidence of an unlabeled sample from that comparison. The underlying idea is simple: if a sample and a label are added to a classifier's training set and the classifier's classification results remain exactly the same, the sample can be considered to correspond to that label completely. In other words, the closer the classifier's performance is before and after a new training sample and its label are added, the higher the confidence that the sample matches the corresponding label. However, this idea is of limited value for ordinary single-view samples, al...
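The idea in paragraph [0026] can be sketched as follows. This is a minimal illustration, not the patent's implementation: the 1-nearest-neighbour base classifier, the function names, and the reference set `X_ref` used to compare predictions are all illustrative assumptions.

```python
import numpy as np

def one_nn_predict(X_train, y_train, X):
    """Toy 1-nearest-neighbour stand-in for the base classifier."""
    d = ((X[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    return y_train[np.argmin(d, axis=1)]

def usc_score(X_lab, y_lab, x_u, y_pseudo, X_ref):
    """Consistency judgment for one unlabeled sample: the fraction of
    reference predictions that change after the pair (x_u, y_pseudo) is
    added to the training set. A lower score means the classifier behaves
    more consistently, i.e. higher confidence in the pseudo-label."""
    before = one_nn_predict(X_lab, y_lab, X_ref)
    X_aug = np.vstack([X_lab, x_u[None, :]])
    y_aug = np.append(y_lab, y_pseudo)
    after = one_nn_predict(X_aug, y_aug, X_ref)
    return float(np.mean(before != after))
```

With two well-separated training points, a pseudo-label consistent with the local class structure leaves all reference predictions unchanged (score 0), while a contradictory pseudo-label flips some of them (score above 0).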

Specific Embodiment 2

[0038] Specific Embodiment 2: This embodiment differs from Embodiment 1 in that the classifier in step 1 is a supervised or semi-supervised classifier.

[0039] Other steps and parameters are the same as those in Embodiment 1.

Specific Embodiment 3

[0040] Specific Embodiment 3: This embodiment differs from Embodiment 1 or 2 in that, in step 2, the confidence and pseudo-label of each unlabeled sample in the captured image are determined, and trusted samples are selected from the unlabeled samples in the captured image according to their confidence and pseudo-labels; the specific process is:

[0041] Step 2.1:

[0042] Take the first view as the main view and the remaining views as non-main views.

[0043] Perform the USC judgment in each non-main view to obtain the non-main-view USC sequences; there are N-1 such sequences. Superimpose the non-main-view USC sequences, and obtain the USC confidence of the main view's unlabeled samples from the superimposed sequence (the lower the value in the superimposed sequence, the higher the USC confidence); (USC judgment is performed in the non...
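The superposition step of paragraph [0043] can be sketched as below. Summation of the N-1 sequences and the `n_trusted` cut-off are assumptions for illustration; the patent only states that sequences are superimposed and that lower superimposed values mean higher USC confidence.

```python
import numpy as np

def superimpose_usc(usc_sequences, n_trusted):
    """Superimpose the N-1 non-main-view USC sequences (one score per
    unlabeled sample per non-main view) by summation. Lower superimposed
    value = higher USC confidence; return the indices of the n_trusted
    most confident unlabeled samples for the main view."""
    stacked = np.sum(np.asarray(usc_sequences, dtype=float), axis=0)
    order = np.argsort(stacked)  # ascending: most consistent first
    return order[:n_trusted]
```

For example, with two non-main-view sequences `[0.1, 0.5, 0.0]` and `[0.2, 0.4, 0.1]`, the superimposed sequence is `[0.3, 0.9, 0.1]`, so samples 2 and 0 are the two most trusted.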



Abstract

A collaborative training method based on consistency judgment of unlabeled samples; the invention relates to collaborative training and multi-angle image classification. The purpose of the invention is to solve the problem that existing multi-angle remote sensing increases the difficulty of jointly analyzing ground objects in the same area, especially the difficulty of analyzing changes in ground objects, which results in low classification accuracy of multi-angle remote sensing images. The process is as follows: 1. Carry out a preliminary classification. 2. Select trusted samples from the unlabeled samples in the captured image. 3. Obtain a retrained classifier for this view; repeat until the classifiers corresponding to all views have been retrained. 4. Obtain the classification results of this view; repeat until the classifiers corresponding to all views have reclassified. 5. Repeat steps 2, 3 and 4 until the iteration termination condition is met, obtain the classification results of each view, perform voting, and use the label with the highest vote rate as the label of the unlabeled samples in the captured image. The invention is used in the field of digital image processing.
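The final voting step of the abstract (step 5) can be sketched as a per-sample majority vote over the per-view classification results. This is an illustrative sketch; the function name and the tie-breaking rule (lowest label wins on ties) are assumptions not specified by the patent.

```python
import numpy as np

def vote_labels(view_predictions):
    """Per-sample majority vote over the classification results of all
    views; the label with the highest vote rate becomes the final label
    of each unlabeled sample."""
    P = np.asarray(view_predictions)  # shape: (n_views, n_samples)
    out = np.empty(P.shape[1], dtype=P.dtype)
    for j in range(P.shape[1]):
        labels, counts = np.unique(P[:, j], return_counts=True)
        out[j] = labels[np.argmax(counts)]  # ties: lowest label wins
    return out
```

For instance, with three views predicting `[0, 1, 1]`, `[0, 1, 0]` and `[1, 1, 0]` for three samples, the voted labels are `[0, 1, 0]`.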

Description

Technical field

[0001] The invention belongs to the field of digital image processing, relates to collaborative training and multi-angle image classification, and is a multi-view learning method.

Background technique

[0002] Feature data obtained from different levels of the same object, or obtained through different channels, are generally called multi-view data. Multi-view learning usually follows two principles: the consistency principle and the complementarity principle. The consistency principle means that different views of the same object are related to each other; the complementarity principle means that different views of the same object differ and can serve as complementary features. Existing multi-view learning algorithms fall mainly into three categories: collaborative training, subspace learning and multi-kernel learning. Among them, the collaborative training algorithm learns two or more diffe...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/62
CPC: G06F18/2155; G06F18/24
Inventor: 谷延锋, 李天帅
Owner: HARBIN INST OF TECH