
Training method for classification model, and device and computer server thereof

A classification model and classifier technology applied in the field of deep learning. It addresses the narrow application scope and low computational efficiency of existing classification models, achieving a wide application range and improved training efficiency.

Active Publication Date: 2018-10-16
BEIJING TUSEN ZHITU TECH CO LTD
Cites: 8 · Cited by: 24

AI Technical Summary

Problems solved by technology

[0006] In view of the above problems, the present invention provides a classification model training method, a corresponding device, and a computer server, to solve the prior art's technical problems of low computational efficiency and narrow application scope when training classification models with semi-supervised learning techniques.



Examples


Embodiment 1

[0035] See Figure 1, a flowchart of a training method for a classification model in an embodiment of the present invention. The method includes:

[0036] Step 101: construct an initial classification model, where the initial classification model includes at least one single-modal classification model sharing the same classification task, and where the modal data training set corresponding to each single-modal classification model includes both labeled training data and unlabeled training data;

[0037] Step 102: train the initial classification model using the modal data training set corresponding to each single-modal classification model, based on aligning the feature encoding distributions of the labeled training data and the unlabeled training data in each set, to obtain a target classification model.

[0038] In this embodiment of the present invention, each single-modal classification model in the init...
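To make steps 101 and 102 concrete, the following is a minimal PyTorch sketch of one single-modal branch of the initial classification model. All names and layer sizes (UnimodalModel, in_dim, feat_dim, the MLP components) are illustrative assumptions, not details taken from the patent.

```python
import torch
import torch.nn as nn

class UnimodalModel(nn.Module):
    """One single-modal branch: a feature encoder plus a classifier and a
    discriminator, each cascaded with the encoder (cf. Figures 2 and 3)."""

    def __init__(self, in_dim: int, feat_dim: int, num_classes: int):
        super().__init__()
        # Feature encoder: maps raw modal data to a feature encoding.
        self.encoder = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        # Classifier: predicts the class label from the feature encoding.
        self.classifier = nn.Linear(feat_dim, num_classes)
        # Discriminator: judges whether a feature encoding comes from
        # labeled (output near 1) or unlabeled (output near 0) training data.
        self.discriminator = nn.Sequential(nn.Linear(feat_dim, 1), nn.Sigmoid())

    def forward(self, x: torch.Tensor):
        z = self.encoder(x)
        return z, self.classifier(z), self.discriminator(z)
```

The initial classification model of step 101 would hold one such branch per modality; training per step 102 then combines the usual classification loss on labeled data with aligning the labeled and unlabeled feature encoding distributions, as the examples below detail.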

Example 1

[0041] In Example 1, the structure of the initial classification model may, as shown in Figure 2, contain only one single-modal classification model, or, as shown in Figure 3, contain two or more single-modal classification models. In either structure, each single-modal classification model includes a feature encoder together with a classifier and a discriminator each cascaded with the feature encoder. The discriminator judges whether the feature encoding output by the feature encoder comes from labeled training data or from unlabeled training data. The output of the discriminator is provided with a first loss function for training the discriminator and a second loss function for training the feature encoder, and the first loss function and the second loss function are set adversarially.
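As a hedged sketch of this adversarial pair of losses, assuming binary cross-entropy and the UnimodalModel sketched under Embodiment 1: the detach() call and the "make unlabeled look labeled" form of the second loss are common GAN-style choices, not details specified in the patent text.

```python
import torch
import torch.nn.functional as F

def adversarial_losses(model, x_labeled: torch.Tensor, x_unlabeled: torch.Tensor):
    """First loss trains the discriminator; second loss trains the encoder."""
    z_l = model.encoder(x_labeled)
    z_u = model.encoder(x_unlabeled)

    # First loss: the discriminator learns to tell labeled (target 1) from
    # unlabeled (target 0) feature encodings. detach() keeps this loss from
    # updating the encoder.
    d_l = model.discriminator(z_l.detach())
    d_u = model.discriminator(z_u.detach())
    loss_disc = (F.binary_cross_entropy(d_l, torch.ones_like(d_l))
                 + F.binary_cross_entropy(d_u, torch.zeros_like(d_u)))

    # Second loss: the encoder is trained so that unlabeled encodings are
    # judged "labeled", i.e. it tries to fool the discriminator, which
    # aligns the two feature encoding distributions.
    d_u_enc = model.discriminator(z_u)
    loss_enc = F.binary_cross_entropy(d_u_enc, torch.ones_like(d_u_enc))
    return loss_disc, loss_enc
```

In an alternating scheme, loss_disc would step only the discriminator's optimizer and loss_enc only the encoder's, as in GAN training.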

[0042] In Example 1, step 102 uses the modal data training sets corresponding to each single-modal classification model to train...

Example 2

[0054] The structure of the initial classification model may be set as shown in Figure 7. Each single-modal classification model includes a feature encoder together with a classifier and a discriminator each cascaded with the feature encoder. The discriminator judges whether the feature encoding output by the feature encoder comes from labeled training data or from unlabeled training data; the output of the discriminator is provided with a first loss function for training the discriminator and a second loss function for training the feature encoder, and the two loss functions are set adversarially. In addition, the feature encoders of the multiple single-modal classification models are all connected to the same cross-modal discriminator, which is used to distinguish the modal type corresponding to the feature encoding output by each single-modal classification model's feature encoder,...
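The cross-modal discriminator can be sketched in the same assumed PyTorch style: a multi-class discriminator predicts which modality produced a feature encoding, and the encoders are trained to confuse it. The uniform-prediction confusion objective below is one common choice; the excerpt does not spell out the exact form of the encoders' loss.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CrossModalDiscriminator(nn.Module):
    """Predicts which modality a feature encoding was produced from."""

    def __init__(self, feat_dim: int, num_modalities: int):
        super().__init__()
        self.net = nn.Linear(feat_dim, num_modalities)  # logits over modalities

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.net(z)

def cross_modal_losses(disc: CrossModalDiscriminator, encodings: list):
    """encodings: one feature-encoding tensor per single-modal branch."""
    # Discriminator loss: identify the modality index of each encoding
    # (encodings are detached so this loss leaves the encoders untouched).
    loss_disc = sum(
        F.cross_entropy(disc(z.detach()),
                        torch.full((z.size(0),), i, dtype=torch.long))
        for i, z in enumerate(encodings))

    # Encoder loss: push the discriminator toward uniform predictions so the
    # encodings become modality-indistinguishable.
    loss_enc = sum(-F.log_softmax(disc(z), dim=1).mean() for z in encodings)
    return loss_disc, loss_enc
```

Minimizing loss_enc maximizes the mean log-probability over all modality classes, which is largest when the discriminator's output is uniform, i.e. when the feature encodings carry no modality-identifying information.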



Abstract

The invention discloses a training method for classification models, together with a device and a computer server, to solve the technical problems of the low computational efficiency and narrow application range of prior-art semi-supervised learning techniques for training classification models. The method comprises the steps of: constructing an initial classification model, wherein the initial classification model comprises at least one single-modal classification model sharing the same classification task, and the modal data training set corresponding to each single-modal classification model comprises labeled training data and unlabeled training data; and training the initial classification model to obtain a target classification model, based on a method of aligning the feature encoding distributions of the labeled training data and the unlabeled training data in the modal data training set of each single-modal classification model. The training method can improve the training efficiency of classification models and has a wider application range.

Description

Technical Field

[0001] The invention relates to the field of deep learning, and in particular to a classification model training method, a classification model training device, and a computer server.

Background

[0002] Training a neural network usually requires a large amount of labeled sample data: a large amount of sample data must first be collected, and the collected data must then be manually labeled to obtain labeled samples for training. Both collection and labeling incur high human and time costs.

[0003] To alleviate this problem, neural networks are now also trained on training data sets containing both labeled and unlabeled training data. Such training needs far less labeled data, reducing the dependence on large quantities of labeled samples and the high labeling costs in time and manpower of the prior art.

[0004] The existi...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06K9/62
CPCG06F18/217G06F18/214
Inventor 王乃岩樊峻崧
Owner BEIJING TUSEN ZHITU TECH CO LTD