A Multi-label Classification Learning Method Based on Matching Learning
A learning method in the field of extremely large-scale multi-label classification. It addresses the problem that existing methods cannot handle long-tail labels well, and achieves the effects of supporting online incremental learning and guaranteeing parallel learning.
Examples
Embodiment 1
[0054] This embodiment provides an extremely large-scale multi-label classification learning method based on matching learning. As shown in Figure 1, the specific steps are as follows:
[0055] Step 1. Collect user data on the Internet, including user tags.
[0056] Step 2. Extract features from the user text, images, and other data, and compute feature values to obtain the data set D = {(x_1, w_1, y_1), ..., (x_n, w_n, y_n)}, where x is the feature set, w is the corresponding feature-value set, and y is the label set.
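A minimal sketch of Step 2, assembling the training set D as (x, w, y) triples. The tokenization and count-based feature values here are illustrative assumptions; the method only requires a feature set, a matching feature-value set, and a label set per example.

```python
# Sketch: build D = {(x_i, w_i, y_i)} from raw user records.
# Count-based feature values stand in for whatever feature-value
# calculation the method actually uses (an assumption).

def build_dataset(records):
    """records: list of (text, labels) pairs -> list of (x, w, y) triples."""
    dataset = []
    for text, labels in records:
        tokens = text.lower().split()
        x = sorted(set(tokens))               # x: distinct features
        w = [tokens.count(t) for t in x]      # w: corresponding feature values
        y = set(labels)                       # y: label set
        dataset.append((x, w, y))
    return dataset

D = build_dataset([("cats and dogs and cats", {"pets"}),
                   ("stock market news", {"finance", "news"})])
```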
[0057] Step 3. Randomly sample mini-batches from the data set for mini-batch gradient descent, in preparation for optimizing the parameters of the multi-label model. The specific steps are as follows:
[0058] Step 301. Perform random shuffling on the data set D.
[0059] Step 302. Traverse the shuffled data set with a step size M, generating a mini-batch D_m at each step.
[0060] Step 303. For D_m, randomly sample N sets of negative la...
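Steps 301-303 above can be sketched as follows. The mini-batch size M, the number of sampled negatives N per example, and the label universe are assumed parameters; the text is truncated before specifying how the negative label sets are used.

```python
import random

def mini_batches(D, M, N, all_labels, seed=0):
    """Step 301: shuffle D; Step 302: traverse with step size M, producing
    a mini-batch D_m at each step; Step 303: sample N negative labels
    (labels not in the example's positive set) for each example in D_m."""
    rng = random.Random(seed)
    D = list(D)
    rng.shuffle(D)                      # Step 301: random shuffling
    for start in range(0, len(D), M):   # Step 302: step size M
        D_m = D[start:start + M]
        # Step 303: negatives drawn from labels outside each positive set y
        negatives = [rng.sample(sorted(all_labels - y), N)
                     for _, _, y in D_m]
        yield D_m, negatives
```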
Embodiment 2
[0078] This embodiment provides a multi-label classification learning method based on matching learning, including the following steps:
[0079] S1: Collect client data from the Internet, perform feature value calculation on the client data, and obtain a training set D;
[0080] S2: Traverse the training set D, and construct the negative label set and positive label set for the training set D;
[0081] S3: Compute the embedded representation E of the feature set in the training set D;
[0082] S4: Compute the embedded representation Z+ of the positive label set and the embedded representation Z- of the negative label set;
[0083] S5: Perform a loss calculation comparing the embedded representation E with Z+ to obtain the positive label loss value, and comparing E with Z- to obtain the negative label loss value;
[0084] S6: According to the positive label loss value and the negative ...
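Steps S3-S6 can be sketched with a dot-product matching score and a logistic matching loss: positive label embeddings should score high against the feature embedding E, negative ones low. The embeddings, the dot-product score, and the logistic loss form are illustrative assumptions, since the text is truncated before specifying them.

```python
import math

def matching_losses(E, Z_pos, Z_neg):
    """S5 (sketch): score each label embedding against the feature
    embedding E with a dot product, then apply a logistic loss."""
    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))
    # positive label loss: -log sigmoid(E . z), small when score is high
    pos_loss = sum(math.log(1 + math.exp(-dot(E, z))) for z in Z_pos)
    # negative label loss: -log sigmoid(-E . z), small when score is low
    neg_loss = sum(math.log(1 + math.exp(dot(E, z))) for z in Z_neg)
    return pos_loss, neg_loss

# S6 (sketch): combine the two loss values into the objective to minimize
E = [0.5, -0.2]
pos, neg = matching_losses(E, Z_pos=[[1.0, 0.0]], Z_neg=[[-1.0, 0.0]])
total = pos + neg
```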