
Model training method, text classification method and related devices

A model training technology in the field of machine learning. It addresses problems such as the poor training effect of models under weakly supervised learning and the large manpower and material cost of labeling training samples, so as to reduce the learning of erroneous labels and inaccurate sample information and improve the training effect.

Pending Publication Date: 2019-11-19
NEW H3C BIG DATA TECH CO LTD
Cites: 5 · Cited by: 9

AI Technical Summary

Problems solved by technology

[0003] In practice, however, labeling training samples consumes considerable manpower and material resources, so labeled training samples are always scarce. More often, training samples are unlabeled, carry only coarse-grained labels, or carry labels that are not necessarily all true values. For this reason, weakly supervised learning is generally used to overcome the scarcity of training samples labeled with ground-truth labels.
[0004] In weakly supervised learning such as inaccurate-supervision learning, however, many of the labels on the large number of training samples used during training may be untrue, so the model cannot properly learn the correct features of the training samples. As a result, the training effect of a model under inaccurate supervision is often poor.

Method used




Embodiment Construction

[0041] To make the purposes, technical solutions, and advantages of the embodiments of this application clearer, the technical solutions in the embodiments are described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of this application. The components of the embodiments of the present application, as generally described and shown in the drawings herein, may be arranged and designed in a variety of different configurations.

[0042] Therefore, the following detailed description of the embodiments of the present application, as provided in the accompanying drawings, is not intended to limit the scope of the claimed application but merely represents selected embodiments of the present application. Based on the embodiments in this application, all other embodiments obtained by those of ordinary skill in the art without cr...



Abstract

The invention provides a model training method, a text classification method, and related devices in the technical field of machine learning. When a neural network model undergoes inaccurate-supervision learning using both accurately labeled and inaccurately labeled training samples, a proportion parameter is determined for each training sample, yielding a weight parameter vector that represents the probability that each training sample's label is accurate. The feature vectors of all training samples are then learned using this weight parameter vector to obtain a proportion-weighted output vector, and the model parameters of the neural network model are updated based on that vector. Compared with the prior art, the neural network model thereby learns more of the accurately labeled sample information and less of the inaccurately labeled sample information during inaccurate-supervision learning, which improves its training effect under inaccurate supervision.
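The patent does not disclose an implementation, but the core idea of the abstract can be sketched as a per-sample weighted loss: each training sample's contribution is scaled by an estimated probability that its label is accurate, so suspect labels influence the model parameters less. The function names and the fixed example weights below are illustrative assumptions, not the patent's method.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def weighted_cross_entropy(logits, labels, sample_weights):
    """Cross-entropy in which each sample's loss term is scaled by a
    weight representing how likely its label is to be accurate."""
    probs = softmax(logits)
    n = len(labels)
    per_sample = -np.log(probs[np.arange(n), labels] + 1e-12)
    return float(np.mean(sample_weights * per_sample))

# Two samples: one trusted label (weight 1.0), one suspect label (weight 0.2).
logits = np.array([[2.0, 0.5],
                   [0.1, 1.5]])
labels = np.array([0, 0])        # the second label disagrees with the logits
weights = np.array([1.0, 0.2])   # so it is down-weighted

loss = weighted_cross_entropy(logits, labels, weights)
```

Down-weighting the suspect sample shrinks its gradient contribution, so the model parameters are updated mostly from the accurately labeled data, matching the stated goal of learning less of the inaccurate sample information.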

Description

Technical field

[0001] This application relates to the field of machine learning technology, and specifically to a model training method, a text classification method, and related devices.

Background technique

[0002] Supervised learning refers to training a neural network model with labeled training samples, where the label of each training sample represents its true output.

[0003] In practice, however, labeling training samples consumes considerable manpower and material resources, so labeled training samples are always scarce. More often, training samples are unlabeled, carry only coarse-grained labels, or carry labels that are not necessarily all true values. For this reason, weakly supervised learning is generally used to overcome the scarcity of training samples labeled with ground-truth labels.

[0004] However, due to weakly supervised lear...
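Paragraph [0002] defines ordinary supervised learning: the labels are taken as the true outputs and the model is fit to reproduce them. As a minimal, self-contained illustration of that setting (a toy logistic-regression classifier trained by gradient descent; all of the data and hyperparameters are invented for this sketch), consider:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy labeled data: each label y[i] is the true output of sample X[i].
X = rng.normal(size=(100, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Logistic regression trained with gradient descent on the log-loss.
w = np.zeros(2)
b = 0.0
lr = 0.5
for _ in range(200):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probabilities
    grad_w = X.T @ (p - y) / len(y)         # gradient of mean log-loss
    grad_b = np.mean(p - y)
    w -= lr * grad_w
    b -= lr * grad_b

acc = np.mean((p > 0.5) == y)  # training accuracy on the labeled samples
```

Because every label here is a true value, the model converges to the correct decision boundary; paragraphs [0003] and [0004] describe what breaks when many of those labels are missing, coarse, or wrong.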

Claims


Application Information

IPC(8): G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06N3/044, G06N3/045, G06F18/23213, G06F18/24, G06F18/214
Inventor: 王李鹏
Owner: NEW H3C BIG DATA TECH CO LTD