
Classification model training method, classification method, device and equipment

A classification model training technique applied in character and pattern recognition, instruments, computer parts, and related fields. It addresses the problems that a trained multi-classification model may classify inaccurately and that the information to be classified is therefore classified with low accuracy, and it achieves accurate classification with an improved degree of accuracy.

Pending Publication Date: 2020-02-11
TENCENT CLOUD COMPUTING BEIJING CO LTD
Cites: 0 · Cited by: 14

AI Technical Summary

Problems solved by technology

[0003] When multi-label classification is performed on information to be classified, features are usually extracted from the information first; then, based on the extracted features and a trained multi-classification model, the probability value that the information belongs to each category is determined; finally, each probability value is compared with a corresponding preset probability value to determine the categories of the information to be classified. However, in this multi-label classification process, if the trained multi-classification model was trained with an incomplete set of multi-label categories or with wrongly labeled samples, the trained multi-classification model classifies inaccurately, so the accuracy of classifying the information to be classified is low.
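The comparison step described in [0003] can be illustrated with a minimal sketch. Everything in it, including the category names, the hypothetical assign_labels helper, and the 0.5 preset values, is an assumption for illustration of comparing per-category probabilities against preset probability values; it is not code from the patent.

```python
# Minimal sketch of the threshold-based multi-label step described above.
# The model output, category names, and per-category preset values are
# illustrative assumptions, not taken from the patent text.
from typing import Dict, List


def assign_labels(probabilities: Dict[str, float],
                  presets: Dict[str, float]) -> List[str]:
    """Return every category whose predicted probability meets its preset value."""
    return [category for category, p in probabilities.items()
            if p >= presets.get(category, 0.5)]


if __name__ == "__main__":
    # Hypothetical output of a trained multi-classification model for one item.
    probs = {"sports": 0.82, "finance": 0.10, "politics": 0.55, "technology": 0.31}
    # Hypothetical preset probability values, one per category.
    presets = {"sports": 0.5, "finance": 0.5, "politics": 0.5, "technology": 0.5}
    print(assign_labels(probs, presets))  # -> ['sports', 'politics']
```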




Embodiment Construction

[0052] In order to make the purpose, technical solution, and advantages of the present invention clearer, the present invention is described in further detail below in conjunction with the accompanying drawings. The described embodiments should not be regarded as limiting the present invention, and all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present invention.

[0053] In the following description, references to "some embodiments" and "embodiments of the invention" describe subsets of all possible embodiments. It should be understood that "some embodiments" and "embodiments of the invention" may refer to the same subset or to different subsets of all possible embodiments, and that these subsets may be combined with each other where there is no conflict.

[0054] In the following description, the terms "first", "second", and "third" are used only to distinguish similar objects and do not denote a specific ordering of the objects.



Abstract

The embodiment of the invention provides a classification model training method, a classification method, a device, and equipment. The method comprises the following steps: continuously training an original sub-classification model with sub-classification samples until the difference between the predicted probability value of each sub-category and the sub-classification result in the sub-classification sample satisfies a first training cut-off condition, to obtain a sub-classification model; continuously training an original total-classification model with total-classification samples until the difference between the predicted probability value of each total category and the total-classification result in the total-classification sample satisfies a second training cut-off condition, to obtain a total-classification model; and continuously training an original multi-classification model with multi-classification samples until the difference between the predicted probability value of each sub-category and the sub-classification result in the multi-classification sample, together with the difference between the predicted probability value of each total category and the total-classification result in the multi-classification sample, satisfies a third training cut-off condition, to obtain the multi-classification model. Through the embodiment of the invention, the classification accuracy of the information to be classified can be improved.
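As a rough illustration of the staged training flow in the abstract, the sketch below trains a sub-classification model and a total-classification model each until a loss-based cut-off is met, then trains a multi-classification model against both label sets under a third cut-off. The linear models, BCE losses, cut-off values, and toy data are assumptions made for illustration; the abstract does not specify architectures, loss functions, or how the first two models relate to the third, so the sketch keeps the three stages independent.

```python
# Illustrative sketch of the three training stages named in the abstract.
# All architectures, losses, and cut-off values are assumptions, not the
# patent's implementation.
import torch
from torch import nn

FEATURES, NUM_SUB, NUM_TOTAL = 128, 10, 3


def train_until(model, x, y, cutoff, max_steps=1000):
    """Train with a BCE loss until the loss (the 'difference') drops below cutoff."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.BCEWithLogitsLoss()
    for _ in range(max_steps):
        loss = loss_fn(model(x), y)
        if loss.item() < cutoff:          # first / second training cut-off condition
            break
        opt.zero_grad(); loss.backward(); opt.step()
    return model


# Toy data: multi-hot sub-category labels and total-category labels.
x = torch.randn(256, FEATURES)
y_sub = (torch.rand(256, NUM_SUB) > 0.7).float()
y_total = (torch.rand(256, NUM_TOTAL) > 0.5).float()

sub_model = train_until(nn.Linear(FEATURES, NUM_SUB), x, y_sub, cutoff=0.3)
total_model = train_until(nn.Linear(FEATURES, NUM_TOTAL), x, y_total, cutoff=0.3)

# The multi-classification model predicts both label sets; its combined loss
# must satisfy a third cut-off covering sub- and total-category error.
multi_model = nn.Linear(FEATURES, NUM_SUB + NUM_TOTAL)
opt = torch.optim.Adam(multi_model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()
for _ in range(1000):
    logits = multi_model(x)
    loss = loss_fn(logits[:, :NUM_SUB], y_sub) + loss_fn(logits[:, NUM_SUB:], y_total)
    if loss.item() < 0.6:                 # third training cut-off condition
        break
    opt.zero_grad(); loss.backward(); opt.step()
```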

Description

Technical Field

[0001] The invention relates to classification technology in the field of artificial intelligence, and in particular to a classification model training method, a classification method, a device, and equipment.

Background Technique

[0002] Information classification includes single-label classification and multi-label classification. Single-label classification refers to classification in which the information to be classified corresponds to only one category label, while multi-label classification refers to classification in which the information to be classified corresponds to multiple different category labels at the same time. Since information to be classified generally carries multiple semantics, multi-label classification can uncover more of the semantic information it contains, with fine classification granularity and a good classification effect.

[0003] When multi-label...
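To make the single-label versus multi-label distinction in [0002] concrete, here is a small illustrative sketch. The category list and sample texts are made up, and the multi-hot encoding is just one common way to represent a multi-label target, not something specified by the patent.

```python
# Illustration of the distinction drawn in [0002]: a single-label example maps
# to exactly one category, a multi-label example to several at the same time.
categories = ["sports", "finance", "politics", "technology"]

single_label = {"text": "Stock markets rallied today.", "labels": ["finance"]}
multi_label = {"text": "Government announces funding for AI chip research.",
               "labels": ["politics", "technology"]}


def to_multi_hot(labels):
    """Encode a label set as a multi-hot vector over the category list."""
    return [1 if c in labels else 0 for c in categories]


print(to_multi_hot(single_label["labels"]))  # [0, 1, 0, 0]
print(to_multi_hot(multi_label["labels"]))   # [0, 0, 1, 1]
```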


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/62
CPC: G06F18/2415, G06F18/214
Inventor: 施诚, 彭湃, 刘洵, 余宗桥, 郭晓威
Owner: TENCENT CLOUD COMPUTING BEIJING CO LTD