Loss function optimization method, device and sample classification method for classification model

A loss function and text classification technology, applied to loss function optimization for classification models and to the field of sample classification. It can solve problems such as the decline of generalization ability, and achieves the effects of increasing the learning weight for relevant text features, reducing the generalization of the classification model, and improving classification accuracy.

Active Publication Date: 2021-04-27
ZHONGKE DINGFU BEIJING TECH DEV

AI Technical Summary

Problems solved by technology

[0004] The embodiments of the present application provide a loss function optimization method and device for a classification model, and a sample classification method, to solve the prior-art problem with the generalization ability of the classification model caused by indiscriminate learning of classification features.




Embodiment Construction

[0027] In order to enable those skilled in the art to better understand the technical solutions in the present application, the technical solutions in the embodiments of the present application are described clearly and completely below in conjunction with the drawings of those embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. Based on the embodiments in this application, all other embodiments obtained by persons of ordinary skill in the art without creative effort shall fall within the protection scope of this application.

[0028] Before explaining the technical solutions of the embodiments of the present application in detail, the technical scenarios to which these solutions can be applied are first described.

[0029] In the field of natural language processing technology, clas...



Abstract

Embodiments of the present application provide a method and device for optimizing a loss function of a classification model, and a sample classification method. The optimization method includes: generating a filter vector corresponding to the classification label vector, where both the classification label vector and the filter vector contain a dimension corresponding to the first classification and a dimension corresponding to the second classification, and the value of the second-classification dimension in the filter vector is zero; generating the original loss function from the classification label vector and the output of the classification model; filtering the original loss function with the filter vector to remove the second-classification components, obtaining a loss filtering function; and post-processing the loss filtering function according to preset rules to obtain the loss optimization function. The optimized loss function therefore increases the learning weight of the classification model for the text features of the first classification and prevents it from learning the text features of the second classification, reducing the generalization of the classification model and improving the accuracy of text classification.
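The abstract does not state the concrete form of the original loss or of the post-processing rule, so the following is only a minimal PyTorch sketch of the idea: it assumes a per-dimension binary cross-entropy as the original loss, a binary mask as the filter vector, and a simple re-weighting as the post-processing step. The function names (build_filter_vector, optimized_loss) and the weight value are illustrative and not taken from the patent.

```python
import torch
import torch.nn.functional as F


def build_filter_vector(label_vector: torch.Tensor, second_class_dims) -> torch.Tensor:
    """Copy the shape of the classification label vector and zero out the
    dimensions corresponding to the second classification."""
    filter_vector = torch.ones_like(label_vector)
    filter_vector[..., second_class_dims] = 0.0
    return filter_vector


def optimized_loss(logits: torch.Tensor,
                   label_vector: torch.Tensor,
                   filter_vector: torch.Tensor,
                   first_class_weight: float = 2.0) -> torch.Tensor:
    """Original loss -> filtered loss -> post-processed (re-weighted) loss."""
    # Original loss: per-dimension binary cross-entropy between the model
    # output and the classification label vector (an assumed choice).
    original = F.binary_cross_entropy_with_logits(logits, label_vector,
                                                  reduction="none")
    # Loss filtering: multiply by the filter vector so the second-classification
    # components contribute nothing to the loss or its gradient.
    filtered = original * filter_vector
    # Post-processing (assumed rule): scale the remaining first-classification
    # components to increase their learning weight, then reduce to a scalar.
    return (filtered * first_class_weight).mean()


# Usage: a batch of 4 samples with 5 label dimensions, where dimensions 3 and 4
# are treated as the second classification.
logits = torch.randn(4, 5, requires_grad=True)
labels = torch.randint(0, 2, (4, 5)).float()
mask = build_filter_vector(labels, second_class_dims=[3, 4])
loss = optimized_loss(logits, labels, mask)
loss.backward()
```

Because the masked dimensions contribute zero loss, no gradient flows through them, which is the mechanism by which the model stops learning the second-classification features while the remaining components receive a larger learning weight.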

Description

Technical Field

[0001] The present application relates to the technical field of natural language processing, and in particular to a method and device for optimizing a loss function of a classification model, and to a sample classification method.

Background Technique

[0002] In the field of natural language processing, the TextCNN (Text Convolutional Neural Network) model is a mainstream solution for text classification. The principle of the TextCNN model is to convolve text features with a convolutional neural network: during convolution, multiple convolution kernels extract important text features from the input text, and text classification is performed based on the extracted features.

[0003] In the prior art, the TextCNN model extracts features from the input text indiscriminately; that is to say, every time a text enters the TextCNN model, the model learns that text once and ...
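For concreteness, the following is a minimal, generic TextCNN-style classifier sketch in PyTorch matching the background description (embedding, multiple convolution kernels, classification from the extracted features). The hyperparameters (vocabulary size, embedding dimension, kernel widths, filter count, class count) are illustrative defaults and are not specified by the patent.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TextCNN(nn.Module):
    """Minimal TextCNN-style classifier: embed tokens, apply convolution
    kernels of several widths, max-pool each feature map over the sequence,
    and classify from the concatenated features."""

    def __init__(self, vocab_size=10000, embed_dim=128,
                 kernel_sizes=(2, 3, 4), num_filters=64, num_classes=5):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, num_filters, k) for k in kernel_sizes
        )
        self.fc = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, token_ids):                 # (batch, seq_len)
        x = self.embedding(token_ids)             # (batch, seq_len, embed_dim)
        x = x.transpose(1, 2)                     # (batch, embed_dim, seq_len)
        # Each kernel width extracts n-gram-like features; keep the strongest
        # response of every filter (max over the sequence dimension).
        feats = [F.relu(conv(x)).max(dim=2).values for conv in self.convs]
        return self.fc(torch.cat(feats, dim=1))   # (batch, num_classes) logits


# Usage: classify a batch of two 20-token sequences of integer token ids.
model = TextCNN()
logits = model(torch.randint(0, 10000, (2, 20)))
```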


Application Information

Patent Type & Authority Patents(China)
IPC IPC(8): G06F16/35
Inventor 秦海宁李文李士勇张瑞飞李广刚
Owner ZHONGKE DINGFU BEIJING TECH DEV