
Loss function optimization method and device of classification model and sample classification method

A loss function and classification model technology, applied to text database clustering/classification, unstructured text data retrieval, and related fields. It addresses problems such as the decline of a model's generalization ability, with the effects of increasing the learning weight for first-type classification features, reducing over-generalization of the classification model, and improving classification accuracy.

Active Publication Date: 2019-05-07
ZHONGKE DINGFU BEIJING TECH DEV

AI Technical Summary

Problems solved by technology

[0004] The embodiments of the present application provide a loss function optimization method and device for a classification model, and a sample classification method, to solve the prior-art problem that the generalization ability of a classification model declines because the model learns classification features indiscriminately.

Method used




Detailed Description of the Embodiments

[0027] In order to enable those skilled in the art to better understand the technical solutions in the present application, the technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present application, not all of them. Based on the embodiments in the present application, all other embodiments obtained by those of ordinary skill in the art without creative work shall fall within the scope of protection of the present application.

[0028] Before explaining the technical solutions of the embodiments of the present application in detail, the technical scenarios to which these solutions can be applied are first described.

[0029] In the field of natural langu...



Abstract

The embodiments of the invention provide a loss function optimization method and device for a classification model, and a sample classification method. The optimization method comprises: generating a filter vector corresponding to a classification label vector, wherein the classification label vector and the filter vector both comprise a dimension corresponding to a first type of classification and a dimension corresponding to a second type of classification, and the dimension value corresponding to the second type of classification in the filter vector is zero; generating an original loss function according to the classification label vector and the output result of the classification model; filtering the original loss function with the filter vector to remove the components of the second type of classification, obtaining a loss filtering function; and post-processing the loss filtering function according to a preset rule to obtain a loss optimization function. The optimized loss function thereby increases the classification model's learning weight for the text features of the first type of classification, avoids learning the text features of the second type of classification, reduces over-generalization of the classification model, and improves text classification accuracy.
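The filtering step described in the abstract can be sketched in a few lines of numpy. This is a hypothetical illustration, not the patent's actual implementation: the function name `filtered_loss`, the choice of cross-entropy as the "original loss function", and summation as the "preset rule" for post-processing are all assumptions for the sake of a concrete example.

```python
import numpy as np

def filtered_loss(y_true, y_pred, first_class_dims):
    """Illustrative sketch of the loss-filtering idea (hypothetical).

    y_true: one-hot classification label vector, shape (n_classes,)
    y_pred: model output probabilities, shape (n_classes,)
    first_class_dims: indices of the first-type classification dimensions.
    """
    # Filter vector: 1 on first-type dimensions, 0 on second-type ones.
    filter_vec = np.zeros_like(y_true, dtype=float)
    filter_vec[first_class_dims] = 1.0

    # "Original loss function": per-dimension cross-entropy components
    # (one plausible choice; the patent does not fix a specific loss here).
    eps = 1e-12
    original_loss = -y_true * np.log(y_pred + eps)

    # Filtering: zero out the second-type components.
    loss_filtering = original_loss * filter_vec

    # Post-processing by a preset rule; plain summation is assumed here.
    return float(loss_filtering.sum())
```

With the second-type dimensions masked out, gradients flow only through the first-type components, which is the mechanism by which the model's learning weight is shifted toward first-type features.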

Description

Technical Field

[0001] The present application relates to the technical field of natural language processing, in particular to a method and device for optimizing a loss function of a classification model, and a sample classification method.

Background Technique

[0002] In the field of natural language processing, the TextCNN (Text Convolutional Neural Network) model is a mainstream solution for text classification. The TextCNN model convolves text features through a convolutional neural network: during the convolution process, multiple convolution kernels extract important text features from the text, and classification is performed based on the extracted features.

[0003] In the prior art, the TextCNN model extracts features from the input text indiscriminately; that is to say, every time a text enters the TextCNN model, the model will learn the text once and ...
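The TextCNN feature extraction described in paragraph [0002] can be sketched with plain numpy. This is a minimal illustrative sketch, not the patented method or any particular library's API: the function `textcnn_features`, the window sizes, filter counts, and the random weights are all assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def textcnn_features(token_ids, embed, kernels):
    """Minimal numpy sketch of TextCNN feature extraction (illustrative).

    token_ids: list of vocabulary indices for one text
    embed: embedding matrix, shape (vocab_size, d)
    kernels: dict mapping window size w -> weight array of shape (w*d, n_filters)
    """
    x = embed[token_ids]                      # (seq_len, d) embedded text
    seq_len, d = x.shape
    feats = []
    for w, W in kernels.items():
        # Slide a window of w tokens; flatten each window and project it,
        # which is equivalent to a 1-D convolution over the token axis.
        windows = np.stack([x[i:i + w].ravel() for i in range(seq_len - w + 1)])
        conv = np.maximum(windows @ W, 0.0)   # ReLU activation
        feats.append(conv.max(axis=0))        # max-over-time pooling
    return np.concatenate(feats)              # concatenated feature vector

vocab_size, d = 50, 8
embed = rng.normal(size=(vocab_size, d))
kernels = {2: rng.normal(size=(2 * d, 4)), 3: rng.normal(size=(3 * d, 4))}
features = textcnn_features([3, 17, 5, 9, 24], embed, kernels)
```

Multiple kernel sizes (here 2 and 3) capture n-gram features of different widths, and max-over-time pooling keeps the strongest response of each filter regardless of where it occurs in the text.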

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F16/35
Inventors: 秦海宁, 李文, 李士勇, 张瑞飞, 李广刚
Owner ZHONGKE DINGFU BEIJING TECH DEV