
Natural language processing model training method, task execution method, equipment and system

A natural language processing and model-training technology, applied in the field of text processing. It addresses problems such as the lack of effective data-enhancement methods and the limited natural language processing ability of distilled models, achieving the effects of maintaining semantic diversity, improving natural language processing ability, and good generalization.

Active Publication Date: 2020-04-28
HUAZHONG UNIV OF SCI & TECH
Cites 11 · Cited by 16

AI Technical Summary

Problems solved by technology

In general, due to the lack of effective data-enhancement methods, the natural language processing ability of student models trained by knowledge distillation still needs improvement.




Embodiment Construction

[0040] In order to make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit it. In addition, the technical features involved in the various embodiments of the present invention described below can be combined with each other provided they do not conflict.

[0041] In the present invention and its drawings, terms such as "first" and "second" (if any) are used to distinguish similar objects and do not necessarily describe a specific order or sequence.

[0042] In order to effectively enhance the data set of the natural language processing task in the knowledge distillation scenario, improve the processing capabilit...
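The visible text does not specify which enhancement operation the patent applies to text sentences, so the following is a generic stand-in: a random adjacent-word swap that perturbs surface form while keeping the word content (one common family of label-preserving text augmentations). The function name and parameters are illustrative assumptions, not from the patent.

```python
import random

def augment_sentence(sentence, rng, p_swap=0.3):
    """Randomly swap adjacent words to create a perturbed variant of a sentence.

    Illustrative placeholder for the patent's "enhancing text sentences" step;
    the actual enhancement method is not given in the visible text.
    """
    words = sentence.split()
    for i in range(len(words) - 1):
        if rng.random() < p_swap:
            words[i], words[i + 1] = words[i + 1], words[i]
    return " ".join(words)

rng = random.Random(42)
src = "the model improves natural language processing"
# Generate several enhanced variants of one original sentence.
variants = {augment_sentence(src, rng) for _ in range(10)}
```

Each variant keeps exactly the words of the original sentence, so a trained teacher model can plausibly assign it a label, which is what step two of the abstract relies on.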



Abstract

The invention discloses a natural language processing model training method, a natural language processing method, natural language processing equipment, and a natural language processing system, which belong to the field of natural language processing. The method comprises the following steps: training a teacher model using a labeled original data set; enhancing the text sentences in the original data set to obtain enhanced text sentences, and labeling the enhanced text sentences with the trained teacher model to obtain a labeled enhanced data set; and taking the original data set and the enhanced data set together as the training data set for the student model, the trained student model serving as the natural language processing model. The teacher model and the student model are both deep learning models and execute the same natural language processing task, the teacher model being more complex and larger in scale. According to the invention, the data set of a natural language processing task can be effectively enhanced in a knowledge distillation scenario, and the processing capability of the natural language processing model is improved, thereby improving the execution effect of the natural language processing task.
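The abstract's three steps can be sketched as follows. This is a minimal toy pipeline, not the patent's implementation: the `Teacher`/`Student` classes, the parity "task", and `augment()` are all illustrative stand-ins (the teacher simply knows the labeling rule, the student memorizes pairs) chosen so the data flow of the method is visible.

```python
import random

rng = random.Random(0)

# Toy labeled "sentences": lists of word ids; the label is the parity of their sum.
sentences = [[rng.randint(0, 9) for _ in range(5)] for _ in range(20)]
original = [(s, sum(s) % 2) for s in sentences]

class Teacher:
    """Large-model stand-in: after 'training' it reproduces the labeling rule."""
    def fit(self, data):
        self.ready = True
    def predict(self, sent):
        return sum(sent) % 2

class Student:
    """Small-model stand-in: memorizes its training pairs."""
    def __init__(self):
        self.table = {}
    def fit(self, data):
        for sent, label in data:
            self.table[tuple(sent)] = label
    def predict(self, sent):
        return self.table.get(tuple(sent), 0)

def augment(sent):
    """Perturb one token to produce an 'enhanced' sentence (placeholder)."""
    out = list(sent)
    out[rng.randrange(len(out))] = rng.randint(0, 9)
    return out

# Step 1: train the teacher model on the labeled original data set.
teacher = Teacher()
teacher.fit(original)

# Step 2: enhance the sentences, then label them with the trained teacher.
enhanced = [(a, teacher.predict(a)) for a in (augment(s) for s, _ in original)]

# Step 3: train the student on original + enhanced; the trained student is the NLP model.
student = Student()
student.fit(original + enhanced)
```

The point of the structure is that step two doubles the labeled training data without human annotation, because the teacher supplies the labels for the enhanced sentences.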

Description

Technical field

[0001] The invention belongs to the field of text processing and, more specifically, relates to a natural language processing method and system in a knowledge distillation scenario.

Background technique

[0002] Deep learning has been widely used in the field of natural language processing in recent years. The essence of deep learning is to learn more useful features by building machine learning models with many hidden layers and massive training data, so as to ultimately improve classification or prediction accuracy. When using a large-scale data set to train a deep learning model, one approach to handling complex data distributions is to build a complex neural network model, such as a residual network with hundreds of layers; such a complex network often contains millions of parameters. Another method is to mix multiple models: train several large-scale neural networks on the same data set and then integrate them to obtain the final classification...
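For background, knowledge distillation is usually described as training the small student to match the large teacher's temperature-softened output distribution. The patent text shown here does not give this formula, so the sketch below is the standard soft-target idea (after Hinton et al.), implemented with a toy temperature softmax:

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T yields softer class probabilities."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distill_loss(student_logits, teacher_logits, T=2.0):
    """Cross-entropy of the student's softened outputs against the teacher's."""
    p = softmax(teacher_logits, T)   # teacher's soft targets
    q = softmax(student_logits, T)   # student's soft predictions
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))

teacher_logits = [4.0, 1.0, -1.0]
# A student that agrees with the teacher incurs a lower loss than one that doesn't.
loss_matched = distill_loss([4.0, 1.0, -1.0], teacher_logits)
loss_off = distill_loss([-1.0, 1.0, 4.0], teacher_logits)
```

The soft targets carry more information per example than hard labels (relative class similarities), which is one reason a small student can approach a large teacher's accuracy.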

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F40/205 · G06F40/30 · G06F40/166 · G06N20/00
CPC: G06N20/00
Inventors: 王芳, 冯丹, 焦小奇
Owner: HUAZHONG UNIV OF SCI & TECH