
Knowledge distillation method, device and system for pre-trained language model BERT

A language-model distillation technology, applied in the field of knowledge distillation for the pre-trained language model BERT, which can solve the problems that BERT has many parameters, a complex structure, and is difficult to deploy in engineering, and achieves the effects of a small number of parameters, a simple structure, and reliance on unlabeled text that is easy to obtain in large quantities.

Pending Publication Date: 2021-02-09
BEIJING UNISOUND INFORMATION TECH +1

AI Technical Summary

Problems solved by technology

[0009] One or more embodiments of this specification describe a knowledge distillation method, device and system for the pre-trained language model BERT, which can solve the problems in the current technology that the pre-trained language model BERT has many parameters, a complex structure, and is difficult to deploy in engineering.




Embodiment Construction

[0044] The application will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the relevant invention, and are not intended to limit the invention. It should also be noted that, for the convenience of description, only the parts related to the relevant invention are shown in the drawings.

[0045] It should be noted that in other embodiments, the steps of the corresponding methods may not necessarily be performed in the order shown and described in this specification. In some other embodiments, the method may include more or fewer steps than those described in this specification. In addition, a single step described in this specification may be decomposed into multiple steps in other embodiments, and multiple steps described in this specification may also be combined into a single step in other embodiments.

[0046] In the case...



Abstract

The invention provides a knowledge distillation method, device and system for the pre-trained language model BERT. The knowledge distillation method comprises a distillation training stage and a fine-tuning test stage. The distillation training stage comprises the following steps: obtaining unlabeled text, encoding it with the BERT model to obtain a BERT sentence vector, encoding it with a BiLSTM model to obtain a BiLSTM sentence vector, and training an optimal BiLSTM model based on the BERT sentence vector and the BiLSTM sentence vector. The fine-tuning test stage comprises the following steps: inputting a labeled data set into the optimal BiLSTM model for fine-tuning training, then inputting the test set into the fine-tuned BiLSTM model and calculating the output result. By adopting a distill-first, fine-tune-later approach, the method solves the problems that the pre-trained language model BERT has a large number of parameters, a complex structure, and is difficult to deploy in engineering.
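To make the two stages summarized above concrete, the following is a minimal sketch, not the patented implementation: it assumes a HuggingFace transformers BertModel as the frozen teacher, mean pooling over hidden states to form both the BERT and BiLSTM sentence vectors, a mean-squared-error loss to align the two vectors during distillation, and a linear classifier added to the BiLSTM for the fine-tuning stage. The model names, dimensions and hyperparameters are illustrative assumptions, since the patent text visible here does not disclose them.

    # Sketch of the two-stage procedure described in the abstract (assumptions noted above).
    import torch
    import torch.nn as nn
    from transformers import BertModel, BertTokenizer

    class BiLSTMStudent(nn.Module):
        def __init__(self, vocab_size, embed_dim=128, hidden_dim=384, num_classes=2):
            super().__init__()
            self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
            self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                                  bidirectional=True)
            # Project the BiLSTM sentence vector to the teacher's hidden size (768).
            self.proj = nn.Linear(2 * hidden_dim, 768)
            self.classifier = nn.Linear(768, num_classes)  # used in the fine-tuning stage

        def sentence_vector(self, input_ids, attention_mask):
            states, _ = self.bilstm(self.embedding(input_ids))
            mask = attention_mask.unsqueeze(-1).float()
            pooled = (states * mask).sum(1) / mask.sum(1).clamp(min=1e-9)  # mean pooling
            return self.proj(pooled)

        def forward(self, input_ids, attention_mask):
            return self.classifier(self.sentence_vector(input_ids, attention_mask))

    tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")  # assumed checkpoint
    teacher = BertModel.from_pretrained("bert-base-chinese").eval()
    student = BiLSTMStudent(vocab_size=tokenizer.vocab_size)
    mse = nn.MSELoss()
    optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

    def distillation_step(unlabeled_texts):
        """Stage 1: align the student's sentence vectors with the teacher's."""
        batch = tokenizer(unlabeled_texts, padding=True, truncation=True,
                          return_tensors="pt")
        with torch.no_grad():  # the BERT teacher stays frozen
            hidden = teacher(**batch).last_hidden_state
            mask = batch["attention_mask"].unsqueeze(-1).float()
            teacher_vec = (hidden * mask).sum(1) / mask.sum(1)  # BERT sentence vector
        student_vec = student.sentence_vector(batch["input_ids"],
                                              batch["attention_mask"])
        loss = mse(student_vec, teacher_vec)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()

    def finetune_step(texts, labels):
        """Stage 2: fine-tune the distilled BiLSTM on a labeled data set."""
        batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
        logits = student(batch["input_ids"], batch["attention_mask"])
        loss = nn.functional.cross_entropy(logits, torch.tensor(labels))
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()

Because the teacher is never updated in this sketch, its sentence vectors over the unlabeled corpus could also be precomputed and cached, which is a common way to keep the distillation stage inexpensive.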

Description

technical field

[0001] One or more embodiments of the present invention relate to the technical field of data processing, and in particular to a knowledge distillation method, device and system for the pre-trained language model BERT.

Background technique

[0002] This section is intended to provide a background or context for the implementations of the invention recited in the claims. The descriptions herein may include concepts that could be pursued, but not necessarily concepts that have been previously conceived or pursued. Therefore, unless otherwise indicated herein, what is described in this section is not prior art to the description and claims in this application and is not admitted to be prior art by inclusion in this section.

[0003] With the development of artificial intelligence recognition, models are widely used for data processing, image recognition, etc., while the BERT model is a pre-trained language model that uses large-scale unlabeled corpus trai...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F40/211; G06N3/04; G06N3/08
CPC: G06F40/211; G06N3/049; G06N3/08; G06N3/084; G06N3/045
Inventor: 姜珊
Owner: BEIJING UNISOUND INFORMATION TECH