
Fine tuning method and device for a language model, computing equipment and storage medium

A technology relating to language models and model parameters, applied in computing, computing models, natural language data processing and similar fields. It addresses problems such as few-sample settings being unusable, low parameter efficiency, and an increased number of network parameters, with the effect of reducing computing costs and the computing overhead of the computing device.

Pending Publication Date: 2021-10-01
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

However, this kind of fine-tuning method has the following defects: (1) low parameter efficiency, since each downstream task keeps its own set of fine-tuned parameters; (2) the pre-training objective differs from the fine-tuning objective, which weakens the generalization ability of the pre-trained model; (3) compared with the pre-training stage, new network parameters are added, and a large amount of data is required to learn them.
Because the traditional fine-tuning method adds new parameters and its training objective differs from the pre-training objective, tasks with only a few samples are poorly served: fine-tuning a pre-trained model in the traditional way easily overfits when training samples are too few. The model may perform well on the training set, yet on the test set there remains a large gap compared with the full-sample case; when this gap is large, the few-sample setting cannot be used in practice.

Method used




Embodiment Construction

[0039] In order to make the object, technical solution and advantages of the present invention clearer, the present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, and do not limit the protection scope of the present invention.

[0040] Figure 1 is a flowchart of a method for fine-tuning a language model provided by an embodiment. As shown in Figure 1, the fine-tuning method of the language model provided by the embodiment includes the following steps:

[0041] S101. Acquire a pre-trained language model and phrases, where the phrases include discrete template prompt words and discrete tag words.

[0042] S102. Design input data for fine-tuning the language model, where the fine-tuning input data includes text sentences, template prompt words and masking tokens.

[0043] S103. Perform supervised learning of the masking token prediction task on the language model according to the input data and the tag words, so as to optimize the model parameters of the language model.
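
The steps above describe a prompt-based fine-tuning setup. As a purely illustrative sketch of S101 and S102 (the patent does not name a framework; the Hugging Face transformers library, the model name, the template and the label words below are all assumptions), the text sentence is wrapped with discrete template prompt words and a masking token, and each class is mapped to a discrete tag word:

```python
# A minimal sketch of steps S101-S102 using the Hugging Face `transformers`
# library (an assumption; the patent does not specify a framework or model).
from transformers import AutoTokenizer, AutoModelForMaskedLM

MODEL_NAME = "bert-base-uncased"  # hypothetical choice of pre-trained language model

# S101: acquire the pre-trained language model
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForMaskedLM.from_pretrained(MODEL_NAME)

# S101: discrete template prompt words and discrete tag words (illustrative sentiment task)
TEMPLATE = "It was {mask}."
LABEL_WORDS = {"positive": "great", "negative": "terrible"}

def build_input(sentence: str):
    """S102: fine-tuning input = text sentence + template prompt words + masking token."""
    prompt = f"{sentence} " + TEMPLATE.format(mask=tokenizer.mask_token)
    return tokenizer(prompt, return_tensors="pt", truncation=True)

encoded = build_input("The movie was thrilling from start to finish.")
```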



Abstract

The invention discloses a fine-tuning method and device for a language model, computing equipment and a storage medium. The method comprises the steps of: obtaining a pre-trained language model and phrases, wherein the phrases comprise discrete template prompt words and discrete tag words; designing input data for fine-tuning the language model, wherein the fine-tuning input data comprises a text sentence, template prompt words and a masking token; and performing supervised learning of a masking token prediction task on the language model according to the input data and the tag words to optimize the model parameters of the language model. The gap between the pre-trained language model and the fine-tuned language model is bridged, so the fine-tuned language model performs better on the downstream masking token prediction task. Under the full-sample condition, the method outperforms the traditional fine-tuning method; under the few-sample condition, the improvement is even more pronounced. The computing cost of a large number of parameters can be reduced, and the computing overhead of the computing equipment is lowered.
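
As a hedged illustration of the supervised masking-token prediction described in the abstract, the sketch below continues the hypothetical setup from the earlier example (reusing `model`, `tokenizer`, `build_input` and `LABEL_WORDS`; hyperparameters are illustrative). The loss is computed between the discrete tag word and the model's prediction at the masking token position, and all model parameters are updated, which keeps the fine-tuning objective aligned with the pre-training objective rather than adding new task-specific parameters:

```python
# A minimal sketch of the supervised masking-token prediction fine-tuning step.
# Reuses `model`, `tokenizer`, `build_input` and `LABEL_WORDS` from the
# hypothetical sketch above; the learning rate is illustrative only.
import torch

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

def training_step(sentence: str, label: str) -> torch.Tensor:
    enc = build_input(sentence)
    # position of the masking token in the input sequence
    mask_pos = (enc["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
    # the discrete tag word acts as the supervision target at the mask position
    target_id = tokenizer.convert_tokens_to_ids(LABEL_WORDS[label])

    logits = model(**enc).logits              # [1, seq_len, vocab_size]
    mask_logits = logits[0, mask_pos]         # logits at the masking token
    loss = torch.nn.functional.cross_entropy(
        mask_logits, torch.tensor([target_id])
    )

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()                          # optimize all model parameters
    return loss.detach()

loss = training_step("The movie was thrilling from start to finish.", "positive")
```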

Description

technical field
[0001] The invention belongs to the technical field of natural language processing, and in particular relates to a fine-tuning method, device, computing device and storage medium for a language model.
Background technique
[0002] A pre-trained language model is a model obtained by pre-training on a large corpus data set. Because the pre-trained language model has undergone unsupervised learning on a large amount of corpus, the knowledge in the corpus has been transferred into the embedding layer of the pre-trained language model. Fine-tuning is the main method of transferring pre-trained model knowledge to downstream tasks, such as the meta-knowledge fine-tuning method and platform for multi-task language models disclosed in the patent application with publication number CN112100383A, and the language model fine-tuning method for text classification disclosed in the patent application with publication number CN113032559A ...

Claims


Application Information

IPC(8): G06F40/284; G06N20/00; G06F16/35
CPC: G06F40/284; G06N20/00; G06F16/35
Inventor: 张宁豫, 陈想, 陈华钧, 邓淑敏, 毕祯, 叶宏彬
Owner: ZHEJIANG UNIV