Language model training method and system in self-reconstruction mode and computer readable medium

A language model training method and system, applied in natural language data processing, computing, neural learning methods, etc. It addresses the low prediction accuracy and high cost of existing language models, with the effects of reducing the number of model parameters, reducing model size, and speeding up computation.

Publication date: 2020-02-25 (pending)
Applicant: 创新工场(广州)人工智能研究有限公司


Problems solved by technology

[0007] Aiming at the defects of low prediction accuracy and high cost of existing language models, the present invention provides a language model training method, system, and computer-readable medium in a self-reconstruction mode.




Example Embodiment

[0042] In order to make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are only intended to explain the present invention, not to limit it.

[0043] Referring to Figure 1, the first embodiment of the present invention provides a language model training method in a self-reconstruction mode, which includes the following steps:

[0044] Step S1: extract at least one sentence to be trained from the pre-training text, segment it into a single-word sequence, and map the corresponding word sequence into a text matrix through position coding;
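
As a rough illustration of step S1, the Python sketch below builds a text matrix with one row per word. The whitespace tokenizer, toy vocabulary, random embedding table, and standard sinusoidal position coding are all assumptions; the excerpt names the step but not its internals.

```python
# Hypothetical sketch of step S1: tokenizer, vocabulary, and dimensions
# are illustrative assumptions, not details from the patent.
import numpy as np

def sinusoidal_position_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Standard Transformer sinusoidal position coding."""
    pos = np.arange(seq_len)[:, None]    # (seq_len, 1)
    i = np.arange(d_model)[None, :]      # (1, d_model)
    angle = pos / np.power(10000, (2 * (i // 2)) / d_model)
    return np.where(i % 2 == 0, np.sin(angle), np.cos(angle))

def sentence_to_text_matrix(sentence: str, vocab: dict, d_model: int = 64) -> np.ndarray:
    """Segment a sentence into a single-word sequence and map it,
    with position coding, into a text matrix (one row per word)."""
    tokens = sentence.split()            # naive word segmentation (assumption)
    embed = np.random.default_rng(0).normal(size=(len(vocab), d_model))  # toy embedding table
    ids = [vocab[t] for t in tokens]
    return embed[ids] + sinusoidal_position_encoding(len(ids), d_model)

vocab = {"the": 0, "model": 1, "learns": 2}
matrix = sentence_to_text_matrix("the model learns", vocab)
print(matrix.shape)  # (3, 64)
```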

[0045] Step S2: establish a neural network structure combining the Transformer model and the self-attention mechanism;
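
The excerpt does not fix the network's depth, width, or head count, so the PyTorch sketch below is only a minimal illustration of step S2: an embedding layer feeding a stack of Transformer encoder layers (each containing multi-head self-attention), with a linear head that scores a word at every position. All hyper-parameters are assumptions.

```python
# Minimal sketch of a Transformer + self-attention network (assumed sizes).
import torch
import torch.nn as nn

class SelfReconstructionLM(nn.Module):
    def __init__(self, vocab_size: int, d_model: int = 64,
                 nhead: int = 4, num_layers: int = 2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, vocab_size)   # per-position word scores

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        h = self.encoder(self.embed(token_ids))      # self-attention over the sentence
        return self.head(h)                          # (batch, seq_len, vocab_size)
```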

[0046] Step S3: use the text matrix as the input sample of the neural network structure, and train and optimize the Transformer model parameters to obtain a target function;

[0047] Step S4: repeat steps S1 to S3 to update the target function until a set optimization condition is reached, so as to obtain the pre-training model.
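
Steps S3 and S4 amount to an ordinary training loop. The sketch below is illustrative only: it assumes a token-level cross-entropy reconstruction objective as the target function (the excerpt does not spell the target function out) and reuses the SelfReconstructionLM sketch from step S2.

```python
# Hypothetical training loop for steps S3-S4; the loss, learning rate, and
# stopping condition are assumptions, not values from the patent.
import torch
import torch.nn as nn

model = SelfReconstructionLM(vocab_size=3)          # network from the step S2 sketch
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

token_ids = torch.tensor([[0, 1, 2]])              # one toy training sentence
for step in range(100):                            # S4: repeat S1-S3
    logits = model(token_ids)                      # S3: text matrix as input sample
    loss = loss_fn(logits.reshape(-1, logits.size(-1)), token_ids.reshape(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    if loss.item() < 0.01:                         # assumed "set optimization condition"
        break
```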



Abstract

The invention relates to the technical field of language processing, in particular to a language model training method in a self-reconstruction mode, which comprises the following steps: S1, extracting at least one sentence to be trained from a pre-training text, segmenting it into a single-word sequence, and mapping the corresponding word sequence into a text matrix through position coding; S2, establishing a neural network structure combining a Transformer model and a self-attention mechanism; S3, taking the text matrix as an input sample of the neural network structure, and training and optimizing the Transformer model parameters to obtain a target function; and S4, repeating steps S1 to S3 to update the target function until a set optimization condition is reached, so as to obtain a pre-training model. The invention also provides a corresponding system and a computer-readable medium.

Description

[Technical Field]

[0001] The present invention relates to the technical field of language processing, and in particular to a language model training method, system, and computer-readable medium in a self-reconstruction mode.

[Background Art]

[0002] At present, the most advanced pre-trained language models fall into two categories: autoregressive language models (Autoregressive Models) and autoencoding language models (Autoencoding Models). GPT and GPT2 are autoregressive language models with strong performance; the training goal of an autoregressive model is to correctly guess the next word from the preceding text. BERT is a representative autoencoding language model; its training goal is to correctly infer masked or replaced words from the surrounding context.

[0003] Both types of pre-trained language model use the Transformer model, which combines an Attention Encoder (attention encoder) and an Attention Decoder (attentio...
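
To make the two training goals in [0002] concrete, the hypothetical helpers below contrast them; this is illustrative Python, not GPT's or BERT's actual code.

```python
# Illustrative contrast of the two pre-training objectives (assumed shapes:
# logits is (batch, seq_len, vocab_size), ids is (batch, seq_len)).
import torch
import torch.nn.functional as F

def autoregressive_loss(logits: torch.Tensor, ids: torch.Tensor) -> torch.Tensor:
    """GPT-style goal: guess the NEXT word from the previous text."""
    return F.cross_entropy(logits[:, :-1].reshape(-1, logits.size(-1)),
                           ids[:, 1:].reshape(-1))

def autoencoding_loss(logits: torch.Tensor, ids: torch.Tensor,
                      mask: torch.Tensor) -> torch.Tensor:
    """BERT-style goal: infer only the MASKED words from their context."""
    return F.cross_entropy(logits[mask], ids[mask])
```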


Application Information

IPC(8): G06F40/289; G06N3/04; G06N3/08
CPC: G06N3/08; G06N3/045
Inventors: 白佳欣 (Bai Jiaxin), 宋彦 (Song Yan)
Owner: 创新工场(广州)人工智能研究有限公司