
Language model training method and system in self-reconstruction mode and computer readable medium

A language model training technology, applied in natural language data processing, computing, and neural learning methods, which addresses the problems of low prediction accuracy and high cost in existing language models, with the effects of reducing the number of model parameters, reducing model size, and speeding up computation.

Pending Publication Date: 2020-02-25
创新工场(广州)人工智能研究有限公司
Cites: 0 · Cited by: 7

AI Technical Summary

Problems solved by technology

[0007] Aiming at the defects of low prediction accuracy and high cost in existing language models, the present invention provides a self-reconstruction language model training method, system, and computer-readable medium.


Detailed Description of the Embodiments

[0042] In order to make the purpose, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention, not to limit it.

[0043] Referring to Figure 1, the first embodiment of the present invention provides a language model training method in a self-reconstruction mode, which includes the following steps:

[0044] Step S1: extract at least one sentence to be trained from the pre-training text, segment it into a single-word sequence, and map the corresponding single-word sequence into a text matrix through position encoding;
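
For illustration only, a minimal PyTorch sketch of what Step S1 could look like: the character-level split, the toy vocabulary, `d_model`, and the function name `sentence_to_text_matrix` are assumptions for this example, not taken from the patent.

```python
# Hypothetical sketch of Step S1: split a sentence into a character
# ("single-word") sequence and map it to a text matrix via token
# embeddings plus sinusoidal position encoding.
import math
import torch

def sentence_to_text_matrix(sentence: str, vocab: dict, d_model: int = 64) -> torch.Tensor:
    # Split into single characters, as the patent's "single-word sequence".
    tokens = list(sentence)
    ids = torch.tensor([vocab[t] for t in tokens])

    # Token embedding table (randomly initialised here, for illustration).
    embed = torch.nn.Embedding(len(vocab), d_model)
    x = embed(ids)  # (seq_len, d_model)

    # Sinusoidal position encoding, as in the original Transformer paper.
    pos = torch.arange(len(tokens)).unsqueeze(1).float()
    div = torch.exp(torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model))
    pe = torch.zeros(len(tokens), d_model)
    pe[:, 0::2] = torch.sin(pos * div)
    pe[:, 1::2] = torch.cos(pos * div)

    return x + pe  # the "text matrix": one row per character

vocab = {ch: i for i, ch in enumerate("今天天气很好")}
matrix = sentence_to_text_matrix("今天天气很好", vocab)
print(matrix.shape)  # torch.Size([6, 64])
```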

[0045] Step S2: establish a neural network structure combining the Transformer model and the self-attention mechanism;
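
As a sketch only, Step S2's structure can be approximated with PyTorch's built-in self-attention Transformer encoder; the layer count, sizes, and the vocabulary-reconstruction head are assumptions chosen to be consistent with the self-reconstruction objective, not the patent's exact architecture.

```python
# Hypothetical sketch of Step S2: a neural network structure built
# from a Transformer encoder with multi-head self-attention.
import torch

d_model, n_heads, n_layers, vocab_size = 64, 4, 2, 5000

encoder_layer = torch.nn.TransformerEncoderLayer(
    d_model=d_model, nhead=n_heads, dim_feedforward=256, batch_first=True
)
encoder = torch.nn.TransformerEncoder(encoder_layer, num_layers=n_layers)

# Output head mapping each position back to vocabulary logits,
# so the model can reconstruct the input characters.
head = torch.nn.Linear(d_model, vocab_size)

x = torch.randn(1, 6, d_model)   # a batch of one text matrix from Step S1
logits = head(encoder(x))        # (1, 6, vocab_size)
print(logits.shape)
```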

[0046] Step S3: the text matrix is used as the input sample of the neural network structure, and the Transformer model is used as the parameter for training and optimization to obtain a target function;

[0047] Step S4: steps S1 to S3 are repeated to update the target function until a set optimization condition is reached, so as to obtain a pre-training model.
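
A hedged sketch of the Steps S3-S4 training loop follows. The patent's exact target function is not disclosed in this excerpt, so a plain self-reconstruction loss (predict every input token) stands in, with random token IDs in place of real pre-training batches.

```python
# Hypothetical sketch of Steps S3-S4: feed text matrices through the
# network, optimise a reconstruction objective, and repeat until a set
# condition (here, a fixed number of steps) is reached.
import torch

vocab_size, d_model, seq_len, batch = 5000, 64, 32, 8

embed = torch.nn.Embedding(vocab_size, d_model)
layer = torch.nn.TransformerEncoderLayer(
    d_model, nhead=4, dim_feedforward=256, batch_first=True
)
encoder = torch.nn.TransformerEncoder(layer, num_layers=2)
head = torch.nn.Linear(d_model, vocab_size)

params = list(embed.parameters()) + list(encoder.parameters()) + list(head.parameters())
opt = torch.optim.Adam(params, lr=1e-4)
loss_fn = torch.nn.CrossEntropyLoss()

for step in range(100):  # repeat S1-S3 until the stop condition (assumed: 100 steps)
    ids = torch.randint(0, vocab_size, (batch, seq_len))  # stand-in pre-training batch
    logits = head(encoder(embed(ids)))                    # (batch, seq_len, vocab_size)
    # Stand-in self-reconstruction objective: predict each input token.
    # In practice the inputs would be corrupted/masked first; the patent's
    # exact objective is not shown in this excerpt.
    loss = loss_fn(logits.reshape(-1, vocab_size), ids.reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
```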

Abstract

The invention relates to the technical field of language processing, and in particular to a language model training method in a self-reconstruction mode, which comprises the following steps: S1, extracting at least one sentence to be trained from a pre-training text, segmenting the sentence into a single-word sequence, and mapping the corresponding single-word sequence into a text matrix through position encoding; S2, establishing a neural network structure combining a Transformer model and a self-attention mechanism; S3, taking the text matrix as an input sample of the neural network structure, and training and optimizing with the Transformer model as the parameter to obtain a target function; and S4, repeating steps S1 to S3 to update the target function until a set optimization condition is reached, so as to obtain a pre-training model. The invention also provides a corresponding system and a computer-readable medium.

Description

【Technical Field】

[0001] The present invention relates to the technical field of language processing, and in particular to a language model training method, system, and computer-readable medium in a self-reconstruction mode.

【Background Art】

[0002] At present, the most advanced pre-trained language models fall into two categories: autoregressive language models (Autoregressive Model) and autoencoding language models (Autoencoding Model). GPT and GPT-2 are autoregressive language models with strong performance; the training goal of an autoregressive model is to correctly guess the next word from the preceding text. BERT is a representative autoencoding language model; its training goal is to correctly infer masked or replaced words from the context.

[0003] Both types of pre-trained language models use the Transformer model, which combines an Attention Encoder (attention encoder) and an Attention Decoder (attention decoder)...
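
To make the contrast in [0002] concrete, a small sketch of the two training objectives; the tensor shapes, the mask positions, and the use of cross-entropy here are illustrative assumptions rather than details from the patent.

```python
# Illustrative contrast of the two pre-training objectives above.
import torch

vocab_size = 5000
logits = torch.randn(1, 10, vocab_size)      # model outputs for 10 positions
ids = torch.randint(0, vocab_size, (1, 10))  # the token sequence
loss_fn = torch.nn.CrossEntropyLoss()

# Autoregressive (GPT-style): position t predicts token t+1.
ar_loss = loss_fn(logits[:, :-1].reshape(-1, vocab_size), ids[:, 1:].reshape(-1))

# Autoencoding (BERT-style): only the masked positions are scored.
masked = torch.tensor([1, 4, 7])             # positions assumed masked out
ae_loss = loss_fn(logits[0, masked], ids[0, masked])
```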

Application Information

IPC(8): G06F40/289; G06N3/04; G06N3/08
CPC: G06N3/08; G06N3/045
Inventors: 白佳欣 (Bai Jiaxin), 宋彦 (Song Yan)
Owner: 创新工场(广州)人工智能研究有限公司