
BERT-based sequence generation method and device

A sequence generation and sequence data technology, which is applied in the field of BERT-based sequence generation methods and devices, can solve the problems that the BERT model cannot be applied to sequence generation tasks, and cannot handle sequence generation tasks, etc., and achieve the effect of improving the processing effect

Pending Publication Date: 2021-10-22
ALIBABA GRP HLDG LTD

AI Technical Summary

Problems solved by technology

[0003] However, given the above structure of the BERT model, the natural language processing tasks the model can currently handle are concentrated on text classification tasks, such as sentiment classification, and sequence tagging tasks, such as word segmentation and part-of-speech tagging; it cannot handle sequence generation tasks such as sentence simplification and machine translation.
Moreover, existing sequence generation models use a one-way, left-to-right decoding method when processing sequence generation tasks. As a result, the training objective and generation procedure of the BERT model differ significantly from those of existing sequence generation models, which makes it difficult to apply the BERT model to sequence generation tasks.
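The mismatch described above can be illustrated with a toy comparison: a left-to-right decoder conditions each prediction only on earlier tokens, while BERT's masked objective conditions on tokens on both sides of the slot. The following is a minimal sketch with hypothetical function names and a toy sentence, not the patent's model:

```python
# Toy illustration of the decoding mismatch: autoregressive models see
# only the left context, BERT-style masked prediction sees both sides.
# All names here are illustrative assumptions.

def left_to_right_context(tokens, position):
    """A left-to-right decoder conditions only on earlier tokens."""
    return tokens[:position]

def bidirectional_context(tokens, position):
    """BERT's masked objective conditions on both sides of the slot."""
    return tokens[:position] + tokens[position + 1:]

sent = ["the", "cat", "[MASK]", "on", "mats"]
print(left_to_right_context(sent, 2))   # ['the', 'cat']
print(bidirectional_context(sent, 2))   # ['the', 'cat', 'on', 'mats']
```

Because the masked predictor is trained with the full surrounding context available, its objective does not directly match the left-to-right factorization used at generation time, which is the gap the invention addresses.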

Method used




Embodiment Construction

[0026] Exemplary embodiments of the present invention are described in more detail below with reference to the accompanying drawings. Although exemplary embodiments of the present invention are shown in the drawings, it should be understood that the invention may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that the disclosure will be thorough and will fully convey the scope of the present invention to those skilled in the art.

[0027] Natural language processing studies models of language ability and language use. Such a model is realized by establishing a computational algorithmic framework, improved through training and evaluation, and finally applied in practical systems. Scenarios in which natural language processing models are applied include information retrieval, machine translation, document classification, information extraction, text mining, etc. There...



Abstract

The invention discloses a BERT-based sequence generation method and device, relates to the technical field of natural language processing, and mainly aims to process sequence generation tasks with a BERT model. The main technical scheme comprises the steps of: obtaining a sequence generation model constructed on the basis of a BERT model; setting iteration parameters of the sequence generation model; inputting first sequence data into the sequence generation model; and generating second sequence data with the sequence generation model according to the first sequence data and the iteration parameters.
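The four steps of the scheme can be sketched as follows. This is a hedged illustration only: `ToySequenceModel`, `max_iterations`, and the dummy fill-in rule are assumptions standing in for the patent's actual BERT-based model and its iteration parameters.

```python
# Hedged sketch of the abstract's four steps, with a stand-in model
# instead of a real BERT checkpoint. All names are illustrative.

MASK = "[MASK]"

class ToySequenceModel:
    """Stand-in for a BERT-based sequence generation model: each
    iteration it fills every [MASK] slot in the target sequence."""

    def __init__(self, max_iterations=3):
        # Step 2: set the iteration parameters of the model.
        self.max_iterations = max_iterations

    def predict(self, tokens, source):
        # Dummy "prediction": copy source tokens into masked slots.
        out = list(tokens)
        for i, tok in enumerate(out):
            if tok == MASK:
                out[i] = source[i % len(source)].upper()
        return out

    def generate(self, first_sequence, target_len=4):
        # Step 3: input the first sequence data; start fully masked.
        target = [MASK] * target_len
        # Step 4: generate the second sequence data iteratively.
        for _ in range(self.max_iterations):
            target = self.predict(target, first_sequence)
            if MASK not in target:
                break
        return target

model = ToySequenceModel(max_iterations=2)  # Steps 1-2
print(model.generate(["hello", "world"]))
# ['HELLO', 'WORLD', 'HELLO', 'WORLD']
```

In a real instantiation, `predict` would be a forward pass of the BERT model that re-predicts the masked positions, and the iteration parameter would bound how many refinement passes are run.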

Description

Technical field

[0001] The present invention relates to the technical field of natural language processing, and in particular to a BERT-based sequence generation method and device.

Background technique

[0002] The BERT model is a language model proposed by Google based on a bidirectional Transformer. The BERT model combines a pre-training model with downstream task models; that is, the BERT model itself is still used when performing downstream tasks. BERT stands for Bidirectional Encoder Representations from Transformers, i.e., a bidirectional encoder representation based on the Transformer. Here, "bidirectional" means that when processing a word, the model can consider the words both before and after it, so as to capture the semantics of the context. BERT pre-trains deep bidirectional representations by jointly conditioning on context in all layers; thus, the pre-trained BERT representations can be fine-tuned with an additional output layer to quickl...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F40/126; G06F40/30
CPC: G06F40/126; G06F40/30
Inventor: 张志锐, 骆卫华, 陈博兴
Owner: ALIBABA GRP HLDG LTD