
Question and answer task downstream task processing method and model

A question-answering downstream task processing technology, applied in neural learning methods, electrical digital data processing, biological neural network models, etc., addressing problems such as the failure of existing models to perform semantic matching.

Active Publication Date: 2021-04-30
四川好久来科技有限公司
Cites: 10 · Cited by: 2

AI Technical Summary

Problems solved by technology

[0003] In summary, existing deep learning language models have the following shortcomings: (1) they attend to unimportant parts of the text while ignoring the important parts; (2) affected by interfering sentences in texts containing many identical words, they can only match on the surface text itself and fail to match semantically.




Embodiment Construction

[0073] The present invention will be further described in detail below in conjunction with the accompanying drawings.

[0074] As shown in Figure 1, the present invention discloses a downstream task processing method for a question answering task, comprising the following steps:

[0075] S1. Input the question and the context into the pre-trained language module to obtain the language-associated features of the context;

[0076] Specifically, the sequence formed by concatenating the question and the context is fed into the pre-trained language module, i.e. the encoder, for encoding.
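As an illustration of this concatenate-and-encode step, the following minimal numpy sketch uses a toy whitespace tokenizer and a random embedding lookup as a stand-in for the pre-trained encoder. The [CLS]/[SEP] markers and all names here are illustrative assumptions, not the patent's specification:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy whitespace tokenizer; a real system would use the pre-trained
# model's own subword tokenizer.
def tokenize(text):
    return text.lower().split()

question = "who wrote the model"
context = "the model was written by the authors"

# Concatenate question and context into one input sequence, using the
# [CLS]/[SEP] markers commonly employed by pre-trained encoders.
tokens = ["[CLS]"] + tokenize(question) + ["[SEP]"] + tokenize(context) + ["[SEP]"]

vocab = {tok: i for i, tok in enumerate(dict.fromkeys(tokens))}
ids = np.array([vocab[t] for t in tokens])

d_model = 16                       # hidden size of the stand-in encoder
embedding = rng.standard_normal((len(vocab), d_model))

# Stand-in "encoder": a plain embedding lookup. In the patent's method
# this would be the pre-trained language module mapping the concatenated
# sequence to language-associated features.
H = embedding[ids]                 # shape: (sequence_length, d_model)
print(H.shape)
```

The concatenated sequence is encoded as a whole, so each position's feature can depend on both the question and the context.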

[0077] S2. Apply a bidirectional attention mechanism to the language-associated features of the context to obtain the key-information-aware context representation H_CKey and the key-information-aware question representation H_QKey;
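The excerpt does not give the exact attention equations, so the following numpy sketch assumes a BiDAF-style bidirectional attention over hypothetical encoder outputs; the dot-product similarity and the placeholder matrices C and Q are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical encoder outputs: C for the context, Q for the question.
Tc, Tq, d = 10, 4, 16              # context length, question length, hidden size
C = rng.standard_normal((Tc, d))
Q = rng.standard_normal((Tq, d))

# Similarity matrix between every (context token, question token) pair.
S = C @ Q.T                        # (Tc, Tq)

# Context-to-question attention: each context token attends over the
# question, yielding a key-information-aware context representation.
H_CKey = softmax(S, axis=1) @ Q    # (Tc, d)

# Question-to-context attention: each question token attends over the
# context, yielding a key-information-aware question representation.
H_QKey = softmax(S.T, axis=1) @ C  # (Tq, d)

print(H_CKey.shape, H_QKey.shape)
```

Running attention in both directions lets key information flow from the question into the context representation and vice versa.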

[0078] S3. Using bidirectional attention flow, based on the key-information-aware context representation H_CKey and the key-information pe...



Abstract

The invention discloses a question answering downstream task processing method and model. The method comprises the steps of: obtaining a key-information-aware context representation H_CKey and a key-information-aware question representation H_QKey, and generating a question-aware context representation G; calculating an update vector z and a memory weight g based on G, and updating G to obtain an output vector Gg; generating a context-granularity vector G_C and a sequence-granularity vector G_CLS, generating an output vector C_out, using softmax to calculate the probability of each character in the context serving as the answer's starting or ending position, and extracting the contiguous subsequence with the maximum probability as the answer. The invention provides a bidirectional cascading attention mechanism and, based on granular computing, constructs a mechanism that unifies close reading and skimming together with a multi-granularity module, so that the model effectively attends to and screens useful information, better understands the text at multiple granularities, gives more accurate answers, and improves the accuracy of text analysis. Performance is improved over the baseline model.
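The abstract's gated update and softmax span extraction can be sketched in numpy as follows. The exact update equations for z and g are not given in this excerpt, so a GRU-style gating form is assumed, and all weight matrices are random placeholders:

```python
import numpy as np

rng = np.random.default_rng(2)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

Tc, d = 8, 16                       # context length, hidden size
G = rng.standard_normal((Tc, d))    # question-aware context representation

# Gated update of G (assumed GRU-style form: the patent names an update
# vector z and a memory weight g but the exact equations are not shown).
Wz = rng.standard_normal((d, d))
Wg = rng.standard_normal((d, d))
z = np.tanh(G @ Wz)                 # candidate update vector
g = sigmoid(G @ Wg)                 # memory weight in (0, 1)
Gg = g * z + (1.0 - g) * G          # updated output vector

# Softmax over context positions gives start/end probabilities.
w_start = rng.standard_normal(d)
w_end = rng.standard_normal(d)
p_start = softmax(Gg @ w_start)
p_end = softmax(Gg @ w_end)

# Extract the contiguous subsequence (start <= end) with maximal joint
# probability p_start[i] * p_end[j] as the answer span.
best, span = -1.0, (0, 0)
for i in range(Tc):
    for j in range(i, Tc):
        score = p_start[i] * p_end[j]
        if score > best:
            best, span = score, (i, j)
print(span, best)
```

The start <= end constraint in the loop guarantees the extracted answer is a valid contiguous subsequence of the context.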

Description

Technical Field

[0001] The invention belongs to the technical field of natural language processing, and in particular relates to a downstream task processing method and model for a question answering task.

Background Technique

[0002] Machine reading comprehension is a challenging task in natural language processing, which aims to determine the correct answer to a question based on a given context. Common machine reading comprehension tasks are divided into cloze, multiple choice, span extraction, and free answer according to the form of the answer. Recently developed pre-trained language models have achieved a series of successes in various natural language understanding tasks by virtue of their powerful text representation ability. These pre-trained language models are used as encoders of deep learning language models to extract language-associated features of the relevant texts and are fine-tuned in combination with task-specific downstream data processing structures. With ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F16/332; G06F16/33; G06N3/04; G06N3/08
CPC: G06F16/3329; G06F16/3344; G06N3/08; G06N3/044; G06N3/045
Inventors: 王勇; 雷冲; 陈秋怡
Owner: 四川好久来科技有限公司