Question text generation method and device, equipment and medium

A text and question generation technology in the field of computer data processing, addressing problems such as limited question types and cumbersome, time-consuming rule construction.

Active Publication Date: 2018-11-20
BEIJING BAIDU NETCOM SCI & TECH CO LTD


Problems solved by technology

Methods based on conversion rules rely on extensive manual participation and hand-summarized rules, which is very cumbersome and time-consuming, and such methods usually lack diversity in word rec...



Examples


Embodiment 1

[0041] Figure 1 is a schematic flowchart of a question text generation method provided in Embodiment 1 of the present invention. This embodiment is applicable to situations in which a neural network is used to generate a corresponding question text from an answer text. The method is executed by a question text generating device, which is implemented in software and/or hardware and is configured in an electronic device for generating question text; the electronic device may be a server or a terminal device. The question text generation method includes:

[0042] S110. Input the attribute parameters of the multiple input words of the answer text into the multiple models of the encoder, and use the encoder to calculate the encoding hidden layer state vector of each input word.

[0043] Wherein, an input word is a lexical unit in the answer text sentence, which may be a single character or a word composed of multiple characters, that is, a unit with independent...
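Step S110 can be sketched as follows. This is a minimal illustration rather than the patent's implementation: the toy vocabulary, the dimensions, and the choice of "word embedding concatenated with an answer-position flag" as the attribute parameters are all assumptions, and a plain tanh recurrence stands in for the encoder models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy answer text; words and dimensions are illustrative.
words = ["the", "capital", "is", "paris"]
answer_flags = [0, 0, 0, 1]          # 1 = word belongs to the answer span
emb_dim, hid_dim = 4, 6

embeddings = {w: rng.normal(size=emb_dim) for w in words}

def attribute_vector(word, flag):
    # Assumed attribute parameters: embedding + answer-position indicator.
    return np.concatenate([embeddings[word], [float(flag)]])

# A minimal recurrent encoder: h_t = tanh(W x_t + U h_{t-1} + b).
W = rng.normal(scale=0.1, size=(hid_dim, emb_dim + 1))
U = rng.normal(scale=0.1, size=(hid_dim, hid_dim))
b = np.zeros(hid_dim)

h = np.zeros(hid_dim)
hidden_states = []
for w, f in zip(words, answer_flags):
    h = np.tanh(W @ attribute_vector(w, f) + U @ h + b)
    hidden_states.append(h)

# One encoding hidden layer state vector per input word.
hidden_states = np.stack(hidden_states)
```

The key property is simply that each input word yields one state vector whose value depends on the word's own attributes and on the words preceding it.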

Embodiment 2

[0074] Figure 3 is a schematic flowchart of a question text generation method provided by Embodiment 2 of the present invention. This embodiment further optimizes the technical solutions of the foregoing embodiments.

[0075] Further, before the operation "using the attention mechanism model, calculate the context vector of the current output word according to the answer position parameters, the encoding hidden layer state vectors and the decoding hidden layer state vector of the current output word", an operation is added: "according to the answer position parameters of the answer input words and the non-answer input words, determine the distance between each non-answer input word and the answer input words as the answer distance parameter". According to the context vector and the decoding hidden layer state vector of the previous output word, the device calculates, for each candidate word, the decoding hidden layer state vector of the current o...
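A toy sketch of the answer distance parameter and one way it could feed into attention scoring. The example positions, the distance-to-nearest-answer-word definition, and the fixed `-0.5` bias weight are illustrative assumptions; in the patent the attention mechanism model itself determines how the distance influences the context vector.

```python
import numpy as np

# Answer position parameters: 1 marks answer input words.
answer_flags = np.array([0, 0, 1, 1, 0, 0])
positions = np.arange(len(answer_flags))
answer_positions = positions[answer_flags == 1]

# Answer distance parameter: distance from each input word to the
# nearest answer input word (0 for answer words themselves).
answer_distance = np.array(
    [min(abs(p - a) for a in answer_positions) for p in positions]
)

rng = np.random.default_rng(1)
hid = 4
enc_states = rng.normal(size=(len(positions), hid))   # encoder states
dec_state = rng.normal(size=hid)                      # current decoder state

# Dot-product scores biased by answer distance: words closer to the
# answer receive more attention (the -0.5 weight is an assumption).
scores = enc_states @ dec_state - 0.5 * answer_distance
weights = np.exp(scores - scores.max())
weights /= weights.sum()                              # softmax

context = weights @ enc_states                        # context vector
```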

Embodiment 3

[0102] Figure 5 is a schematic flowchart of a question text generation method provided by Embodiment 3 of the present invention. This embodiment further optimizes the technical solutions of the foregoing embodiments.

[0103] Further, the operation "input the attribute parameters of the multiple input words of the answer text into the multiple models of the encoder, and use the encoder to calculate the encoding hidden layer state vector of each input word" is refined into "input the attribute parameters of each input word in the answer text into each bidirectional LSTM model in the encoder, and use each bidirectional LSTM model to calculate the encoding hidden layer state vector of each input word". Building the encoder from bidirectional LSTM models, which distinguish forward word order from reverse word order, allows the encoder to avoid long-range dependency problems in complex scenarios and better captur...
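The refined encoder can be sketched with a hand-rolled bidirectional LSTM in numpy. The dimensions and random parameters are placeholders; the point of the sketch is that each input word's encoding hidden layer state vector concatenates a forward-order state with a reverse-order state.

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM step; gates are stacked as [input, forget, cell, output]."""
    z = W @ x + U @ h + b
    i, f, g, o = np.split(z, 4)
    sigmoid = lambda v: 1.0 / (1.0 + np.exp(-v))
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    c = f * c + i * np.tanh(g)
    h = o * np.tanh(c)
    return h, c

def run_lstm(xs, W, U, b, hid):
    h, c = np.zeros(hid), np.zeros(hid)
    states = []
    for x in xs:
        h, c = lstm_step(x, h, c, W, U, b)
        states.append(h)
    return states

rng = np.random.default_rng(2)
in_dim, hid = 5, 3
xs = [rng.normal(size=in_dim) for _ in range(4)]   # attribute vectors

def params():
    return (rng.normal(scale=0.1, size=(4 * hid, in_dim)),
            rng.normal(scale=0.1, size=(4 * hid, hid)),
            np.zeros(4 * hid))

fwd = run_lstm(xs, *params(), hid)                 # forward word order
bwd = run_lstm(xs[::-1], *params(), hid)[::-1]     # reverse word order

# Encoding hidden layer state: both directions concatenated per word.
enc_states = [np.concatenate([f, b_]) for f, b_ in zip(fwd, bwd)]
```

Because the reverse pass sees the words after position t, the concatenated state at t summarizes context from both directions, which is what mitigates the long-range dependency issue the paragraph describes.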



Abstract

An embodiment of the invention discloses a question text generation method, device, equipment and medium. The method comprises the following steps: an encoder determines the encoding hidden layer state vector of each input word based on the attribute parameters of each input word; for each candidate word, a decoder and an attention mechanism model are used to determine the context vector and the decoding hidden layer state vector of the candidate word as the current output word; and for at least one generation mode, the probability value of each candidate word as the current output word and the weight value of the generation mode are calculated from the context vector and the decoding hidden layer state vector, a final probability is determined from them, and the current output word is screened from the candidate words based on the final probability. The technical scheme improves the accuracy and diversity of generating question text from answer text.
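The last step of the abstract, mixing per-mode probabilities with per-mode weights, can be sketched as follows. Two generation modes are assumed here (generate from a fixed vocabulary, or copy a word from the answer text, in the style of pointer-generator networks); the random projections stand in for learned parameters and are not from the patent.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

vocab = ["what", "is", "the", "capital", "of", "france", "?"]
source_words = ["the", "capital", "of", "france", "is", "paris"]

rng = np.random.default_rng(3)
# Stand-ins for the context vector and decoding hidden layer state.
ctx, dec = rng.normal(size=4), rng.normal(size=4)
feat = np.concatenate([ctx, dec])

# Mode 1: generate from the vocabulary.
p_gen = softmax(rng.normal(size=(len(vocab), 8)) @ feat)

# Mode 2: copy a word from the answer text; copy mass on words outside
# the vocabulary (e.g. "paris") is simply dropped in this toy version.
copy_scores = softmax(rng.normal(size=(len(source_words), 8)) @ feat)
p_copy = np.zeros(len(vocab))
for w, s in zip(source_words, copy_scores):
    if w in vocab:
        p_copy[vocab.index(w)] += s

# Weight value of each generation mode, from the same features.
w_modes = softmax(rng.normal(size=(2, 8)) @ feat)

# Final probability: weighted mixture; the current output word is the
# candidate with the highest final probability.
p_final = w_modes[0] * p_gen + w_modes[1] * p_copy
current_output = vocab[int(np.argmax(p_final))]
```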

Description

Technical Field

[0001] The embodiments of the present invention relate to computer data processing technology, and in particular to a question text generation method, device, equipment and medium.

Background

[0002] Question generation is an important branch of artificial intelligence with many practical application scenarios: in education, automatically generating questions from reading comprehension materials can help students understand the material or assist teachers in examining students; in human-computer dialogue, questions can be generated from user responses, thereby enhancing user stickiness; in question answering, it can help a question answering system automatically label more corpus data.

[0003] Question generation technology is generally divided into two steps. The first step is to locate the target segment of the question from natural language, that is, the answer text; the second step is to generate the question text ...

Claims


Application Information

IPC(8): G06F17/30, G06F17/27, G06N3/04
CPC: G06F40/284, G06N3/044, G06N3/045
Inventor: 孙兴武
Owner: BEIJING BAIDU NETCOM SCI & TECH CO LTD