Text generation model training method and device, storage medium and computer equipment

A text generation model and training method technology, applied in the fields of computing, computer components and biological neural network models, which can solve problems such as failure to consider deep-level topic correlations, reply texts that deviate from the overall logic of a multi-round dialogue, and the inability to guarantee the accuracy of generated reply text, thereby achieving the effect of improving generation accuracy.

Pending Publication Date: 2022-03-25
PING AN TECH (SHENZHEN) CO LTD

AI Technical Summary

Problems solved by technology

However, when training the deep learning model, this approach does not consider whether the generated reply text is deeply related to the contextual dialogue topics of the multi-round dialogue. As a result, the reply text generated by the deep learning model may deviate from the overall logic of the multi-round dialogue, and the generation accuracy of the reply text cannot be guaranteed.




Embodiment Construction

[0036] Hereinafter, the present invention will be described in detail with reference to the drawings and examples. It should be noted that, in the case of no conflict, the embodiments in the present application and the features in the embodiments can be combined with each other.

[0037] At present, the process of training the deep learning model does not consider whether the generated reply text is deeply related to the contextual dialogue topics of the multi-round dialogue. As a result, the reply text generated by the deep learning model deviates from the overall logic of the multi-round dialogue, and the generation accuracy of the reply text cannot be guaranteed.

[0038] In order to solve the above problems, an embodiment of the present invention provides a training method for a text generation model. As shown in Figure 1, the method includes:

[0039] 101. Obtain question text and multiple context texts in multiple rounds of dialogue.

[0040] Wherein, at le...
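As a concrete illustration of step 101, the following is a minimal Python sketch that splits a multi-round dialogue into its context texts and its question text. The list-of-utterances input format and the helper name split_dialogue are assumptions made for illustration and are not taken from the embodiment.

from typing import List, Tuple


def split_dialogue(dialogue: List[str]) -> Tuple[List[str], str]:
    # Treat every turn except the last as a context text and the final
    # turn as the question text to be answered by the generated reply.
    if len(dialogue) < 2:
        raise ValueError("a multi-round dialogue needs at least one context turn and a question")
    context_texts = dialogue[:-1]
    question_text = dialogue[-1]
    return context_texts, question_text


# Usage on a toy three-turn dialogue
contexts, question = split_dialogue([
    "I want to book a flight to Shanghai.",
    "Sure, which date are you planning to travel?",
    "Is there anything available next Friday morning?",
])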



Abstract

The invention discloses a text generation model training method and device, a storage medium and computer equipment, and relates to the technical field of machine learning. The method comprises the following steps: acquiring a question text and a plurality of context texts in a multi-round dialogue; generating a first text vector matrix corresponding to the plurality of context texts and a reply text vector matrix corresponding to the question text by using an initial text vector generation model; utilizing an initial topic vector extraction model to respectively extract a context text topic vector corresponding to the first text vector matrix and a reply text topic vector corresponding to the reply text vector matrix; constructing a loss function based on the context text topic vector and the reply text topic vector; and jointly carrying out iterative training on the initial text vector generation model and the initial topic vector extraction model based on the loss function, thereby constructing a text generation model. The method is suitable for training text generation models.
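The following is a minimal, runnable sketch of the training procedure summarized above, written with PyTorch. One module stands in for the initial text vector generation model, another for the initial topic vector extraction model, a topic-consistency loss couples the context text topic vector and the reply text topic vector, and both modules are trained jointly. All concrete choices here (GRU encoder, mean-pooling topic extractor, cosine-based loss, random toy token ids) are illustrative assumptions and not the architecture disclosed by the invention.

import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB_SIZE, EMBED_DIM, HIDDEN_DIM, TOPIC_DIM = 1000, 64, 128, 32


class TextVectorGenerationModel(nn.Module):
    """Maps token ids to a text vector matrix (one vector per token)."""

    def __init__(self):
        super().__init__()
        self.embedding = nn.Embedding(VOCAB_SIZE, EMBED_DIM)
        self.encoder = nn.GRU(EMBED_DIM, HIDDEN_DIM, batch_first=True)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        vectors, _ = self.encoder(self.embedding(token_ids))
        return vectors                       # shape: (batch, seq_len, HIDDEN_DIM)


class TopicVectorExtractionModel(nn.Module):
    """Compresses a text vector matrix into a single topic vector."""

    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(HIDDEN_DIM, TOPIC_DIM)

    def forward(self, vector_matrix: torch.Tensor) -> torch.Tensor:
        pooled = vector_matrix.mean(dim=1)   # simple mean pooling over tokens
        return self.proj(pooled)             # shape: (batch, TOPIC_DIM)


def topic_consistency_loss(context_topic: torch.Tensor, reply_topic: torch.Tensor) -> torch.Tensor:
    """Penalize replies whose topic vector drifts away from the context topic vector."""
    return (1.0 - F.cosine_similarity(context_topic, reply_topic, dim=-1)).mean()


# Joint iterative training of both models on toy, randomly generated token ids.
vector_model = TextVectorGenerationModel()
topic_model = TopicVectorExtractionModel()
optimizer = torch.optim.Adam(
    list(vector_model.parameters()) + list(topic_model.parameters()), lr=1e-3
)

for step in range(100):
    context_ids = torch.randint(0, VOCAB_SIZE, (8, 20))   # stand-in for the context texts
    question_ids = torch.randint(0, VOCAB_SIZE, (8, 12))  # stand-in for the question text

    first_text_matrix = vector_model(context_ids)          # first text vector matrix
    reply_text_matrix = vector_model(question_ids)         # reply text vector matrix

    context_topic = topic_model(first_text_matrix)          # context text topic vector
    reply_topic = topic_model(reply_text_matrix)            # reply text topic vector

    loss = topic_consistency_loss(context_topic, reply_topic)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

In a practical system the topic-consistency term would normally be combined with a standard text generation loss (for example, cross-entropy against a reference reply); the sketch isolates only the topic-based component that the abstract emphasizes.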

Description

Technical Field

[0001] The present invention relates to the technical field of machine learning, and in particular to a training method, device, storage medium and computer equipment for a text generation model.

Background Technique

[0002] In natural language processing, multi-round dialogue has always been a research hotspot, and how to accurately and effectively generate reply text is of great significance to research on multi-round dialogue.

[0003] At present, reply text is usually generated by a deep learning model based on an understanding of the context in a multi-round dialogue. However, when training the deep learning model, this approach does not consider whether the generated reply text is deeply related to the contextual dialogue topics of the multi-round dialogue. As a result, the reply text generated by the deep learning model deviates from the overall logic of the multi-round dialogue and cannot guarantee the accur...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06F16/31G06F16/33G06F40/258G06F40/289G06K9/62G06N3/04
CPCG06F16/316G06F16/3344G06F40/289G06F40/258G06N3/044G06F18/214
Inventor 舒畅陈又新肖京
Owner PING AN TECH (SHENZHEN) CO LTD