
Natural language generation method based on time sequence topic model

A topic model and natural language technology, applied in the field of natural language generation based on time-series topic models, which addresses the problem that existing methods cannot capture the temporal features between sentences.

Active Publication Date: 2019-11-15
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

This method inputs topic semantic information into the question answering system, which makes up for the lack of exogenous knowledge in the question answering model and increases the richness and diversity of answers. However, a single-layer topic model extracts less complete semantic information than a multi-layer topic model, and it cannot capture the temporal features between sentences.



Embodiment Construction

[0051] In order to further explain the technical means by which the present invention achieves its intended purpose and their effects, a natural language generation method based on a temporal topic model proposed according to the present invention will be described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0052] The aforementioned and other technical contents, features and effects of the present invention will be clearly presented in the following detailed description of specific embodiments with accompanying drawings. Through the description of specific embodiments, the technical means by which the present invention achieves its intended purpose and their effects can be understood more deeply and specifically; however, the accompanying drawings are provided only for reference and illustration, and are not intended to limit the technical solutions of the present invention.

[0053] It should be noted that in this art...



Abstract

The invention discloses a natural language generation method based on a time-series topic model. The method comprises the steps of: obtaining a context bag-of-words vector of each sentence in a document; generating a topic distribution vector of each sentence in the document by utilizing a time-series topic model; inputting each word of each sentence together with the corresponding topic distribution vector into a sequential language model to obtain the hidden variable of each layer corresponding to each word; splicing the hidden variables of all layers together and predicting the next word in the current sentence through a normalized exponential function; utilizing a stochastic gradient descent method to update the encoder parameters in the sequential language model and the time-series topic model; and updating the decoder parameters in the time-series topic model by sampling. The method combines a multi-layer topic model with a multi-layer language model to extract the hierarchical semantic features and hierarchical temporal information in the text topics, where the low-layer features cover a narrow semantic range and the high-layer features cover a wider one.
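The forward pass described in the abstract (topic vector fed into a multi-layer recurrent model, layer states concatenated, next word predicted by softmax) can be sketched in a few lines of numpy. Everything here is an illustrative placeholder, not the patent's actual model: the weights are random, the topic distribution `theta` is drawn at random rather than produced by the temporal topic model, and the training steps (SGD for the encoder, sampling for the decoder) are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

vocab = ["the", "model", "captures", "topic", "structure"]
V, H, L, K = len(vocab), 8, 2, 4   # vocab size, hidden size, layers, topics

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical per-sentence topic distribution; in the patent this would
# come from the time-series topic model, here it is random for illustration.
theta = softmax(rng.normal(size=K))

E = rng.normal(size=(V, H)) * 0.1             # word embeddings
Ws = [rng.normal(size=(H, H + H + K)) * 0.1,  # layer 0 sees word + theta
      rng.normal(size=(H, H + H)) * 0.1]      # layer 1 sees layer-0 output
Wout = rng.normal(size=(V, L * H)) * 0.1      # reads concat of all layers

def step(h_layers, word_idx):
    """One recurrent step: each layer mixes its state with the input below."""
    inp = np.concatenate([E[word_idx], theta])  # word + topic vector
    new = []
    for l in range(L):
        z = np.concatenate([h_layers[l], inp])
        h_new = np.tanh(Ws[l] @ z)
        new.append(h_new)
        inp = h_new
    return new

h = [np.zeros(H) for _ in range(L)]
for w in ["the", "model"]:
    h = step(h, vocab.index(w))

# Splice the hidden variables of all layers together and predict the next
# word with a normalized exponential function (softmax).
probs = softmax(Wout @ np.concatenate(h))
next_word = vocab[int(np.argmax(probs))]
```

The point of the sketch is the data flow: the topic vector enters at the lowest layer alongside each word, and the prediction reads the concatenation of every layer's hidden state, so both narrow low-layer and broad high-layer semantics influence the next word.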

Description

technical field

[0001] The invention belongs to the technical field of natural language processing, and in particular relates to a natural language generation method based on a time-series topic model.

Background technique

[0002] In the field of natural language processing, topic models and language models are widely used text analysis methods. A topic model analyzes the bag-of-words form of the text, considering only the word counts in a document and ignoring the temporal relationships between words. A multi-layer topic model can greatly improve the ability to model text and obtain hidden variables that carry richer semantic information.

[0003] A language model performs temporal modeling on the text and can capture the temporal relationships between words, enabling various natural language processing tasks such as text summarization, machine translation, and image annotation. The language model usually gives the previous word,...
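The bag-of-words representation described in [0002] can be illustrated concretely; the vocabulary and sentences below are made up for the example. The key property is exactly the one the background notes: word order is discarded, so two differently ordered sentences map to the same vector.

```python
from collections import Counter

def bow(sentence, vocab):
    """Count how often each vocabulary word occurs; order is ignored."""
    counts = Counter(sentence.lower().split())
    return [counts[w] for w in vocab]

vocab = ["topic", "model", "language", "the"]

a = bow("the topic model", vocab)
b = bow("model the topic", vocab)  # same words, different order
# a == b == [1, 1, 0, 1] -- the temporal relationship is lost
```

This is why, as [0003] explains, a language model is needed alongside the topic model: only the former sees the order in which words appear.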

Claims


Application Information

IPC(8): G06F17/28, G06F17/27, G06N3/04, G06N3/08
CPC: G06N3/049, G06N3/08, G06N3/045
Inventor 陈渤鲁瑞颖郭丹丹
Owner XIDIAN UNIV