
Variational auto-encoder for text generation

An autoencoder and text-generation technology, applied to instruments, biological neural network models, natural language data processing, etc.; it addresses problems such as weak text semantic learning ability, long training time, and poor convergence of traditional variational autoencoders.

Pending Publication Date: 2020-11-13
CHINA JILIANG UNIV

AI Technical Summary

Problems solved by technology

[0004] Based on this, in order to solve the problems of traditional variational autoencoders, such as weak text semantic learning ability, long training time, and poor convergence, a variational autoencoder for text generation is provided that offers strong text semantic learning ability, short training time, and good convergence.



Embodiment Construction

[0021] In order to make the purpose, technical solution and advantages of the present application clearer, the present application will be further described in detail through the following embodiments and in conjunction with the accompanying drawings. It should be understood that the specific embodiments described here are only used to explain the present application, not to limit the present application.

[0022] The serial numbers assigned to components in this document, such as "first" and "second", are only used to distinguish the described objects and do not imply any order or technical meaning.

[0023] Referring to figure 1, the application provides a variational autoencoder 00 for text generation, including an encoder 01 and a decoder 02. The encoder 01 includes a first embedding layer 011, a first bidirectional long short-term memory network layer 012, and a first sampling module 013. The encoder 01 is used to learn the latent semantic features 014 of the input t...
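The embodiment names an embedding layer, a bidirectional LSTM layer, and a sampling module that produces the latent semantic features. The sampling step in a variational autoencoder is conventionally the Gaussian reparameterization trick; the sketch below shows only that step in NumPy, with random stand-in weights (`W_mu`, `W_logvar`, `feat`, and the dimensions are illustrative assumptions, not values from the patent).

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical encoder summary vector (e.g. pooled BiLSTM features).
feat = rng.standard_normal(16)

# Linear heads projecting the features to the latent mean and
# log-variance (weights are random stand-ins, not trained values).
latent_dim = 8
W_mu, W_logvar = rng.standard_normal((2, latent_dim, feat.size))
mu = W_mu @ feat
logvar = W_logvar @ feat

# Reparameterization trick: z = mu + sigma * eps with eps ~ N(0, I),
# which keeps sampling differentiable with respect to mu and logvar.
eps = rng.standard_normal(latent_dim)
z = mu + np.exp(0.5 * logvar) * eps
```

In a trained model, `z` would be fed to the decoder 02 to reconstruct the input text; here it only illustrates the shape of the computation inside a sampling module such as 013.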


Abstract

The invention provides a variational auto-encoder for text generation. A text information vector, a mean-value feature vector, and a maximum-value feature vector are constructed from the output matrix of the bidirectional long short-term memory network layer in a traditional variational auto-encoder. The variational auto-encoder learns the sequence features and semantic features of the text corpus, and the encoding process of the traditional variational auto-encoder is optimized on the basis of these features, so that the training process of the variational auto-encoder for text generation converges better and its text generation performance is more stable.
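The abstract describes pooling the BiLSTM output matrix into a mean-value feature vector and a maximum-value feature vector. A minimal NumPy sketch of that pooling, under the assumption that the two vectors are taken element-wise over the time axis (the concatenation at the end is one plausible way to form the text information vector; the abstract does not fix the fusion operator):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 6 time steps, each with a 2*H-dimensional
# BiLSTM output (H = 4 hidden units per direction).
T, H2 = 6, 8
outputs = rng.standard_normal((T, H2))  # output matrix, one row per token

# Mean-value feature vector: average the output matrix over time.
mean_feat = outputs.mean(axis=0)

# Maximum-value feature vector: element-wise max over time.
max_feat = outputs.max(axis=0)

# One possible text information vector: concatenate the two features.
text_vec = np.concatenate([mean_feat, max_feat])
```

Mean pooling summarizes the whole sequence while max pooling keeps the strongest activation per dimension, which is one way the encoder can capture both sequence and semantic features before sampling.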

Description

Technical Field

[0001] This application relates to the field of natural language processing, and in particular to a variational autoencoder for text generation.

Background Technique

[0002] With the continuous improvement of the strategic position of artificial intelligence in fields such as economics, politics, and national defense, natural language processing has also ushered in new opportunities for development. More and more natural language processing technologies have been successfully applied in sentiment analysis, document summarization, information extraction, question answering systems, machine translation, and other fields. In recent years, text generation tasks such as AI poetry writing and news report generation have also attracted much attention. The text generation task in natural language processing tries to simulate the generation process of real text by constructing an autoregressive model, and then solves the approximate distribution of the language model of the ...


Application Information

IPC(8): G06F40/126, G06F40/284, G06N3/04
CPC: G06F40/126, G06F40/284, G06N3/044, G06N3/045
Inventor: 徐新胜, 王庆林
Owner CHINA JILIANG UNIV