
A text summarization model generation algorithm that fuses information selection and semantic association

A technology combining semantic association and information selection, applied in semantic analysis, the creation of semantic tools, digital data information retrieval, etc.

Active Publication Date: 2019-01-04
FUZHOU UNIV

AI Technical Summary

Problems solved by technology

[0007] Step S2: use a copy mechanism and a coverage mechanism to address out-of-vocabulary (unregistered) words and repetition among the generated summary sentences;
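As a rough illustration of this step (not the patent's exact formulation), the sketch below shows one common way a coverage vector and a pointer-style copy gate are combined in a decoder attention step: the coverage term penalizes positions that have already been attended to (reducing repetition), and the copy gate lets the model copy source tokens that are outside the vocabulary. The tensor shapes, layer names, and the coverage-loss form are assumptions for illustration only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CopyCoverageAttention(nn.Module):
    """Illustrative attention step with a coverage vector and a copy gate.

    Hypothetical shapes: enc_states (B, T, H), dec_state (B, H),
    dec_input (B, H), coverage (B, T). Not the patent's exact design.
    """
    def __init__(self, hidden_size):
        super().__init__()
        self.W_h = nn.Linear(hidden_size, hidden_size, bias=False)
        self.W_s = nn.Linear(hidden_size, hidden_size, bias=False)
        self.w_c = nn.Linear(1, hidden_size, bias=False)    # coverage feature
        self.v = nn.Linear(hidden_size, 1, bias=False)
        self.p_gen = nn.Linear(3 * hidden_size, 1)           # copy/generate gate

    def forward(self, enc_states, dec_state, dec_input, coverage):
        # Attention energies include the accumulated coverage so that
        # positions already attended to are penalized (less repetition).
        feat = (self.W_h(enc_states)
                + self.W_s(dec_state).unsqueeze(1)
                + self.w_c(coverage.unsqueeze(-1)))
        attn = F.softmax(self.v(torch.tanh(feat)).squeeze(-1), dim=-1)  # (B, T)

        # Coverage loss: overlap between current attention and past coverage.
        cov_loss = torch.sum(torch.min(attn, coverage), dim=-1).mean()
        coverage = coverage + attn                    # update coverage vector

        context = torch.bmm(attn.unsqueeze(1), enc_states).squeeze(1)   # (B, H)

        # Copy gate: probability of generating from the vocabulary versus
        # copying a source token (handles out-of-vocabulary words).
        p_gen = torch.sigmoid(
            self.p_gen(torch.cat([context, dec_state, dec_input], dim=-1)))
        return attn, context, p_gen, coverage, cov_loss
```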

Method used




Detailed Description of the Embodiments

[0080] The present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0081] It should be pointed out that the following detailed description is exemplary and is intended to provide further explanation of the present application. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.

[0082] It should be noted that the terminology used here is only for describing specific implementations and is not intended to limit the exemplary implementations according to the present application. As used herein, unless the context clearly dictates otherwise, the singular is intended to include the plural. It should also be understood that when the terms "comprising" and/or "including" are used in this specification, they indicate the presence of the stated features, steps, operations, devices, components and/or combinations thereof.



Abstract

The invention relates to a text summarization model generation algorithm that fuses information selection and semantic association. First, based on the encoder-decoder model, an attention mechanism is incorporated to obtain sufficient information from the input sequence. Second, a copy mechanism and a coverage mechanism are used to address out-of-vocabulary words and sentence repetition in summary generation. Then, a selective network is designed to encode the original text a second time so as to filter out redundant information. Finally, by comparing the semantic relevance between the original text and the summary, the generated summary can be corrected and its semantic relevance to the original text enhanced.
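As a hedged illustration of the last two ideas in the abstract, the sketch below re-encodes the source with a sigmoid selective gate (to filter redundant information) and scores semantic relevance as the cosine similarity between pooled source and summary representations. The gate form, the pooling choice, and the loss weighting are assumptions for illustration, not the patent's exact design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelectiveGate(nn.Module):
    """Second-pass encoding: a gate built from each encoder state and a
    sentence-level vector filters redundant source information.
    (Illustrative only; the gate form is an assumption.)"""
    def __init__(self, hidden_size):
        super().__init__()
        self.W = nn.Linear(hidden_size, hidden_size)
        self.U = nn.Linear(hidden_size, hidden_size)

    def forward(self, enc_states, sent_vec):
        # enc_states: (B, T, H); sent_vec: (B, H) summary of the whole source
        gate = torch.sigmoid(self.W(enc_states) + self.U(sent_vec).unsqueeze(1))
        return enc_states * gate        # filtered ("second") encoding


def semantic_relevance_loss(src_repr, summary_repr):
    """Encourage the summary representation to stay close to the source
    representation (cosine similarity is used here as an assumption)."""
    return 1.0 - F.cosine_similarity(src_repr, summary_repr, dim=-1).mean()


# Hypothetical usage with mean-pooled representations:
# filtered = SelectiveGate(H)(enc_states, enc_states.mean(dim=1))
# loss = nll_loss + lambda_cov * cov_loss \
#        + lambda_sem * semantic_relevance_loss(enc_states.mean(dim=1),
#                                               dec_states.mean(dim=1))
```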

Description

Technical Field

[0001] The invention relates to the field of information selection and semantic association, and in particular to a text summarization model generation algorithm that integrates information selection and semantic association.

Background Technique

[0002] The Seq2Seq model based on the encoder-decoder framework was proposed and popularized by Sutskever et al. in the field of machine translation in 2014. Its significance lies in learning features entirely from the data itself, and it achieves better results than other abstractive summarization methods. In a paper published in 2015, Rush et al. combined a neural language model with a context-based input encoder to propose a sentence summarization model based on the encoder-decoder framework, which, given an input sentence, generates the summary word by word. Lopyrev et al. used LSTMs in the encoder-decoder framework and applied an attention model to generate the headline...

Claims


Application Information

IPC(8): G06F16/335, G06F16/36, G06F17/27
CPC: G06F40/30
Inventor: 郭文忠, 陈立群, 郭昆, 陈羽中
Owner: FUZHOU UNIV