
A Text Summarization Model Generation Algorithm Fusing Information Selection and Semantic Association

A semantic association and information selection technology, applied in semantic analysis, the creation of semantic tools, digital data information retrieval, and similar fields, achieving the effect of improving the quality of the generated summaries.

Active Publication Date: 2021-11-30
FUZHOU UNIV

AI Technical Summary

Problems solved by technology

[0007] Step S2: use the copy mechanism and the coverage mechanism to solve the problem of unregistered (out-of-vocabulary) words and the problem of repeated clauses in the generated summary;
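The excerpt above only names these two mechanisms, so the following is a minimal sketch of how they are commonly realized in pointer-generator-style summarizers, assuming PyTorch tensors and an extended vocabulary that assigns temporary ids to source-only words; the function name, shapes, and exact formulas are illustrative assumptions, not the patent's disclosed equations.

```python
import torch

def copy_coverage_step(p_vocab, attn, coverage, p_gen, src_ext_ids, n_src_oov):
    """One decoding step of a pointer-generator-style copy mechanism with coverage.

    p_vocab     : (batch, vocab_size)  generator distribution over the fixed vocabulary
    attn        : (batch, src_len)     attention distribution over source positions
    coverage    : (batch, src_len)     sum of attention weights from all previous steps
    p_gen       : (batch, 1)           probability of generating vs. copying
    src_ext_ids : (batch, src_len)     source token ids in the extended vocabulary
    n_src_oov   : int                  number of source-only (unregistered) words
    """
    # Extend the generator distribution with zero slots for source-only OOV words,
    # so that copying can place probability mass on unregistered words.
    if n_src_oov > 0:
        pad = torch.zeros(p_vocab.size(0), n_src_oov, device=p_vocab.device)
        p_vocab = torch.cat([p_vocab, pad], dim=1)

    # Copy distribution: scatter attention mass onto the extended vocabulary.
    p_copy = torch.zeros_like(p_vocab).scatter_add(1, src_ext_ids, attn)
    p_final = p_gen * p_vocab + (1.0 - p_gen) * p_copy

    # Coverage loss: penalize re-attending to already-covered source positions,
    # which discourages repeated clauses in the generated summary.
    cov_loss = torch.sum(torch.min(attn, coverage), dim=1).mean()
    new_coverage = coverage + attn
    return p_final, cov_loss, new_coverage
```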




Detailed Description of the Embodiments

[0080] The present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0081] It should be pointed out that the following detailed description is exemplary and is intended to provide a further explanation of the present application. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.

[0082] It should be noted that the terminology used here is only for describing specific implementations and is not intended to limit the exemplary implementations according to the present application. As used herein, unless the context clearly dictates otherwise, the singular forms are intended to include the plural forms. It should also be understood that when the terms "comprising" and/or "including" are used in this specification, they indicate the presence of the stated features, steps, operations, devices, components and/or combinations thereof.



Abstract

The present invention relates to a text summary model generation algorithm that fuses information selection and semantic association. First, based on the Encoder-Decoder model, an attention mechanism is incorporated to obtain sufficient information from the input sequence; then a copy mechanism and a coverage mechanism are used to solve the problem of unregistered (out-of-vocabulary) words in the generated summary and the problem of repeated clauses; next, a selection network is designed, through which the original text is re-encoded to filter out redundant information; finally, by comparing the semantic correlation between the original text and the summary, the semantics of the summary are corrected and the semantic association between the summary and the original text is improved.
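The abstract does not disclose the concrete form of the selection network or of the semantic-association objective, so the sketch below is only one plausible reading under common assumptions: a sigmoid gate re-weights each encoder state to filter redundant information when the original text is re-encoded, and a cosine-similarity term between pooled representations of the original text and the generated summary is added to the training loss to strengthen their semantic association. All module names and dimensions here are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelectionGate(nn.Module):
    """Re-encode the source: a gate decides how much of each encoder state to keep."""
    def __init__(self, hidden_size):
        super().__init__()
        self.gate = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, enc_states, doc_repr):
        # enc_states: (batch, src_len, hidden)  per-token encoder states
        # doc_repr  : (batch, hidden)           whole-document representation
        expanded = doc_repr.unsqueeze(1).expand_as(enc_states)
        g = torch.sigmoid(self.gate(torch.cat([enc_states, expanded], dim=-1)))
        return g * enc_states  # filtered (re-encoded) source states


def semantic_association_loss(src_states, sum_states):
    """Encourage the summary to stay semantically close to the original text.

    Both inputs: (batch, len, hidden). Mean-pool each sequence and
    minimize 1 - cosine similarity between the pooled vectors."""
    src_vec = src_states.mean(dim=1)
    sum_vec = sum_states.mean(dim=1)
    cos = F.cosine_similarity(src_vec, sum_vec, dim=-1)
    return (1.0 - cos).mean()
```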

Description

Technical Field

[0001] The invention relates to the field of information selection and semantic association, and in particular to a text summarization model generation algorithm that integrates information selection and semantic association.

Background Art

[0002] The Seq2Seq model based on the encoder-decoder framework was proposed and popularized by Sutskever et al. in the field of machine translation in 2014. Its significance lies in learning features entirely from the data itself, and it can achieve better results than other abstractive summarization methods. In a paper published in 2015, Rush et al. combined a neural language model with a context-based input encoder to propose a sentence summarization model based on the encoder-decoder framework, which, given an input sentence, generates the words of the summary one by one. Lopyrev et al. used LSTMs as the encoder and decoder and applied an attention model to generate headlines...
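As background for the attention-based encoder-decoder framework referenced above, a minimal additive (Bahdanau-style) attention module might look as follows; this is a generic illustration of the prior-art framework, not the patent's specific model, and all names and sizes are assumptions.

```python
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    """Bahdanau-style attention used in attention-based encoder-decoder summarizers."""
    def __init__(self, enc_size, dec_size, attn_size):
        super().__init__()
        self.W_h = nn.Linear(enc_size, attn_size, bias=False)   # projects encoder states
        self.W_s = nn.Linear(dec_size, attn_size, bias=False)   # projects decoder state
        self.v = nn.Linear(attn_size, 1, bias=False)             # scores each position

    def forward(self, enc_states, dec_state):
        # enc_states: (batch, src_len, enc_size), dec_state: (batch, dec_size)
        scores = self.v(torch.tanh(self.W_h(enc_states) +
                                   self.W_s(dec_state).unsqueeze(1))).squeeze(-1)
        attn = torch.softmax(scores, dim=-1)                      # (batch, src_len)
        context = torch.bmm(attn.unsqueeze(1), enc_states).squeeze(1)
        return attn, context                                      # context feeds the decoder
```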

Claims


Application Information

Patent Type & Authority Patents(China)
IPC IPC(8): G06F16/335G06F16/36G06F40/30
CPCG06F40/30
Inventor 郭文忠陈立群郭昆陈羽中
Owner FUZHOU UNIV