
Non-autoregression sentence sorting method

A sorting method using non-autoregressive technology, applied in the field of sentence sorting, addressing problems such as high algorithm complexity, high overhead, and the inability to perform prediction in parallel

Active Publication Date: 2021-09-10
UNIV OF ELECTRONICS SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

However, it has two deficiencies. (1) The existing sentence sorting method uses a pointer network to recursively predict the order of each sentence step by step. This autoregressive approach is inefficient: its algorithmic complexity is high, and prediction cannot be carried out in parallel...

Method used




Embodiment Construction

[0044] Specific embodiments of the present invention will be described below in conjunction with the accompanying drawings, so that those skilled in the art can better understand the present invention. It should be noted that in the following description, where detailed descriptions of known functions and designs would dilute the main content of the present invention, those descriptions are omitted.

[0045] The existing sentence sorting method uses a Bi-LSTM to extract the basic sentence feature vector during encoding, uses the self-attention mechanism to extract sentence features combined with the context of the paragraph, and then obtains the paragraph feature through an average pooling operation. Note that a Transformer variant with the position encoding removed is used here. During decoding, the pointer network architecture serves as the decoder, which is composed of LSTM units. The basic sentence feature vector is used as the input of the d...
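The step-by-step pointer-network decoding described above can be sketched as follows. This is a minimal, framework-free illustration: `score_fn` and the way the decoder state is updated are hypothetical simplifications standing in for the LSTM-based decoder, not the patent's actual implementation. It shows why the approach is sequential: each of the n steps must wait for the previous choice, and each step scores all n sentences, giving O(n^2) work with no parallelism across positions.

```python
import numpy as np

def pointer_decode(sent_feats, score_fn):
    """Greedy autoregressive pointer decoding over n sentences.

    sent_feats: (n, d) array of basic sentence feature vectors.
    score_fn:   scores a (decoder state, sentence feature) pair.
    Returns the predicted order as a list of sentence indices.
    """
    n = sent_feats.shape[0]
    order = []
    placed = np.zeros(n, dtype=bool)      # sentences already assigned
    state = sent_feats.mean(axis=0)       # toy decoder state (stands in for an LSTM)
    for _ in range(n):                    # n sequential steps -- cannot parallelize
        scores = np.array([score_fn(state, sent_feats[i]) for i in range(n)])
        scores[placed] = -np.inf          # a sentence may be picked only once
        i = int(np.argmax(scores))
        order.append(i)
        placed[i] = True
        state = sent_feats[i]             # next step depends on this choice
    return order
```

Because `state` is updated from the sentence just chosen, an early mistake changes every later score: this is the error-accumulation problem the invention targets.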



Abstract

In order to realize parallel prediction, improve prediction efficiency, eliminate error accumulation, and improve performance on the sentence sorting task, and in view of the fact that the length of the generated target in sentence sorting is known in advance and that sentences and positions match one-to-one, the invention designs a non-autoregressive sentence sorting method. The method adopts a non-autoregressive decoder and fully utilizes the contextual sentence features obtained by a Transformer variant structure, so that the sentence at every position is predicted in parallel. This effectively avoids the low efficiency and error accumulation caused by a recurrent neural network decoder predicting the sentence sequence step by step.
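The non-autoregressive idea in the abstract can be sketched as follows, under two stated assumptions: the model emits an n-by-n position/sentence score matrix in a single forward pass (no recursion between positions), and the one-to-one matching between sentences and positions is recovered by a brute-force optimal assignment, which here is a stand-in for a proper assignment algorithm (e.g. Hungarian matching), not the patent's decoding procedure.

```python
import itertools
import numpy as np

def parallel_sort(score):
    """Non-autoregressive decoding from a one-shot score matrix.

    score[p, s]: predicted compatibility of sentence s with position p,
    produced for ALL positions simultaneously in one forward pass.
    Exploits the fact that output length = input length and that the
    sentence-position matching is one-to-one.
    Brute force over permutations (fine for small n; a real system
    would use the Hungarian algorithm).
    """
    n = score.shape[0]
    best_total, best_perm = -np.inf, None
    for perm in itertools.permutations(range(n)):
        total = sum(score[p, s] for p, s in enumerate(perm))
        if total > best_total:
            best_total, best_perm = total, perm
    return list(best_perm)
```

No position's prediction feeds into another's, so the expensive part (producing `score`) parallelizes across positions and no early mistake propagates forward.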

Description

technical field [0001] The invention belongs to the technical field of sentence sorting, and more specifically relates to a non-autoregressive sentence sorting method. Background technique [0002] Sentence sorting is one of the basic and common tasks in modeling document coherence; its goal is to reorganize a set of sentences into a coherent piece of text. [0003] Existing sentence sorting methods usually employ an encoder-decoder architecture and use pointer networks for sequence prediction. Since the sentences in the input paragraph are unordered, a recurrent-neural-network encoder that maps all sentence representations to a paragraph feature vector will capture wrong semantic logic between the sentences, thus misleading the decoder into predicting incoherent paragraphs: different arrangements of the same paragraph may yield different paragraph representations and therefore different output sentence orders. [0004] Inspired b...
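The order-sensitivity problem in [0003] is exactly what removing the position encoding fixes: self-attention without positional information treats the input as a set, so the pooled paragraph feature is identical for every arrangement of the same sentences. A minimal numerical check (a single toy attention layer plus mean pooling, standing in for the Transformer-variant encoder, not the patent's architecture):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def paragraph_feature(S):
    """One self-attention layer with NO position encoding, then mean pooling.

    S: (n, d) matrix of sentence feature vectors.
    Permuting the rows of S permutes the attention output rows the same
    way, and mean pooling erases that permutation, so the paragraph
    feature is order-invariant.
    """
    A = softmax(S @ S.T)          # attention weights over sentences only
    return (A @ S).mean(axis=0)   # order-independent paragraph feature

rng = np.random.default_rng(0)
S = rng.normal(size=(4, 8))
perm = rng.permutation(4)
# same paragraph feature regardless of the input sentence order
assert np.allclose(paragraph_feature(S), paragraph_feature(S[perm]))
```

With a recurrent encoder the analogous check fails, which is the misleading-paragraph-representation problem described above.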

Claims


Application Information

IPC(8): G06F40/211; G06F40/126; G06N3/04; G06N3/08
CPC: G06F40/211; G06F40/126; G06N3/04; G06N3/08
Inventor: 杨阳, 史文浩, 宾燚, 丁玉娟
Owner UNIV OF ELECTRONICS SCI & TECH OF CHINA