
Method and system for optimizing machine reading comprehension ability based on hierarchical attention mechanism

A reading comprehension and attention technology, applied in the fields of instruments, electronic digital data processing, and special data processing applications. It addresses problems such as incomplete extraction of paragraph-text interaction information, insufficient textual, semantic, and lexical information, and low answer prediction accuracy, achieving the effects of low cost and a simple implementation process.

Active Publication Date: 2021-06-15
HUNAN UNIV

AI Technical Summary

Problems solved by technology

[0005] In view of the above defects or improvement needs of the prior art, the present invention provides a method and system for optimizing machine reading comprehension capability based on a hierarchical attention mechanism, which aims to solve the technical problems of existing models providing insufficient textual, semantic, and lexical information, and of low answer prediction accuracy caused by the inability to fully extract the interaction information between the paragraph and the text.




Embodiment Construction

[0055] To make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit it. In addition, the technical features involved in the various embodiments described below can be combined with each other as long as they do not conflict.

[0056] The main idea of the technology of the present invention is to use multi-granularity word vector technology to enrich the input features and input text information of the machine reading comprehension model, to use the attention mechanism so that the model computes representations containing various kinds of semantic and grammatical information, and to further learn from these representations to select important ...
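The paragraph above describes combining several word-level input features before the attention layers. The following is a minimal sketch in PyTorch of one common way to build such multi-granularity inputs: a trainable word embedding, a frozen pre-trained embedding, and an external knowledge-base (e.g. NELL-style entity) embedding concatenated per token. All module names, dimensions, and the lookup scheme are illustrative assumptions, not the patent's actual implementation.

```python
import torch
import torch.nn as nn

class MultiGranularityEmbedding(nn.Module):
    """Concatenates trainable, pre-trained, and knowledge-base embeddings per token."""

    def __init__(self, vocab_size, entity_vocab_size,
                 word_dim=100, pretrained_dim=300, entity_dim=50,
                 pretrained_weights=None):
        super().__init__()
        # Regular (trainable) word vectors.
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        # Pre-trained word vectors, kept frozen during training.
        self.pretrained_emb = nn.Embedding(vocab_size, pretrained_dim)
        if pretrained_weights is not None:
            self.pretrained_emb.weight.data.copy_(pretrained_weights)
        self.pretrained_emb.weight.requires_grad = False
        # Knowledge-base entity vectors (index 0 reserved for "no entity").
        self.entity_emb = nn.Embedding(entity_vocab_size, entity_dim, padding_idx=0)

    def forward(self, word_ids, entity_ids):
        # word_ids, entity_ids: (batch, seq_len)
        feats = [self.word_emb(word_ids),
                 self.pretrained_emb(word_ids),
                 self.entity_emb(entity_ids)]
        # Output: (batch, seq_len, word_dim + pretrained_dim + entity_dim)
        return torch.cat(feats, dim=-1)
```

Concatenation keeps each feature source intact so the downstream attention layers can weigh lexical, distributional, and knowledge-base information separately.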



Abstract

The invention discloses a method for optimizing a machine's reading comprehension ability based on a hierarchical attention mechanism. The method uses multi-granularity input features to extract text information, then applies an attention mechanism so that the extracted text information interacts to produce context-aware feature representations, and finally uses these new feature representations to derive answers to the related questions. In the feature input stage, not only the regular vector representation of each word is input, but also the word vector from a pre-trained model; in addition, using a knowledge-graph method, word vectors from the external named-entity knowledge base NELL are input as well. In the information interaction stage, several attention mechanisms are used: a gated paragraph-to-question attention mechanism produces question-aware text word representations, and a linear self-matching mechanism then lets each question-aware text word further perceive the text information before and after it, after which the answer is generated.
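As a rough illustration of the two interaction layers named in the abstract, the sketch below (PyTorch) shows a gated paragraph-to-question attention followed by a self-matching step in which each passage position attends over the whole passage. The scoring functions, gating, and dimensions are common formulations chosen for illustration; the patent's exact equations may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedParagraphQuestionAttention(nn.Module):
    """Attends from passage words to question words and gates the fused result."""

    def __init__(self, hidden_dim):
        super().__init__()
        self.score = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.gate = nn.Linear(2 * hidden_dim, 2 * hidden_dim)

    def forward(self, passage, question):
        # passage: (batch, p_len, h), question: (batch, q_len, h)
        scores = torch.bmm(self.score(passage), question.transpose(1, 2))  # (batch, p_len, q_len)
        attn = F.softmax(scores, dim=-1)
        q_aware = torch.bmm(attn, question)            # question-aware passage words
        fused = torch.cat([passage, q_aware], dim=-1)  # (batch, p_len, 2h)
        gate = torch.sigmoid(self.gate(fused))         # element-wise gate over the fused features
        return gate * fused

class LinearSelfMatching(nn.Module):
    """Lets each passage position attend over the full passage (self-matching)."""

    def __init__(self, input_dim):
        super().__init__()
        self.score = nn.Linear(input_dim, input_dim, bias=False)

    def forward(self, x, mask=None):
        # x: (batch, p_len, d); mask: (batch, p_len) boolean, True for real tokens
        scores = torch.bmm(self.score(x), x.transpose(1, 2))  # (batch, p_len, p_len)
        if mask is not None:
            scores = scores.masked_fill(~mask.unsqueeze(1), float('-inf'))
        attn = F.softmax(scores, dim=-1)
        return torch.cat([x, torch.bmm(attn, x)], dim=-1)     # (batch, p_len, 2d)
```

The gate suppresses passage words that are irrelevant to the question, and the self-matching layer gives every question-aware word access to context before and after it, matching the two-stage interaction the abstract describes.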

Description

Technical field

[0001] The invention belongs to the technical field of natural language processing, and more specifically relates to a method and system for optimizing machine reading comprehension capability based on a hierarchical attention mechanism.

Background technique

[0002] With the emergence of large-scale, high-quality data sets and the continuous improvement of modern computing power, machine reading comprehension technology has developed rapidly, and in many domains its accuracy has been verified to exceed that of humans. Today, machine reading comprehension technology is close to maturity and has been widely applied in engineering practice, for example in Taobao's 24-hour machine customer service and China Mobile's intelligent telephone customer service; it can replace manual labor, work around the clock, and answer common questions.

[0003] Existing machine reading comprehension models usually use single word v...


Application Information

Patent Type & Authority Patents(China)
IPC IPC(8): G06F16/33G06F16/332G06F16/35G06F40/30
CPCG06F16/3329G06F16/3344G06F16/35
Inventor 吴帆黄小青李肯立
Owner HUNAN UNIV