
Relational reasoning method, device and equipment based on deep neural network

A technology relating to deep neural networks and reasoning methods, applied to reasoning methods, neural learning methods, and biological neural network models. It addresses the problems of unreasonable use of input entity information and the resulting low accuracy of relational reasoning, achieving the effect of improved accuracy.

Active Publication Date: 2019-06-18
GUANGDONG UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0006] The purpose of this application is to provide a relational reasoning method, device, equipment, and computer-readable storage medium based on a deep neural network, to solve the problem that current deep-neural-network-based relational reasoning models do not make reasonable use of the input entity information, resulting in low accuracy of relational reasoning.



Examples


Embodiment 1

[0047] The following introduces the first embodiment of the relational reasoning method based on a deep neural network provided by this application; see Figure 1. The first embodiment includes:

[0048] Step S101: Obtain sample sentences, and construct a syntax dependency tree composed of multiple words from each sample sentence.

[0049] The above sample sentence may specifically be a sequence of words. In this embodiment, the sample sentence is parsed: it is first segmented into words, part-of-speech analysis is then performed on the segmentation result, and the dependency relationships between the segmented words are analyzed, finally yielding the word segmentation result, part-of-speech result, syntax analysis tree, and syntax dependency tree (that is, the above-mentioned syntax dependency tree) for the sample sentence. Specifically, an open-source parser, such as the Stanford parser, can be used to implement this process. ...
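The data structure produced by this step can be sketched as follows. Rather than invoking the Stanford parser itself, this illustration hard-codes a plausible parse of one English sentence; the `Word` class, tag names, and sentence are hypothetical placeholders, not part of the patent.

```python
# Sketch: representing the syntax dependency tree built in step S101.
# Each word carries its surface form, part-of-speech tag, head index,
# and dependency relation, mirroring the outputs described above.

class Word:
    def __init__(self, text, pos, head, deprel):
        self.text = text      # surface form (word segmentation result)
        self.pos = pos        # part-of-speech tag
        self.head = head      # index of the head word, -1 for the root
        self.deprel = deprel  # dependency relation to the head

def build_tree(words):
    """Return (root_index, {head_index: [child_indices]}) for the tree."""
    children = {i: [] for i in range(len(words))}
    root = None
    for i, w in enumerate(words):
        if w.head == -1:
            root = i
        else:
            children[w.head].append(i)
    return root, children

# Hand-annotated parse of "The cat chased the mouse" (hypothetical tags).
sentence = [
    Word("The", "DT", 1, "det"),
    Word("cat", "NN", 2, "nsubj"),
    Word("chased", "VBD", -1, "root"),
    Word("the", "DT", 4, "det"),
    Word("mouse", "NN", 2, "obj"),
]
root, children = build_tree(sentence)
print(root)            # index of the root verb
print(children[root])  # direct dependents of the root
```

In practice the head indices and relation labels would come from the parser's output rather than being written by hand.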

Embodiment 2

[0055] The second embodiment of the relational reasoning method based on a deep neural network provided by this application is introduced in detail below. The second embodiment is implemented on the basis of the above-mentioned first embodiment and extends it to a certain extent. Specifically, see Figure 2; the second embodiment includes:

[0056] Step S201: Obtain a sample sentence, analyze it with the Stanford syntax parser, and obtain a syntax dependency tree composed of multiple words.

[0057] This embodiment analyzes the sample sentence to determine whether its structure conforms to the given grammar, and analyzes the structure of the sentence and the relationships between the syntactic components at each level by constructing a syntax dependency tree, that is, determining which words of a sentence constitute a phrase and which words are the subject or object of the verb. Perform word segme...
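The abstract states that the main feature is extracted on the shortest dependency path between the two entity words. On a dependency tree that path can be found by treating the head links as undirected edges and running a breadth-first search; the sketch below does this over the same hypothetical head array used earlier, and is an illustration rather than the patent's exact procedure.

```python
# Sketch: shortest dependency path between two words of a parsed sentence,
# treating dependency arcs as undirected edges.
from collections import deque

def shortest_dep_path(heads, src, dst):
    """heads[i] = index of word i's head, -1 for the root.
    Returns the list of word indices on the path from src to dst."""
    # Build an undirected adjacency map from the head links.
    adj = {i: set() for i in range(len(heads))}
    for i, h in enumerate(heads):
        if h >= 0:
            adj[i].add(h)
            adj[h].add(i)
    # Breadth-first search from src; a tree has exactly one simple path.
    prev = {src: None}
    queue = deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            path = []
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nxt in adj[node]:
            if nxt not in prev:
                prev[nxt] = node
                queue.append(nxt)
    return None

# Heads for "The cat chased the mouse" (hypothetical parse).
heads = [1, 2, -1, 4, 2]
print(shortest_dep_path(heads, 1, 4))  # "cat" -> "chased" -> "mouse"
```

Words off this path would then supply the auxiliary feature on the non-shortest dependency paths.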

Implementation modes

[0087] As an optional implementation manner, the device further includes:

[0088] Location information adding module 904: used to add location information to each word in the syntax dependency tree according to the structure of the tree, where the location information includes first location information and second location information. The first location information is a path vector from the current word to the target word; the second location information is a binary array consisting of the length of the forward path and the length of the reverse path from the current word to the target word; and the target word is a pre-designated word among the two words awaiting relational reasoning.
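One way to compute both kinds of location information is via the lowest common ancestor in the dependency tree. The sketch below interprets "forward path length" as the number of steps up from the current word to the common ancestor and "reverse path length" as the number of steps down to the target word; this interpretation, and the helper names, are assumptions for illustration, since the patent text does not fix the exact encoding.

```python
# Sketch: computing the path vector and [forward, reverse] path lengths
# for a (current word, target word) pair in a dependency tree.

def ancestor_chain(heads, i):
    """Chain of word indices from word i up to the root (inclusive)."""
    chain = [i]
    while heads[chain[-1]] != -1:
        chain.append(heads[chain[-1]])
    return chain

def position_info(heads, current, target):
    """Return (path_vector, [forward_len, reverse_len]) for one pair."""
    up = ancestor_chain(heads, current)    # current -> root
    down = ancestor_chain(heads, target)   # target  -> root
    ancestors = set(down)
    # Walk up from `current` until hitting an ancestor of `target` (the LCA).
    forward = next(k for k, node in enumerate(up) if node in ancestors)
    lca = up[forward]
    reverse = down.index(lca)              # steps from the LCA down to target
    path = up[:forward + 1] + down[:reverse][::-1]
    return path, [forward, reverse]

heads = [1, 2, -1, 4, 2]                   # "The cat chased the mouse" (as above)
path, lengths = position_info(heads, 0, 4) # "The" relative to "mouse"
print(path, lengths)
```

In the model these index paths would typically be mapped to embedding vectors before being fed to the network.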

[0089] As an optional implementation manner, the relationship inference module 903 is specifically configured to:

[0090] The first adaptive weighting parameter and the second adaptive weighting parameter are set in advance for the main feature and the auxiliary feature, ...
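The adaptive weighting described above can be sketched as a weighted sum of the two feature vectors. In the real model the two parameters would be learned jointly with the network; here they are fixed scalars, and the feature values are made-up placeholders, purely for illustration.

```python
# Sketch: fusing the main feature (shortest-dependency-path) with the
# auxiliary feature using two adaptive weighting parameters.

def fuse(main_feat, aux_feat, w_main, w_aux):
    """Element-wise weighted sum of the two feature vectors."""
    assert len(main_feat) == len(aux_feat)
    return [w_main * m + w_aux * a for m, a in zip(main_feat, aux_feat)]

main_feat = [0.8, 0.1, 0.5]   # hypothetical main-feature vector
aux_feat  = [0.2, 0.6, 0.4]   # hypothetical auxiliary-feature vector
fused = fuse(main_feat, aux_feat, w_main=0.7, w_aux=0.3)
print(fused)
```

The relational reasoning result would then be obtained from `fused`, e.g. through a classification layer.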



Abstract

The invention discloses a relational reasoning method based on a deep neural network. The method comprises the steps of: after obtaining sample sentences, constructing a syntactic dependency tree consisting of a plurality of words; then respectively extracting a main feature of the syntactic dependency tree on the shortest dependency path and an auxiliary feature on a non-shortest dependency path; and finally carrying out feature fusion on the main feature and the auxiliary feature according to a preset fusion rule and obtaining a relational reasoning result according to the fusion result. Evidently, the method extracts and fuses the features of the syntactic dependency tree on the shortest dependency path and the non-shortest dependency path respectively; because the auxiliary features have a certain auxiliary effect on the reasoning result, effectively utilizing both the main and the auxiliary features of the syntactic dependency tree remarkably improves the accuracy of relational reasoning. In addition, the invention further provides a relational reasoning device and equipment based on the deep neural network and a computer-readable storage medium, whose effects correspond to those of the method.

Description

Technical field

[0001] This application relates to the field of computers, and in particular to a relational reasoning method, device, equipment, and computer-readable storage medium based on deep neural networks.

Background technique

[0002] The knowledge graph is an emerging technology. Search engines generally use it to enhance their knowledge databases. Its essence is a semantic network, a data structure generalized from graphs: a knowledge graph consists of nodes and edges, where nodes mainly represent entities or concepts and edges represent attributes and association relationships. The knowledge graph is a very effective way to represent relationships; it connects a huge variety of different kinds of information in a spatial structure diagram, forming a relationship network. Among these, the association between pieces of knowledge is very important information, and a knowledge association reasoning model is generally used to predict the relationship bet...

Claims


Application Information

IPC(8): G06F17/27, G06N5/04, G06N3/08
Inventor: 黄国恒, 卢增, 张凡龙, 程良伦
Owner GUANGDONG UNIV OF TECH