Semantic similarity calculation method based on deep learning

A semantic similarity calculation technology based on deep learning, applied in the field of semantic similarity calculation. It addresses problems such as incomplete feature extraction by existing models, low accuracy of similarity calculation, and shallow network layers, achieving the effects of overcoming gradient vanishing, enriching feature semantic information, and enhancing feature extraction capability.

Active Publication Date: 2019-10-18
UNIV OF ELECTRONICS SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to provide a semantic similarity calculation method based on deep learning, which solves the problem that the imperfect feature extraction and shallow network layers of existing models lead to low accuracy of similarity calculation.

Method used



Examples


Embodiment 1

[0048] As shown in Figures 1-5, the present invention includes four steps: training data set construction, network model construction, model training, and model prediction. The construction of the training data set and of the network model serves as the basis for model training; after the model is trained, the trained model is used to calculate semantic similarity.

[0049] 1.1 Manually construct the training data set. Each piece of data in the data set follows a uniform format; in this application the format is "text 1  text 2  label", i.e. each piece of data consists of two texts, "text 1" and "text 2", and a label. A data example is: "I want to modify the bound mobile phone number   How should I modify the bound mobile phone number   1". "Text 1", "Text 2" and "Label" in each piece of data are separated by a tab character. If the label is 1, the two texts are similar texts; if the label is 0, the piece of data is non-similar...
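As a rough illustration of step 1.1, the sketch below loads such a tab-separated data set and turns each text into the one-hot sparse representation mentioned in the abstract. The file handling, character-level tokenization, and helper names are illustrative assumptions, not details taken from the patent.

```python
# Minimal sketch (not the patent's implementation): read "text1<TAB>text2<TAB>label"
# lines and build one-hot sparse vectors stored as lists of vocabulary indices.

def load_pairs(path):
    """Read tab-separated lines of the form 'text1\ttext2\tlabel'."""
    pairs = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip("\n").split("\t")
            if len(parts) != 3:
                continue  # skip malformed lines
            text1, text2, label = parts
            pairs.append((text1, text2, int(label)))  # 1 = similar, 0 = non-similar
    return pairs

def build_vocab(pairs):
    """Assign an index to every token seen in the training texts (character-level here)."""
    vocab = {}
    for text1, text2, _ in pairs:
        for token in text1 + text2:
            vocab.setdefault(token, len(vocab))
    return vocab

def to_sparse_one_hot(text, vocab):
    """Represent a text as a sequence of one-hot vectors, stored sparsely as indices."""
    return [vocab[token] for token in text if token in vocab]
```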


Abstract

The invention discloses a semantic similarity calculation method based on deep learning, and relates to the field of semantic similarity calculation. The method comprises the following steps: step 1, constructing a training data set and preprocessing the training data to obtain one-hot sparse vectors; step 2, constructing a semantic similarity calculation network model comprising N layers of BI-LSTM networks, a residual network, a similarity matrix, a CNN convolutional neural network, a pooling layer and a fully connected layer; step 3, inputting the one-hot sparse vectors into the network model and training the parameters with the training data set to complete supervised training; and step 4, inputting the texts to be tested into the trained network model, judging whether they are similar texts, and outputting the result. The semantic similarity calculation network model comprises a multi-layer BI-LSTM network, a residual network, a CNN convolutional neural network, a pooling layer and a fully connected layer. A BI-LSTM network and a CNN convolutional neural network are used together, and a residual network is added to the BI-LSTM network, so that the gradient-vanishing problem caused by a multi-layer network is overcome and the feature extraction capability of the model is enhanced.
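To make the described architecture concrete, here is a minimal PyTorch-style sketch, not the patent's actual implementation: the embedding size, hidden size, number of Bi-LSTM layers, and the convolution and pooling parameters are all illustrative assumptions.

```python
# Hypothetical sketch of the described network: stacked Bi-LSTM layers with
# residual connections, a similarity matrix between the two encoded texts,
# a CNN over that matrix, pooling, and a fully connected classifier.
import torch
import torch.nn as nn

class SimilarityNet(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=64, n_layers=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Project embeddings to 2*hidden_dim so the residual addition matches
        # the Bi-LSTM output size.
        self.input_proj = nn.Linear(embed_dim, 2 * hidden_dim)
        self.bilstms = nn.ModuleList(
            [nn.LSTM(2 * hidden_dim, hidden_dim, batch_first=True, bidirectional=True)
             for _ in range(n_layers)]
        )
        self.conv = nn.Conv2d(1, 16, kernel_size=3, padding=1)
        self.pool = nn.AdaptiveMaxPool2d((8, 8))
        self.fc = nn.Linear(16 * 8 * 8, 2)  # two classes: non-similar / similar

    def encode(self, tokens):
        x = self.input_proj(self.embed(tokens))   # (batch, seq_len, 2*hidden_dim)
        for lstm in self.bilstms:
            out, _ = lstm(x)
            x = x + out                            # residual connection across layers
        return x

    def forward(self, tokens1, tokens2):
        h1 = self.encode(tokens1)                  # (batch, len1, 2*hidden_dim)
        h2 = self.encode(tokens2)                  # (batch, len2, 2*hidden_dim)
        sim = torch.bmm(h1, h2.transpose(1, 2))    # similarity matrix (batch, len1, len2)
        feat = self.pool(torch.relu(self.conv(sim.unsqueeze(1))))
        return self.fc(feat.flatten(1))            # logits for the two classes
```

Training (step 3) would then minimize a cross-entropy loss over these logits on the labeled pairs, and prediction (step 4) would take the argmax to decide whether two input texts are similar.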

Description

Technical field
[0001] The invention relates to the field of semantic similarity calculation, and in particular to a deep learning-based semantic similarity calculation method.
Background technique
[0002] Semantic similarity calculation is a basic task in the field of natural language processing. With the advent of the era of artificial intelligence, more and more scientists and scholars are focusing on the field of natural language processing. Because semantic similarity calculation is widely used in fields such as document copy checking, information retrieval, and machine translation, more and more researchers are devoting themselves to its study. In recent years, with the rise of deep learning technology, semantic similarity calculation has also developed by leaps and bounds. Compared with traditional methods, deep learning technology can extract deep semantics and obtain richer feature expressions...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F17/27; G06K9/62
CPC: G06F40/242; G06F40/289; G06F40/30; G06F18/22
Inventors: 罗光春, 秦科, 惠孛, 刘贵松, 黄为
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA