Chinese named entity recognition method based on multilevel residual convolution and attention mechanism

A named entity recognition technology using an attention mechanism, applied to neural learning methods and methods based on specific mathematical models, achieving the effects of improved entity recognition speed, high efficiency, and reduced overfitting.

Pending Publication Date: 2021-06-08
JIANGNAN UNIV

AI Technical Summary

Problems solved by technology

Although the recurrent neural network can make full use of historical information and future information to process current information, it processes the sequence serially, which limits model efficiency, and its gradient vanishing and gradient explosion problems prevent it from effectively capturing global context information.



Embodiment Construction

[0023] The technical solutions of the present invention will be further described below with reference to the embodiments and the accompanying drawings.

[0024] Figure 2 shows the algorithm model diagram of the present invention. The model includes five key parts: data augmentation, a multi-modal vector layer, multi-level residual convolution, an attention mechanism, and a conditional random field. To better illustrate the present invention, the public Chinese named entity recognition dataset Resume is used as an example below.
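To make the wiring of these five parts concrete, the following is a minimal sketch in PyTorch. It rests on assumptions not spelled out in this paragraph: the multi-modal vector layer is reduced to a single character embedding, three convolution levels with different kernel sizes stand in for the multi-level residual convolution, and the sketch stops at per-tag emission scores (a conditional random field layer would be applied on top of them). All class, function, and parameter names here are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResidualConvBlock(nn.Module):
    """One level of convolution over the character sequence with a residual connection."""
    def __init__(self, dim, kernel_size):
        super().__init__()
        self.conv = nn.Conv1d(dim, dim, kernel_size, padding=kernel_size // 2)

    def forward(self, x):                                      # x: (batch, seq_len, dim)
        h = self.conv(x.transpose(1, 2)).transpose(1, 2)       # convolve along the sequence axis
        return F.relu(h) + x                                   # residual connection

class NERModelSketch(nn.Module):
    """Hypothetical wiring of the five parts described in [0024] (CRF omitted)."""
    def __init__(self, vocab_size, dim, num_tags, kernel_sizes=(1, 3, 5)):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)             # stands in for the multi-modal vector layer
        self.levels = nn.ModuleList([ResidualConvBlock(dim, k) for k in kernel_sizes])
        self.attn_score = nn.Linear(dim, 1)                    # scores each character for the attention weights
        self.emit = nn.Linear(2 * dim, num_tags)               # per-tag emission scores; a CRF would follow

    def forward(self, char_ids):                               # char_ids: (batch, seq_len)
        x = self.embed(char_ids)
        for level in self.levels:                              # multi-level residual convolution:
            x = level(x)                                       # each level widens the local context range
        weights = torch.softmax(self.attn_score(x), dim=1)     # importance weight of each character
        sentence = (weights * x).sum(dim=1, keepdim=True)      # global sentence representation
        fused = torch.cat([x, sentence.expand_as(x)], dim=-1)  # local + global information per character
        return self.emit(fused)                                # (batch, seq_len, num_tags)
```

In the full method, the data augmentation of step 1 is applied before training, and the conditional random field turns these emission scores into a tag sequence by also learning tag transition probabilities.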

[0025] The data augmentation algorithm in step 1 of the above technical solution is as follows:

[0026] Entities of the same type are swapped between training set samples to generate new samples. The original training set and the newly generated samples are then merged to form an expanded training set, thereby increasing the amount of data. For example, there are two samples in the training set that contain "National People's Congress re...
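As an illustration of this augmentation step only, the sketch below swaps one same-type entity between two BIO-tagged samples. The BIO tagging scheme, the sample layout (a pair of character and tag lists), and all function names are assumptions made for the example; the text above only specifies that entities of the same type are exchanged and that the result is merged with the original training set.

```python
import random
from copy import deepcopy

def find_entities(tags):
    """Return (start, end, type) spans from a BIO tag sequence."""
    spans, start, etype = [], None, None
    for i, tag in enumerate(tags + ["O"]):                 # sentinel to flush the last span
        if tag.startswith("B-"):
            if start is not None:
                spans.append((start, i, etype))
            start, etype = i, tag[2:]
        elif not tag.startswith("I-") and start is not None:
            spans.append((start, i, etype))
            start = None
    return spans

def swap_entities(sample_a, sample_b, entity_type, rng=random):
    """Swap one randomly chosen entity of `entity_type` between two samples."""
    (chars_a, tags_a), (chars_b, tags_b) = deepcopy(sample_a), deepcopy(sample_b)
    spans_a = [s for s in find_entities(tags_a) if s[2] == entity_type]
    spans_b = [s for s in find_entities(tags_b) if s[2] == entity_type]
    if not spans_a or not spans_b:
        return None                                        # nothing of this type to swap
    sa, ea, _ = rng.choice(spans_a)
    sb, eb, _ = rng.choice(spans_b)
    new_a = (chars_a[:sa] + chars_b[sb:eb] + chars_a[ea:],
             tags_a[:sa] + tags_b[sb:eb] + tags_a[ea:])
    new_b = (chars_b[:sb] + chars_a[sa:ea] + chars_b[eb:],
             tags_b[:sb] + tags_a[sa:ea] + tags_b[eb:])
    return new_a, new_b                                    # appended to the original training set
```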


Abstract

The invention discloses a Chinese named entity recognition method based on multi-level residual convolution and an attention mechanism, and belongs to the field of natural language processing. The method adopts a multi-level residual convolutional network combined with an attention mechanism. To address the low efficiency of traditional recurrent neural networks when processing sequence information, multi-level residual convolution is introduced to capture local context information over different ranges, making full use of the parallel computing power of hardware and significantly improving model efficiency. In addition, recurrent neural networks cannot effectively acquire global context information because of gradient vanishing and gradient explosion, which greatly degrades network performance. The method therefore introduces an attention mechanism into the network: the importance weight of each character is computed by modelling the relationship between each character and the sentence, so that global information is learned. Finally, a conditional random field computes the transition probabilities of the character tags to obtain a reasonable prediction result, further improving the robustness of the named entity recognition model.
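For reference, the conditional random field step at the end of this pipeline is the standard linear-chain formulation; the equations below show the conventional scoring and decoding (generic CRF equations, not a formulation specific to this patent), with $P_{i,y_i}$ the emission score of tag $y_i$ for the $i$-th character and $A$ the learned tag transition matrix.

```latex
% Linear-chain CRF: sequence score, normalized probability, and decoding
s(\mathbf{x}, \mathbf{y}) = \sum_{i=1}^{n} P_{i, y_i} + \sum_{i=1}^{n-1} A_{y_i, y_{i+1}}, \qquad
p(\mathbf{y} \mid \mathbf{x}) = \frac{\exp\!\bigl(s(\mathbf{x}, \mathbf{y})\bigr)}
                                     {\sum_{\mathbf{y}'} \exp\!\bigl(s(\mathbf{x}, \mathbf{y}')\bigr)}, \qquad
\hat{\mathbf{y}} = \arg\max_{\mathbf{y}} s(\mathbf{x}, \mathbf{y})
```

At prediction time the arg max is typically computed with the Viterbi algorithm.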

Description

Technical field

[0001] The invention belongs to the field of natural language processing, and in particular relates to a Chinese named entity recognition method based on multi-level residual convolution and an attention mechanism.

Background technique

[0002] Named entity recognition has always been a focus of natural language processing research, and its main goal is to identify entities such as person names, place names, and organization names from text. As a basic task in NLP (Natural Language Processing), named entity recognition plays an important role in tasks such as automatic question answering and relation extraction. At present, Chinese named entity recognition methods are mainly divided into word-based and character-based methods. Since entities mostly appear in the form of words, word-based methods can make full use of word information for entity recognition, but words need to be obtained from sentences through word segmentation, and the ...


Application Information

IPC(8): G06F40/295; G06F40/30; G06N3/08; G06N7/00
CPC: G06F40/295; G06F40/30; G06N3/08; G06N7/01
Inventors: 孔军, 张磊鑫, 蒋敏
Owner: JIANGNAN UNIV