
Word sense disambiguation method based on hidden Markov model

A technology combining hidden Markov models with word sense disambiguation, applied in semantic analysis, character and pattern recognition, special data processing applications, etc. It addresses the problems of the large amount of manpower and material resources required for corpus construction, sparse model parameter data, and insufficient support for large numbers of disambiguation tasks, achieving the effect of improved disambiguation accuracy.

Inactive Publication Date: 2018-05-25
FOCUS TECH

AI Technical Summary

Problems solved by technology

[0006] Training a word sense disambiguation model requires a large amount of sense-tagged corpus data, but building a sense-tagged corpus demands substantial manpower and material resources. Because corpus data are scarce, the model parameters are sparse, leaving this disambiguation method unable to support large-scale disambiguation tasks.


Embodiment Construction

[0023] Embodiments of the present invention are described in detail below. The implementations given here are only examples; equivalent changes made on the basis of the technical essence of the present invention still fall within the protection scope of the present invention.

[0024] Framework of the prediction problem: given a hidden Markov model λ = (A, B, π), where A is the semantic class transition probability matrix, B is the observation (vocabulary) probability matrix, and π is the initial state distribution, together with an observation sequence O = (o1, o2, ..., on), find the hidden state sequence Q that maximizes the conditional probability P(Q|O); this hidden sequence is the semantic sequence. The probabilities are obtained through corpus training, and the hidden sequence is generally solved with the Viterbi algorithm, that is, dynamic programming applied to the hidden Markov model prediction problem.
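The Viterbi decoding described above can be sketched as follows. This is a generic illustration of the algorithm, not code from the patent; the function name and index-based observation encoding are assumptions:

```python
import numpy as np

def viterbi(pi, A, B, obs):
    """Most probable hidden state sequence (semantic classes) for an
    observation sequence (words), given HMM parameters lambda = (A, B, pi).

    pi  : (N,)   initial state probabilities
    A   : (N, N) semantic class transition probabilities
    B   : (N, M) observation (word emission) probabilities
    obs : list of observation indices o_1 .. o_n
    """
    N, T = len(pi), len(obs)
    delta = np.zeros((T, N))            # best path probability ending in state j at step t
    psi = np.zeros((T, N), dtype=int)   # backpointers for path recovery
    delta[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1, :, None] * A   # (N, N): score of moving i -> j
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) * B[:, obs[t]]
    # backtrack from the best final state
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return path[::-1]
```

The dynamic-programming table `delta` realizes the maximization of P(Q|O): each step keeps only the best-scoring predecessor per state, so the full exponential search over state sequences is avoided.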

[0025] Step 1 uses the (Harbin Institute of Technology) artificial semantic annotatio...


Abstract

A word sense disambiguation method based on a hidden Markov model includes:
  • Step 1) Training corpus: use the SemEval-2007 #task5 test corpus set to parse the sentence to be disambiguated, and perform word segmentation on the sentence.
  • Step 2) Find the ambiguous words in the segmented sentence and extract the target ambiguous word together with the segmented words to its left and right; train on the corpora to calculate the semantic-class vocabulary transition probability and the semantic class transition probability.
  • Step 3) Extract the number of sentences containing the ambiguous word from the manually annotated corpora and calculate the observation probability, including the observation probabilities of the words to the left and right of the ambiguous word.
  • Step 4) Take the initial state probability, the observation probability, and the state transition probability trained from the corpus as the parameters of the hidden Markov model, and use the constructed disambiguation model to disambiguate the sentences in the test corpus.
  • Step 5) Verify the accuracy of the disambiguation results with a similarity calculation method.
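Steps 2) and 3) amount to estimating the HMM parameters from counts over a sense-tagged corpus. A minimal sketch of such maximum-likelihood estimation follows; the function name and the corpus representation (lists of (word, sense) pairs) are illustrative assumptions, not the patent's actual data format:

```python
from collections import Counter, defaultdict

def estimate_hmm(tagged_sentences):
    """Maximum-likelihood HMM parameters from a sense-tagged corpus.

    tagged_sentences: list of sentences, each a list of (word, sense) pairs.
    Returns (pi, A, B) as dicts of probabilities:
      pi[s]     initial probability of sense s
      A[s1][s2] transition probability from sense s1 to s2
      B[s][w]   probability of observing word w under sense s
    """
    init = Counter()              # counts of sentence-initial senses
    trans = defaultdict(Counter)  # trans[s1][s2]: s1 -> s2 transition counts
    emit = defaultdict(Counter)   # emit[s][w]: word w tagged with sense s

    for sent in tagged_sentences:
        senses = [s for _, s in sent]
        init[senses[0]] += 1
        for word, sense in sent:
            emit[sense][word] += 1
        for s1, s2 in zip(senses, senses[1:]):
            trans[s1][s2] += 1

    def normalize(counter):
        total = sum(counter.values())
        return {k: v / total for k, v in counter.items()}

    pi = normalize(init)
    A = {s: normalize(c) for s, c in trans.items()}
    B = {s: normalize(c) for s, c in emit.items()}
    return pi, A, B
```

With sparse corpora (the problem noted in [0006]), these raw relative frequencies would normally be smoothed; smoothing is omitted here for clarity.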

Description

technical field

[0001] The present invention relates to a word sense disambiguation method based on dictionaries and machine learning theory. The method applies well to natural language processing problems such as machine translation, information extraction, speech recognition, and syntactic analysis, and has strong extensibility and flexibility.

Background technique

[0002] Natural language contains a large number of polysemous words, whose meanings differ across contexts. Determining which sense of each ambiguous word is the correct meaning in a given context is the key problem that word sense disambiguation solves. Generally, when word sense disambiguation is performed, if the candidate senses of the ambiguous word have different parts of speech, the correct sense can be selected during the part-of-speech tagging stage. At present, word s...


Application Information

Patent Timeline: no application
Patent Type & Authority: Application (China)
IPC (8): G06F17/27, G06K9/62
CPC: G06F40/289, G06F40/30, G06F18/295
Inventor: 陈宏, 王宇轩
Owner: FOCUS TECH