
Medical application model training method and device based on BERT model

A model-training technology for medical applications, applied in the fields of instruments, electrical digital data processing, and computing. It addresses problems such as poor applicability, low fault tolerance, and mediocre performance, with the effect of enhancing the model's overall semantic representation ability and improving its semantic comprehension.

Pending Publication Date: 2021-02-09
BEIJING NUODAO COGNITIVE MEDICAL TECH CO LTD

AI Technical Summary

Problems solved by technology

To ensure the versatility of the BERT model, the large-scale corpora on which BERT is based cover many knowledge domains. A pre-trained language model trained on such corpora can be applied to natural language problems in different fields, but its performance in certain professional fields is only average, and it cannot adapt to solve domain-specific natural language problems.

[0004] Although current pre-trained language models perform well in the general domain, the large-scale corpora they are based on are not specific to any particular field, so the commonly used pre-trained language models cannot solve natural language problems in professional fields.

In the medical field this shortcoming is particularly serious: the field is highly specialized and has a low tolerance for error in deep learning models, so commonly used pre-trained language models such as BERT perform poorly and cannot solve natural language problems in certain specific research scenarios in the medical field.

Method used



Examples


Embodiment Construction

[0048] To make the purpose, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on these embodiments without creative effort fall within the protection scope of the present invention.

[0049] A BERT model-based medical application model training method according to an embodiment of the present invention is described below with reference to Figures 1-4.

[0050] Figure 1 is a flow chart of a BERT model-based medical application model training method provided by an embodiment of the present invention; Figure 2 is a sample acquisition f...



Abstract

The embodiment of the invention provides a medical application model training method and device based on a BERT model. The method comprises: obtaining evidence-based medical training samples; performing entity-vocabulary masking on the evidence-based medical training samples to obtain MLM training samples; and carrying out MLM training on a BERT model using the MLM training samples to obtain a PICOBERT model, where the entity vocabulary corresponds to entities with practical significance in evidence-based medicine. Because MLM training is carried out with samples in which entity vocabulary has been masked, the overall semantic representation ability of the model is enhanced, the semantic comprehension of the trained PICOBERT model is stronger, and its ability to process natural language problems in complex scenarios of a specific field is improved; the invention can therefore be better applied to the medical field and improves the understanding of natural language in specific research scenarios in the medical field.
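The entity-vocabulary masking step described above can be illustrated with a minimal sketch. The patent does not disclose its lexicon or tokenization, so the entity list, function name, and example sentence below are hypothetical; the sketch only shows the core idea of masking whole medical-entity spans (rather than arbitrary sub-tokens) so that the model must predict the entire entity during MLM training.

```python
# Hypothetical mini-lexicon of evidence-based-medicine entities; a real
# system would use a curated medical vocabulary (e.g. PICO-style entities).
ENTITY_LEXICON = ["myocardial infarction", "aspirin", "placebo"]

def mask_entities(tokens, lexicon):
    """Replace every whole-entity span found in `tokens` with [MASK] tokens,
    recording the original tokens as prediction targets at those positions."""
    masked = list(tokens)
    labels = [None] * len(tokens)          # targets only at masked positions
    for entity in lexicon:
        ent_toks = entity.split()
        n = len(ent_toks)
        for i in range(len(tokens) - n + 1):
            if [t.lower() for t in tokens[i:i + n]] == ent_toks:
                for j in range(i, i + n):
                    labels[j] = tokens[j]
                    masked[j] = "[MASK]"
    return masked, labels

tokens = "patients with myocardial infarction received aspirin".split()
masked, labels = mask_entities(tokens, ENTITY_LEXICON)
print(masked)
# ['patients', 'with', '[MASK]', '[MASK]', 'received', '[MASK]']
```

Masking the two-token entity "myocardial infarction" as a unit forces the model to reconstruct it from surrounding clinical context, which is the mechanism the abstract credits for the stronger semantic representation of PICOBERT.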

Description

Technical field

[0001] The invention relates to the technical field of natural language processing, in particular to a BERT model-based medical application model training method and device.

Background technique

[0002] In the field of natural language processing today, pre-trained language models have created a new research paradigm and refreshed the state of the art on many natural language processing tasks. A pre-trained language model performs language-model pre-training on a large amount of unsupervised corpus, and is then fine-tuned with a small amount of labeled corpus to complete downstream NLP tasks such as text classification, sequence labeling, machine translation, and reading comprehension.

[0003] The pre-trained language model BERT introduces two pre-training tasks, MLM (Masked Language Model) and NSP (Next Sentence Prediction), and is pre-trained on a larger corpus, achieving good results. In order to ensure the versatil...
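For context on the MLM task named in paragraph [0003]: standard BERT pre-training corrupts roughly 15% of input tokens, and of the selected tokens 80% become [MASK], 10% become a random vocabulary token, and 10% are left unchanged. The sketch below is an illustration of that published rule, not of this patent's specific method; the toy vocabulary and function name are assumptions.

```python
import random

VOCAB = ["patient", "dose", "trial", "outcome", "drug"]  # toy vocabulary

def mlm_corrupt(tokens, mask_prob=0.15, rng=None):
    """Standard BERT MLM corruption: select ~15% of tokens; of those,
    80% become [MASK], 10% a random vocabulary token, 10% unchanged.
    Returns the corrupted sequence and per-position prediction targets."""
    rng = rng or random.Random(0)
    out, targets = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            targets.append(tok)            # loss is computed only here
            r = rng.random()
            if r < 0.8:
                out.append("[MASK]")
            elif r < 0.9:
                out.append(rng.choice(VOCAB))
            else:
                out.append(tok)            # kept as-is, still predicted
        else:
            targets.append(None)           # position ignored by the loss
            out.append(tok)
    return out, targets
```

The entity-vocabulary masking of the claimed method replaces this random token selection with the selection of whole medical-entity spans, while the prediction objective (cross-entropy on the corrupted positions) stays the same.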

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F40/284, G06F40/30
CPC: G06F40/284, G06F40/30
Inventors: 刘静, 周永杰, 王则远
Owner: BEIJING NUODAO COGNITIVE MEDICAL TECH CO LTD