
Text recognition model training method, model training device and electronic equipment

A text recognition and model training technique in the field of deep learning and few-shot (small-sample) learning. It addresses the problems that existing methods do not consider the importance of individual words within a sentence and do not fully supplement category knowledge between samples, achieving the effect of correcting inaccurate judgments.

Pending Publication Date: 2022-08-02
AEROSPACE INFORMATION RES INST CAS

AI Technical Summary

Problems solved by technology

Most existing methods focus only on the relationship between labeled and unlabeled samples and do not fully supplement the category knowledge shared between samples. In addition, existing few-shot learning methods do not consider how important different words in a sentence are to that sentence.
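The word-importance idea mentioned above can be illustrated with a toy weighting scheme. The patent does not disclose its actual mechanism (which is presumably learned jointly with the model); this sketch merely uses softmax-normalized TF-IDF scores to show what "weighting words within a sentence" means. The function name `word_weights` and the tiny corpus are hypothetical.

```python
import math
from collections import Counter

def word_weights(sentence, corpus):
    """Toy per-word importance: TF-IDF scores for each word of the
    sentence, softmax-normalized so they sum to 1 over the sentence.
    This is an illustration only, not the patent's learned weighting."""
    tokens = sentence.split()
    tf = Counter(tokens)
    n_docs = len(corpus)
    scores = []
    for w in tokens:
        # document frequency of the word across the toy corpus
        df = sum(1 for doc in corpus if w in doc.split())
        idf = math.log((1 + n_docs) / (1 + df)) + 1.0
        scores.append(tf[w] / len(tokens) * idf)
    # softmax normalization (shift by max for numerical stability)
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return dict(zip(tokens, (e / total for e in exps)))

corpus = ["the cat sat", "the dog ran", "a rare orchid bloomed"]
w = word_weights("the rare cat", corpus)
# the rarer word receives a larger weight than the common word "the"
assert w["rare"] > w["the"]
```

Under this scheme, rare and therefore more discriminative words dominate the sentence representation, which is the intuition the patent's weight information captures.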




Detailed Description of the Embodiments

[0051] In order to make the objectives, technical solutions and advantages of the present invention more clearly understood, the present invention will be further described in detail below in conjunction with specific embodiments and with reference to the accompanying drawings.

[0052] It should be understood, however, that these descriptions are exemplary only, and are not intended to limit the scope of the present invention. In the following detailed description, for convenience of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the present invention. It will be apparent, however, that one or more embodiments may be practiced without these specific details. Also, in the following description, descriptions of well-known technologies are omitted to avoid unnecessarily obscuring the concepts of the present invention.

[0053] The terminology used herein is for the purpose of describing specific embodiment...



Abstract

The invention discloses a text recognition model training method comprising: performing sample enhancement on a training text to obtain a plurality of labeled training corpora, the training text comprising natural-language text; inputting the training corpora into a first model to obtain vectorized representations of the training corpora; calculating weight information for each word in a single training corpus relative to that corpus, and similarity information between the vectorized representations of the plurality of training corpora; training the first model using the similarity information and the weight information; inputting the vectorized representations into a second model to obtain category scores for the training corpora; and training a third model using the category scores and the vectorized representations, the trained third model serving as the text recognition model. The invention further discloses a model training device, an electronic device, a storage medium and a computer program product.
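The pipeline in the abstract can be sketched end to end with toy stand-ins. Everything below (the token-dropout augmentation, the hashing encoder, the function names) is a hypothetical illustration of the stated steps, not the patent's actual neural "first model"; only the shape of the pipeline follows the abstract.

```python
import math
import random

def augment(text):
    """Step 1: sample enhancement — here, trivial token-dropout copies
    of the training text (the patent's actual augmentation is not
    specified in this record)."""
    tokens = text.split()
    random.seed(0)  # deterministic for the example
    out = [text]
    for _ in range(2):
        kept = [t for t in tokens if random.random() > 0.2]
        out.append(" ".join(kept or tokens))
    return out

def encode(text, dim=16):
    """Step 2: stand-in for the 'first model' — a bag-of-words hashing
    encoder producing an L2-normalized vector."""
    v = [0.0] * dim
    for tok in text.split():
        v[hash(tok) % dim] += 1.0
    n = math.sqrt(sum(x * x for x in v)) or 1.0
    return [x / n for x in v]

def cosine(a, b):
    """Step 3: similarity information between vectorized representations
    (cosine, since the vectors are unit-normalized)."""
    return sum(x * y for x, y in zip(a, b))

corpora = augment("few shot text recognition")
vecs = [encode(c) for c in corpora]
sims = [cosine(vecs[0], v) for v in vecs[1:]]
# augmented copies share tokens with the original, so they stay
# close to it in the toy embedding space
assert all(s > 0 for s in sims)
```

In the patent's method, such similarity information (together with the per-word weights) supervises the training of the first model, after which separate second and third models produce and consume category scores.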

Description

Technical Field

[0001] The invention belongs to the technical field of deep learning and few-shot (small-sample) learning, and in particular relates to a text recognition model training method, a model training device, an electronic device, a storage medium and a computer program product.

Background Technique

[0002] Deep learning models have achieved state-of-the-art results in tasks such as image classification and text classification, but their success relies heavily on large amounts of training data. In real-world scenarios, some categories have only a small amount of data or a small amount of labeled data, and labeling unlabeled data consumes considerable time and manpower. Humans, in contrast, can learn quickly from only a small amount of data: based on previously learned knowledge combined with the labels of a few given samples, a person can quickly make judgments on newly presented samples. Similar small-sample learning goals are to make ma...
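The few-shot setting described above, where a few labeled samples per category suffice to classify new ones, can be illustrated with a generic prototypical-network-style sketch: average the embeddings of the few labeled support samples into a class prototype, then assign a query to the nearest prototype. The patent record does not disclose its exact few-shot mechanism; this is a standard technique shown for intuition, with hypothetical class names and toy 2-D "embeddings".

```python
import math

def prototype(vectors):
    """Mean of the few labeled support vectors for one class."""
    dim = len(vectors[0])
    return [sum(v[i] for v in vectors) / len(vectors) for i in range(dim)]

def classify(query, protos):
    """Assign the query embedding to the nearest class prototype
    (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(protos, key=lambda label: dist(query, protos[label]))

# two classes, only three labeled examples each (toy 2-D embeddings)
support = {
    "sports":   [[0.9, 0.1], [1.0, 0.2], [0.8, 0.0]],
    "politics": [[0.1, 0.9], [0.0, 1.0], [0.2, 0.8]],
}
protos = {label: prototype(vs) for label, vs in support.items()}
assert classify([0.85, 0.15], protos) == "sports"
assert classify([0.05, 0.95], protos) == "politics"
```

With only three labeled samples per class, nearest-prototype classification already separates the toy queries, which is the kind of data efficiency few-shot methods aim for.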

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F40/279, G06F40/216, G06F16/35, G06K9/62
CPC: G06F40/279, G06F16/355, G06F40/216, G06F18/22
Inventors: 金力, 李树超, 刘庆, 李晓宇, 孙显, 张雅楠, 董鹏程, 吕博
Owner AEROSPACE INFORMATION RES INST CAS