
Deep model training method and device, electronic equipment and storage medium

A model training technology applied in the information field. It addresses problems such as label data having to be annotated manually, model classification or recognition accuracy falling short of expectations, and labeling accuracy being difficult to guarantee, so as to achieve high classification or recognition accuracy, reduced abnormal learning phenomena, and fast model training.

Active Publication Date: 2021-03-30
BEIJING SENSETIME TECH DEV CO LTD

AI Technical Summary

Problems solved by technology

However, label data generally has to be produced by manual annotation.
On the one hand, manually labeling all of the training data involves a heavy workload, is inefficient, and introduces human error into the labeling process; on the other hand, when high-precision labels are required, for example pixel-level segmentation in the image field, producing such labels purely by hand is very difficult and the labeling accuracy is hard to guarantee.
[0003] Therefore, training a deep learning model purely on manually labeled data suffers from low training efficiency, and because the accuracy of the training data itself is low, the classification or recognition accuracy of the trained model falls short of expectations.



Examples


Example 1

[0122] The mutual-learning weak supervision algorithm takes the enclosing rectangles of some objects in an image as input; the two models learn from each other and can then output pixel-level segmentation results for objects in other, previously unseen images.

[0123] Taking cell segmentation as an example, at the beginning some of the cells in the image are marked with enclosing rectangles. Since observation shows that most cells are roughly elliptical, the largest inscribed ellipse is drawn inside each rectangle, with dividing lines drawn between different ellipses and along the ellipse edges; this serves as the initial supervisory signal used to train two segmentation models. Each segmentation model then predicts on the same image, and the resulting prediction image is combined with the initial label image to form a new supervisory signal. The two models each use the other's combined result, and the segmentation models are trained again; repeating this process, the segmentation results in the image ...
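The initial-supervision step of this example can be sketched as follows. This is a minimal illustration rather than the patent's implementation: it assumes NumPy/OpenCV, fills the largest inscribed ellipse of each bounding rectangle to build a rough foreground mask, and omits the dividing lines between adjacent ellipses; the function and parameter names are hypothetical.

```python
# Sketch only: build an initial supervisory mask from bounding rectangles by
# filling the largest inscribed ellipse of each box (assumed helper, not from the patent).
import numpy as np
import cv2

def initial_mask_from_boxes(image_shape, boxes):
    """image_shape: (H, W); boxes: list of (x, y, w, h) rectangles around individual cells."""
    height, width = image_shape[:2]
    mask = np.zeros((height, width), dtype=np.uint8)  # 0 = background, 1 = cell
    for x, y, w, h in boxes:
        center = (x + w // 2, y + h // 2)
        axes = (max(w // 2 - 1, 1), max(h // 2 - 1, 1))  # largest inscribed ellipse
        cv2.ellipse(mask, center, axes, 0, 0, 360, 1, -1)  # filled ellipse, label 1
    # The example additionally draws dividing lines between adjacent ellipses and along
    # their edges; that refinement is omitted in this sketch.
    return mask

# Example usage on a 256x256 image with two annotated cells:
# mask = initial_mask_from_boxes((256, 256), [(30, 40, 50, 60), (120, 100, 40, 40)])
```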



Abstract

The embodiments of the invention disclose a deep model training method and device, electronic equipment, and a storage medium. The deep learning model training method includes: obtaining the (n+1)-th first label information output by a first model that has undergone n rounds of training; obtaining the (n+1)-th second label information output by a second model that has also undergone n rounds of training, where n is an integer greater than 1; generating the (n+1)-th training set of the second model based on the training data and the (n+1)-th first label information, and generating the (n+1)-th training set of the first model based on the training data and the (n+1)-th second label information; inputting the (n+1)-th training set of the second model into the second model to perform the (n+1)-th round of training on the second model; and inputting the (n+1)-th training set of the first model into the first model to perform the (n+1)-th round of training on the first model.
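Read as pseudocode, the abstract describes a cross-training loop in which each model's output label information, merged with the initial labels of the training data, becomes the other model's next training set. The sketch below is a hedged illustration under assumed PyTorch segmentation models and a simple merge rule; `train_one_round` and `merge` are hypothetical helpers, not APIs defined by the patent.

```python
# Minimal sketch of the cross-training loop described in the abstract (assumptions:
# PyTorch models taking (1, C, H, W) tensors, float 0/1 label masks, pixel-wise BCE loss).
import torch
import torch.nn as nn

def train_one_round(model, images, labels, lr=1e-3, steps=10):
    """One round of training on (image, pseudo-label) pairs."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()
    model.train()
    for _ in range(steps):
        for img, lab in zip(images, labels):
            opt.zero_grad()
            loss = loss_fn(model(img), lab)
            loss.backward()
            opt.step()

def merge(prediction, initial_label):
    """Assumed merge rule: keep the initial supervisory signal where it marks foreground,
    elsewhere use the thresholded prediction."""
    pred_mask = (torch.sigmoid(prediction) > 0.5).float()
    return torch.maximum(pred_mask, initial_label)

def mutual_training(model_a, model_b, images, initial_labels, rounds=5):
    labels_for_a, labels_for_b = list(initial_labels), list(initial_labels)
    for _ in range(rounds):
        # Each model is trained on the training set built from the other model's labels.
        train_one_round(model_a, images, labels_for_a)
        train_one_round(model_b, images, labels_for_b)
        with torch.no_grad():
            preds_a = [model_a(img) for img in images]  # "first label information"
            preds_b = [model_b(img) for img in images]  # "second label information"
        # Build the (n+1)-th training sets by merging predictions with the initial labels.
        labels_for_b = [merge(p, l0) for p, l0 in zip(preds_a, initial_labels)]
        labels_for_a = [merge(p, l0) for p, l0 in zip(preds_b, initial_labels)]
    return model_a, model_b
```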

Description

Technical Field

[0001] The present invention relates to the field of information technology, and in particular to a deep model training method and device, electronic equipment, and a storage medium.

Background

[0002] A deep learning model can acquire a certain classification or recognition ability after being trained on a training set, which generally includes training data and the label data for that training data. However, label data generally has to be produced by manual annotation. On the one hand, manually labeling all of the training data involves a heavy workload, is inefficient, and introduces human error into the labeling process; on the other hand, when high-precision labels are required, for example pixel-level segmentation in the image field, producing such labels purely by hand is very difficult and the labeling accuracy is hard to guarantee. [0003] Therefore, training a deep learning model purely on manually labeled data suffers from low training efficiency, and because the accuracy of the training data itself is low, the classification or recognition accuracy of the trained model falls short of expectations.


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/62, G06V10/764, G06V10/774
CPC: G06T7/11, G06T2207/20081, G06T2207/20084, G06T2207/10056, G06T2207/20016, G06T2207/20104, G06T2207/20096, G06T2207/30024, G06V20/69, G06V10/454, G06V2201/03, G06V10/82, G06N20/00, G06V10/764, G06V10/774, G06N3/044, G06N3/045, G06N3/08, G06F18/00, G06T7/12, G06F18/214
Inventor: 李嘉辉 (Li Jiahui)
Owner: BEIJING SENSETIME TECH DEV CO LTD