
Cross-dataset target detection joint training method

A target detection training technology, applied in the field of cross-dataset joint training for target detection, which addresses problems such as increased repetitive labeling work, prolonged model delivery time, and degraded model accuracy, achieving the effect of reducing labeling pressure and shortening the model delivery cycle.

Pending Publication Date: 2022-01-18
深圳市玻尔智造科技有限公司

AI Technical Summary

Problems solved by technology

In view of this situation, the current common target detection training method must supplement all the new labels in the original data set (for example, adding tricycle labels to every picture in the original data set of the street-view detection project) and then retrain the built model on the original data set together with the newly provided data set; otherwise the accuracy of the model will be affected. This approach adds a great deal of repetitive work and prolongs the delivery time of the model.



Examples


Embodiment 1

[0027] Embodiment 1 of the present application provides a cross-dataset joint training method for target detection, as shown in Figure 1:

[0028] Step 1. Label each picture in the original data set according to the customer's initial requirements, treating the original data set as a labeled n-category classification problem; select a suitable deep learning model, train it on the data set, and verify the model accuracy;

[0029] Step 2. The customer puts forward a new detection requirement; if k additional label categories are added and the newly provided data set contains the previously missing label categories, the problem becomes an (n+k)-category classification problem;
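The change from an n-category to an (n+k)-category problem can be illustrated with a short sketch. The Python/PyTorch snippet below is a hypothetical illustration only: the helper name extend_head and the weight-copying strategy are assumptions, since the embodiment only states that the task grows to n+k categories.

import torch
import torch.nn as nn

def extend_head(old_head: nn.Linear, k: int) -> nn.Linear:
    """Grow an n-category output layer to n + k categories, reusing trained weights.

    Hypothetical helper: copying the old weights into the first n outputs is an
    assumed way to keep the accuracy already obtained on the original labels;
    the k new outputs start from their fresh random initialization.
    """
    n, in_features = old_head.out_features, old_head.in_features
    new_head = nn.Linear(in_features, n + k)
    with torch.no_grad():
        new_head.weight[:n] = old_head.weight  # keep the n already-trained label outputs
        new_head.bias[:n] = old_head.bias
    return new_head

Reusing the first n output weights is one plausible way to avoid losing the accuracy already obtained on the original labels, a concern that step 3 below emphasizes.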

[0030] Step 3. Because each data set may contain one or more of the label categories corresponding to the detection task, it is important to ensure that the model does not lose its original accuracy during the step-by-step training process; after the existing data sets are found, the new label categories are...

Embodiment 2

[0035] On the basis of Embodiment 1, the specific implementation of step 3 is:

[0036] Step 3.1, find m data sets containing all label categories;

[0037] Step 3.2, define the loss function of the neural network in the deep learning model as the sum of the prediction losses over each label category in each data set, and use this loss function to train the neural network; the specific loss function mask_Loss is as follows:

[0038] mask_Loss = Σ_{j=1}^{n+k} mask(j) × Loss(pred(j), label(j))

[0039] mask(j) = 1 if label(j) ∈ data; mask(j) = 0 if label(j) ∉ data

[0040] In the above formula, pred is the predicted output value for each label category, an (n+k)-dimensional vector; label is the ground-truth label; mask is an (n+k)-dimensional vector, and mask(j) indicates whether the jth label of a picture in the data set is annotated, where j∈{1,2,…,n+k}; data refers to the data set; label(j)∉data indicates that the label does not belong to the corresponding data set; label(j)∈data indicates that the label belongs to the corresponding data set;

[0041] If the jth label of a picture in the data set is unlabeled, mark the mask(j...
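A minimal sketch of the masked loss described in step 3.2 and paragraphs [0040]–[0041], written in Python/PyTorch. The per-category loss is not named in the text, so binary cross-entropy is used here as an assumed stand-in; the function name mask_loss mirrors mask_Loss but is otherwise hypothetical.

import torch
import torch.nn.functional as F

def mask_loss(pred: torch.Tensor, label: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    """Masked multi-label loss over tensors of shape (batch, n + k).

    mask[i, j] is 1 when the j-th label is annotated in sample i's source data
    set (label(j) ∈ data) and 0 otherwise (label(j) ∉ data), so categories that
    a data set never annotates contribute neither loss nor gradient.
    Binary cross-entropy per category is an assumption, not the patent's stated loss.
    """
    per_label = F.binary_cross_entropy_with_logits(pred, label, reduction="none")
    return (per_label * mask).sum()

With this gating, pictures from the original data set (mask set on the first n categories) and pictures from the newly provided data set (mask set only on the k new categories) can be mixed in the same training batches without the unlabeled categories distorting the gradient.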



Abstract

The invention relates to a cross-dataset target detection joint training method. The method comprises the following steps: labeling each picture in an original data set according to the customer's initial requirements; the customer puts forward a new detection requirement, which adds a label classification problem; the new label categories are annotated after existing data sets are searched, or a labeled data set corresponding to the new detection requirement is newly added, and joint training is carried out together with the already labeled and trained data set. The method has the beneficial effects that all the new labels do not need to be supplemented in the original labeled and trained data set; only labels corresponding to the newly added detection tasks need to be made for the newly provided data set (for example, tricycle labels are made on each picture in the new data set), and then the original data set and the new data set are combined for retraining, so that model accuracy can be ensured.
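As a hedged illustration of the combination step summarized above, the snippet below shows one way a per-data-set mask could be attached when the original and new data sets are merged; the function name build_mask and the concrete label counts are hypothetical.

import torch

def build_mask(annotated_label_ids, total_labels: int) -> torch.Tensor:
    """Return an (n + k)-dimensional 0/1 mask with 1 at the annotated label positions."""
    mask = torch.zeros(total_labels)
    mask[list(annotated_label_ids)] = 1.0
    return mask

# Hypothetical example with n = 5 original labels and k = 2 newly added labels:
n, k = 5, 2
original_mask = build_mask(range(n), n + k)      # tensor([1., 1., 1., 1., 1., 0., 0.])
new_mask = build_mask(range(n, n + k), n + k)    # tensor([0., 0., 0., 0., 0., 1., 1.])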

Description

Technical field

[0001] The invention belongs to the field of image target detection, and in particular relates to a joint training method for target detection across data sets.

Background technique

[0002] In multi-label, multi-category detection tasks for industrial products, a single image in a data set generally contains multiple detection tasks, and the customer generally provides only a few of the detection requirements when the requirements are first gathered (for example, in a street-view detection project, only the detection of pedestrians and cars is required at the beginning). However, when the detection model has been trained and is ready to be delivered, the customer may propose adding new detection tasks and provide classification standards and corresponding new data sets for them (for example, in the street-view detection project, the customer also asks to detect tricycles). In view of this situation, the current common targe...


Application Information

IPC(8): G06K9/62
CPC: G06F18/241; G06F18/214
Inventor 杨培文张成英梁惠莹于振东张辽
Owner 深圳市玻尔智造科技有限公司