Model training method, model training device, storage medium and electronic equipment

A model training technology in the field of data processing that addresses the low detection accuracy of object detection on RGB pixel matrices acquired in varying environments, improving the model's object detection capability for the target domain while reducing labeling cost.

Pending Publication Date: 2020-12-22
BEIJING DIDI INFINITY TECH & DEV

AI Technical Summary

Problems solved by technology

However, the RGB pixel matrix is easily affected by the acquisition environment, such as the light intensity and visibility of the external environment. Machine-learning approaches therefore face limitations when performing object detection on RGB pixel matrices acquired in different environments: detection accuracy is high for matrices acquired in some environments but low for those acquired in others.




Embodiment Construction

[0028] The present invention is described below with reference to embodiments, but the invention is not limited to these embodiments. In the following detailed description, some specific details are set forth; the invention can nevertheless be fully understood by those skilled in the art without these details. To avoid obscuring the essence of the invention, well-known methods, procedures, components, and circuits are not described in detail.

[0029] Additionally, those of ordinary skill in the art will appreciate that the drawings provided herein are for illustrative purposes and are not necessarily drawn to scale.

[0030] Unless the context clearly requires otherwise, words such as "comprise" and "include" in the description should be interpreted inclusively rather than exclusively or exhaustively; that is, as "including but not limited to".

[0031] In the description of the present invention, it shoul...



Abstract

The embodiments of the invention disclose a model training method, a model training device, a storage medium, and electronic equipment. After a to-be-trained model, a first image set of a source domain, and a second image set of a target domain are obtained, the to-be-trained model is trained with each first image as input and with the object recognition frame of each first image, together with that frame's category, as training targets. Meanwhile, each second image is taken as input, and the outputs produced for the second images at different layers of the to-be-trained model are used to train an attention discrimination model and a feature discrimination model. When the detection loss function of the to-be-trained model, the attention loss function of the attention discrimination model, and the feature loss function of the feature discrimination model all converge, training ends, and the trained model is determined to be the target model. The to-be-trained model can thus be trained without labeling the target-domain images, improving the model's object detection capability for the target domain and reducing labeling cost.
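The joint training loop described in the abstract can be sketched as follows. This is a minimal illustration, not the patent's disclosed implementation: the three step functions (`detect_step`, `attention_step`, `feature_step`) are hypothetical stand-ins for training the to-be-trained model on labeled source images and the two discrimination models on the model's intermediate outputs for unlabeled target images, and the convergence test is an assumed stopping criterion on all three loss functions.

```python
# Sketch of the abstract's training procedure: alternate three updates per
# epoch and stop when the detection, attention, and feature losses all
# converge. The step callables are hypothetical; the patent does not
# disclose concrete architectures or loss definitions.

def train_until_converged(source_images, source_labels, target_images,
                          detect_step, attention_step, feature_step,
                          tol=1e-3, max_epochs=100):
    """Run joint training until all three losses converge.

    detect_step(source_images, source_labels) -> detection loss
        (trains the to-be-trained model: boxes + categories as targets)
    attention_step(target_images) -> attention loss
        (trains the attention discrimination model on intermediate outputs)
    feature_step(target_images) -> feature loss
        (trains the feature discrimination model on intermediate outputs)
    """
    prev = (float("inf"), float("inf"), float("inf"))
    for epoch in range(max_epochs):
        det = detect_step(source_images, source_labels)
        att = attention_step(target_images)
        feat = feature_step(target_images)
        cur = (det, att, feat)
        # Converged when every loss changed by less than `tol` this epoch.
        if all(abs(c - p) < tol for c, p in zip(cur, prev)):
            return epoch, cur  # trained model would now be the target model
        prev = cur
    return max_epochs, prev
```

The key point the sketch captures is that the target-domain images enter only through the two discrimination models, so no target-domain labels are needed.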

Description

technical field

[0001] The invention relates to the technical field of data processing, in particular to a model training method, a model training device, a storage medium and electronic equipment.

Background technique

[0002] In the field of traditional vision, object detection is a very popular research direction. With the continuous development of science and technology, machine learning makes it possible to perform increasingly accurate object detection on the target object in an RGB pixel matrix (that is, the color image collected by an image acquisition device). However, the RGB pixel matrix is easily affected by the acquisition environment, such as the light intensity and visibility of the external environment. Therefore, when performing object detection on RGB pixel matrices obtained in different environments, machine-learning approaches have certain limitations: the detection accuracy for RGB pixel matrices obtained in some environments is high, w...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (IPC8): G06K9/62; G06K9/46; G06K9/32; G06K9/00; G06N3/04; G06N3/08; G06N20/00
CPC: G06N3/084; G06N20/00; G06V20/56; G06V10/25; G06V10/56; G06N3/048; G06N3/045; G06F18/24
Inventors: 赵震, 郭玉红
Owner: BEIJING DIDI INFINITY TECH & DEV