Model training method and device, electronic equipment and storage medium

A model training technology, applied in the fields of artificial intelligence and deep learning, which addresses problems such as models failing to go online, models failing to meet performance requirements, and labeled data being unobtainable, achieving the effect of reducing capital cost and manpower consumption.

Pending Publication Date: 2021-01-19
BAIDU (CHINA) CO LTD

AI Technical Summary

Problems solved by technology

This method relies on a large amount of manually labeled data, which requires substantial manpower, money, and time for the labeling work. When a task is urgent, a large amount of labeled data may be unobtainable, so the model cannot meet performance requirements in the short term and cannot be launched. Moreover, if the training task is to control an extremely important risk, any risk that slips through may cause irreparable consequences for the company.
It can be seen that existing model training methods not only have low training efficiency but may also lead to immeasurable consequences.



Examples


Embodiment 1

[0031] Figure 1 is a first schematic flowchart of the model training method provided by an embodiment of the present application. The method may be executed by a model training apparatus or an electronic device; the apparatus or electronic device may be implemented in software and/or hardware and may be integrated into any smart device with a network communication function. As shown in Figure 1, the model training method may include the following steps:

[0032] S101. Obtain a task category label input by a user.

[0033] In this step, the electronic device may obtain the task category label input by the user. Specifically, the task category label input by the user is in text format. For example, the task category label entered by the user is "dog".

[0034] S102. Generate at least one training sample corresponding to the task category label based on the task category label.

[0035] In this step, the electronic device may generate at least one t...
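The loop described by the abstract and steps S101-S102 (obtain a label, generate samples from it, feed one sample at a time until a preset convergence condition holds) can be sketched roughly as below. All names here (`generate_samples`, `converged`, `train_step`) are illustrative assumptions; the application does not name any concrete API.

```python
# Hedged sketch of the claimed training procedure. Sample generation and the
# model update are stubbed out; only the control flow mirrors S101-S102 and
# the abstract's "repeat extraction until convergence" step.

def generate_samples(task_label: str) -> list:
    """Placeholder for S102: derive training samples from the task
    category label (the application does this without manual labeling)."""
    return [f"{task_label}_sample_{i}" for i in range(8)]

def converged(steps: int, max_steps: int) -> bool:
    # Stand-in for the application's "preset convergence condition".
    return steps >= max_steps

def train_step(sample) -> None:
    pass  # actual model update elided

def train_until_convergence(task_label: str, max_steps: int = 100) -> int:
    samples = generate_samples(task_label)       # S102: build the sample pool
    steps = 0
    while not converged(steps, max_steps):       # repeat until convergence
        current = samples[steps % len(samples)]  # extract the current sample
        train_step(current)                      # train the to-be-trained model
        steps += 1
    return steps

print(train_until_convergence("dog", max_steps=10))  # → 10
```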

Embodiment 2

[0042] Figure 2 is a second schematic flowchart of the model training method provided in an embodiment of the present application. This embodiment further optimizes and expands the above technical solution and may be combined with each of the above optional implementation modes. As shown in Figure 2, the model training method may include the following steps:

[0043] S201. Obtain a task category label input by a user.

[0044] S202. Generate at least one hyponym corresponding to the task category label based on the pre-built knowledge graph.

[0045] In this step, the electronic device may generate at least one hyponym corresponding to the task category label based on the pre-built knowledge graph. Specifically, the electronic device may input the task category label into the knowledge graph and obtain at least one hyponym corresponding to the label through the knowledge graph. For example, the electronic device can input the task category label "dog" input by the user ...
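Step S202's knowledge-graph lookup can be illustrated minimally as follows. The tiny dict-backed graph and the fallback behavior are assumptions for illustration; the application does not specify how the knowledge graph is implemented or queried.

```python
# Hedged sketch of S202: query a pre-built knowledge graph for hyponyms
# (more specific subtypes) of the user's task category label.

KNOWLEDGE_GRAPH = {
    # Toy stand-in for the application's pre-built knowledge graph.
    "dog": ["husky", "corgi", "golden retriever"],
    "cat": ["persian", "siamese"],
}

def hyponyms_of(task_label: str) -> list:
    # Return at least one hyponym for the label; fall back to the label
    # itself when the graph has no entry (an assumption, not claimed).
    return KNOWLEDGE_GRAPH.get(task_label, [task_label])

print(hyponyms_of("dog"))  # → ['husky', 'corgi', 'golden retriever']
```

In practice the graph would be far larger (e.g. a WordNet-style taxonomy), but the query shape, label in, hyponym list out, matches the step as described.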

Embodiment 3

[0054] Figure 3 is a third schematic flowchart of the model training method provided in an embodiment of the present application. This embodiment further optimizes and expands the above technical solution and may be combined with each of the above optional implementation modes. As shown in Figure 3, the model training method may include the following steps:

[0055] S301. Obtain a task category label input by a user.

[0056] S302. Generate at least one hyponym corresponding to the task category label based on the pre-built knowledge graph.

[0057] S303. Capture at least one image corresponding to each hyponym by using each hyponym corresponding to the task category label as a keyword.

[0058] S304. Extract an image from all the images corresponding to the hyponyms as the current image.

[0059] In this step, the electronic device may extract an image from the at least one image corresponding to each hyponym as the current image. Specifically, it i...
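Steps S303-S304 (keyword-driven image capture per hyponym, then drawing one image at a time as the current sample) can be sketched as below. `fetch_images` is a placeholder for whatever retrieval mechanism is used; the application does not name one, so everything here beyond the control flow is an assumption.

```python
# Hedged sketch of S303-S304: use each hyponym as a search keyword to
# collect candidate images, then extract a single "current" image from
# the pooled candidates.

import random

def fetch_images(keyword: str, n: int = 3) -> list:
    # Placeholder for keyword-based image capture (e.g. an image search).
    return [f"{keyword}_{i}.jpg" for i in range(n)]

def collect_candidates(hyponyms: list) -> list:
    pool = []
    for h in hyponyms:                 # S303: one capture pass per hyponym
        pool.extend(fetch_images(h))
    return pool

def next_current_image(pool: list) -> str:
    # S304: extract one image from the pooled candidates as the current image.
    return random.choice(pool)

pool = collect_candidates(["husky", "corgi"])
print(len(pool))  # → 6 with the placeholder fetcher
```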



Abstract

The invention discloses a model training method and device, electronic equipment and a storage medium, and relates to the technical field of deep learning. The specific scheme comprises the following steps: receiving a task category label input by a user; generating at least one training sample corresponding to the task category label based on the task category label; extracting a training sample from all the training samples corresponding to the task category label as a current training sample; in response to the to-be-trained model not meeting the preset convergence condition, inputting the current training sample into the to-be-trained model and training it with the current training sample; and repeatedly executing the operation of extracting the current training sample until the to-be-trained model meets the preset convergence condition. According to the embodiments of the invention, model training can be realized without acquiring labeled training samples in advance, so that the manpower consumption and capital cost of manual labeling are greatly reduced.

Description

Technical Field

[0001] The present application relates to the field of artificial intelligence, further relates to the field of deep learning technology, and in particular to a model training method, device, electronic equipment and storage medium.

Background Technique

[0002] With the advent of the era of big data, data acquisition has become relatively easy, but the data used for training often needs to be manually screened and labeled before training. A large amount of training data means that a lot of manpower, time, and money must be spent on labeling, which greatly limits the training speed of artificial intelligence models, which in turn affects model iteration speed and model launch time.

[0003] In the existing technology, a supervised artificial intelligence model training method is usually used. Taking image classification tasks as an example, an appropriate amount of training data with labels is manually annotated, and training data is extracted based on trad...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F16/35, G06F16/36, G06F40/284, G06K9/62
CPC: G06F16/355, G06F16/367, G06F40/284, G06V2201/07, G06F18/23213, G06F18/24, G06F18/214
Inventors: 张言, 梁晓旭, 邓远达
Owner: BAIDU (CHINA) CO LTD