
Computer-implemented training method, classification method and system, and computer-readable recording medium

A computer-implemented training method in the field of computing and neural learning methods, addressing problems such as sensitivity to initialization conditions

Pending Publication Date: 2021-12-10
TOYOTA JIDOSHA KK +1

AI Technical Summary

Problems solved by technology

However, many of these methods are sensitive to initialization conditions and/or tend to converge to degenerate solutions.




Description of embodiments

[0133] Reference will now be made in detail to the exemplary embodiments of the present disclosure, examples of which are illustrated in the accompanying drawings.

[0134] Training method

[0135] First, an exemplary training method for training a classifier Φη to output a prediction for a sample Xi will be presented; it constitutes the first embodiment of the present disclosure. The steps of the method are schematically shown in figure 1.

[0136] S0) Preparation

[0137] To implement the method, an initial dataset of samples (images in this example), called the source dataset SD, is required. This dataset preferably comprises a variety of samples (images) large enough to adequately train a classifier to identify and distinguish the different identifiable clusters of the dataset.

[0138] A pretext task is then selected. For example, in the case of images, the pretext task usually consists in learning the visual features of the images.
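
The first training criterion (step S10 in the abstract below) can be illustrated with a short PyTorch-style sketch. This is a minimal reading, not code from the patent: the squared-Euclidean distance and the `transform` argument standing in for the transformation T are assumptions made for illustration.

```python
import torch.nn.functional as F

def pretext_loss(pretext_model, x, transform):
    """Sketch of step S10: minimize a distance between the output of a
    source sample and the output of its transformed counterpart."""
    z_src = pretext_model(x)             # output of the source sample
    z_aug = pretext_model(transform(x))  # output of the transformed sample T(x)
    return F.mse_loss(z_aug, z_src)      # squared-Euclidean distance (assumed)
```

In practice the transformation T is typically a stochastic augmentation (random crop, color jitter, and the like), and the mean-squared error above is only the simplest instance of "minimizing a distance"; the exact metric is left to the embodiment.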

[0139] Se...



Abstract

A computer-implemented method for training a classifier (Φη), comprising: S10) training a pretext model (Φθ) to learn a pretext task, so as to minimize a distance between an output of a source sample via the pretext model (Φθ) and an output of a corresponding transformed sample via the pretext model (Φθ), the transformed sample being a sample obtained by applying a transformation (T) to the source sample; S20) determining a neighborhood (NXi) of samples (Xi) of a data set (SD) in the embedding space; S30) training the classifier (Φη) to predict respective estimated probabilities Φηj(Xi), j = 1..C, for a sample (Xi) to belong to respective clusters (Cj), by using a second training criterion which tends to: maximize a likelihood for a sample and its neighbors (Xj) in its neighborhood (NXi) to belong to the same cluster; and force the samples to be distributed over several clusters.
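
As one possible concrete reading of steps S20 and S30, the following PyTorch-style sketch assumes cosine similarity for the neighborhood search, a dot product between predictions as the likelihood of belonging to the same cluster, and an entropy term (with an illustrative weight) as the mechanism that forces the samples to spread over several clusters; none of these specific choices is fixed by the abstract.

```python
import torch
import torch.nn.functional as F

def mine_neighbors(embeddings, k):
    """Step S20 (sketch): indices of the k nearest neighbors of each sample
    in the embedding space, using cosine similarity (an assumption)."""
    z = F.normalize(embeddings, dim=1)
    sim = z @ z.t()                       # pairwise similarities
    # drop the first hit, which is the sample itself
    return sim.topk(k + 1, dim=1).indices[:, 1:]

def clustering_loss(probs, neighbor_probs, entropy_weight=5.0):
    """Step S30 (sketch): `probs` holds the classifier's cluster
    probabilities for a batch of samples Xi, `neighbor_probs` those of one
    sampled neighbor Xj per sample. The first term maximizes the likelihood
    that a sample and its neighbor belong to the same cluster; the second
    penalizes the negative entropy of the mean prediction, forcing the
    samples to stay distributed over several clusters. The weight 5.0 is
    illustrative, not a value from the patent."""
    same_cluster = (probs * neighbor_probs).sum(dim=1)
    consistency = -torch.log(same_cluster + 1e-8).mean()
    mean_probs = probs.mean(dim=0)
    neg_entropy = (mean_probs * torch.log(mean_probs + 1e-8)).sum()
    return consistency + entropy_weight * neg_entropy
```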

Description

Technical field

[0001] The present disclosure relates to training methods for learning parametric models that classify images, or more generally samples, without using ground-truth annotations.

Background art

[0002] Artificial neural networks are capable of extracting information from large-scale datasets. Increasing the amount of training data often improves their performance and robustness.

[0003] As a result of this development, the training datasets required to train neural networks have grown exponentially over the past few years. There is therefore a growing need to train neural networks fully, or at least partially, in an unsupervised manner, in order to reduce the need for ground-truth annotations. This applies in particular to classification neural networks, which are used to classify samples (input data) among a certain number of categories.

[0004] Two approaches have been proposed to perform unsupervised learning.

[0005] Representation learning methods use self-su...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/62; G06K9/00; G06N3/04; G06N3/08
CPC: G06N3/08; G06N3/047; G06N3/045; G06F18/2321; G06F18/24147; G06F18/2415; G06F18/2148; G06F18/23; G06F18/24137; G06V10/82; G06V10/774
Inventors: W·阿贝鲁斯, G·奥斯莫祖里, W·范甘斯贝克, S·范登亨德, M·普洛斯曼斯, S·格奥尔古里斯, L·梵谷
Owner: TOYOTA JIDOSHA KK