
Large-batch decentralized distributed image classifier training method and system

A decentralized image classifier training technology, applied in the fields of image classification and machine learning, which addresses the problem of degraded image classifier generalization performance, achieving the effects of reducing the number of communications, being simple and easy to implement, and incurring no extra overhead.

Pending Publication Date: 2022-03-15
NANJING UNIV

AI Technical Summary

Problems solved by technology

However, blindly increasing the batch size leads to a decline in the generalization performance of the final trained image classifier, so it is necessary to design a dedicated image classifier training method suited to large-batch training.

Method used




Embodiment Construction

[0032] The present invention is further illustrated below in conjunction with specific embodiments. It should be understood that these embodiments are only used to illustrate the present invention and are not intended to limit its scope. After reading the present invention, those skilled in the art will appreciate that various equivalent modifications of the present invention all fall within the scope defined by the appended claims of the present application.

[0033] The large-batch decentralized distributed image classifier training method is suitable for scenarios in which the image dataset to be processed contains a large number of samples and a large model is used. Taking the distributed training of a neural network model as an image classifier as an example, the specific workflow of the method in this embodiment is as follows:

[0034] In the large-batch decentralized distributed image classifier training method, the workflow on the k-th working node is...
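Because the per-node workflow text is truncated in the source, the following is only a hedged NumPy sketch of one per-node iteration consistent with the abstract; the function names, step order, learning rate, momentum coefficient, and mixing weights are illustrative assumptions, not the patented settings.

```python
import numpy as np

def local_step(x_k, m_k, grad_fn, neighbor_params, weights,
               lr=0.1, beta=0.9, eps=1e-8):
    """One hypothetical iteration on working node k.

    x_k             : local image-classifier parameter vector on node k
    m_k             : local momentum buffer on node k
    grad_fn         : callable returning a stochastic gradient of the local
                      loss at x_k, computed from a large batch of locally
                      stored image samples
    neighbor_params : parameter vectors received from neighboring nodes
    weights         : mixing weights for node k and its neighbors (sum to 1)
    """
    g = grad_fn(x_k)
    g = g / (np.linalg.norm(g) + eps)        # gradient normalization
    m_k = beta * m_k + g                     # update momentum with the normalized gradient
    x_k = x_k - lr * m_k                     # update local parameters
    # decentralized communication: weighted average with neighbors' parameters
    x_k = weights[0] * x_k + sum(w * x_j
                                 for w, x_j in zip(weights[1:], neighbor_params))
    return x_k, m_k
```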



Abstract

The invention discloses a large-batch decentralized distributed image classifier training method and system. In the method, each working node uses its local image classifier parameters to compute a stochastic gradient from locally stored image samples, normalizes the gradient, and uses the normalized gradient to update its momentum and local parameters. Each node then communicates with its neighboring nodes to obtain their latest image classifier parameters and takes a weighted average of these parameters and its own local parameters, which serves as the new local parameters for the next round of updates. These training steps are repeated until a stopping condition is reached, at which point every node stops and the average of the parameters across the nodes is taken as the final output. Because the method removes the central node, congestion at a central node is avoided; at the same time, the method is suitable for large-batch image classifier training, and large-batch training reduces the number of parameter updates and communications, so that computing resources such as GPUs can be fully utilized and training efficiency is greatly improved.
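As a hedged, self-contained illustration of the loop summarized in this abstract, the simulation below runs all nodes in one process on toy quadratic losses; the ring topology, mixing weights, learning rate, momentum coefficient, and fixed round count are assumptions for demonstration only, not the patented settings.

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim, rounds = 4, 10, 200
lr, beta, eps = 0.1, 0.9, 1e-8

# toy local quadratic losses standing in for per-node image-classification losses
targets = [rng.normal(size=dim) for _ in range(n_nodes)]
def local_grad(k, x, batch_noise=0.1):
    # stochastic gradient of 0.5*||x - target_k||^2 on a simulated large batch
    return (x - targets[k]) + batch_noise * rng.normal(size=dim)

# ring topology: each node mixes with its two neighbors (weights sum to 1)
neighbors = {k: [(k - 1) % n_nodes, (k + 1) % n_nodes] for k in range(n_nodes)}
mix = [0.5, 0.25, 0.25]

x = [rng.normal(size=dim) for _ in range(n_nodes)]   # local parameters
m = [np.zeros(dim) for _ in range(n_nodes)]          # local momentum buffers

for _ in range(rounds):
    new_x = []
    for k in range(n_nodes):
        g = local_grad(k, x[k])
        g = g / (np.linalg.norm(g) + eps)            # gradient normalization
        m[k] = beta * m[k] + g                       # momentum update
        new_x.append(x[k] - lr * m[k])               # local parameter update
    # decentralized communication: weighted average with neighbors' parameters
    x = [mix[0] * new_x[k]
         + mix[1] * new_x[neighbors[k][0]]
         + mix[2] * new_x[neighbors[k][1]]
         for k in range(n_nodes)]

final_params = np.mean(x, axis=0)                    # average across nodes as output
print(final_params)
```

In this toy setting the gossip averaging plus local normalized-gradient updates drives all node parameters toward a common neighborhood of the minimizer of the averaged objective, mirroring the consensus behavior the abstract describes.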

Description

Technical field

[0001] The invention relates to a large-batch decentralized distributed image classifier training method and system, belonging to the technical fields of image classification and machine learning.

Background technique

[0002] The training of many image classifiers can be formalized as solving an optimization problem of the following finite-sum form:

[0003] min_{x ∈ ℝ^d} F(x) = (1/n) Σ_{i=1}^{n} f(x; ξ_i)

[0004] where x is the parameter of the model, d is the dimension of the model parameters, n is the total number of training samples, ξ_i is the i-th sample, and f(x; ξ_i) is the loss function corresponding to the i-th sample.

[0005] In recent years, deep learning has developed vigorously, and the continuous emergence of large datasets and large models means that the computing power of a single machine can no longer meet the demand. Distributed machine learning, in which multiple machines work together to complete a training task, has become an important solution to this...
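For reference, a minimal sketch of evaluating the finite-sum objective in [0003]; loss_fn and samples are hypothetical placeholders, not names from the patent.

```python
def finite_sum_objective(x, samples, loss_fn):
    # F(x) = (1/n) * sum_{i=1}^{n} f(x; xi_i)
    return sum(loss_fn(x, xi) for xi in samples) / len(samples)
```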

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/04; G06N3/08
CPC: G06N3/082; G06N3/084; G06N3/045
Inventors: 李武军, 史长伟
Owner: NANJING UNIV