
Federated learning algorithm component priority scheduling implementation method, device, and storage medium

A priority scheduling technology for federated learning algorithm components, applied in the computer field, which addresses problems such as the inability to customize the execution order of algorithm components and the resulting inability to schedule algorithm components automatically.

Active Publication Date: 2022-07-01
蓝象智联(杭州)科技有限公司

AI Technical Summary

Problems solved by technology

[0004] The purpose of the embodiments of the present invention is to provide a method, device, and storage medium for implementing priority scheduling of federated learning algorithm components, so as to solve the problem that the prior art cannot customize the execution order of algorithm components and therefore cannot schedule algorithm components automatically.




Detailed Description of the Embodiments

[0028] The embodiments of the present invention are described below by way of specific examples. Those familiar with the technology can readily understand other advantages and effects of the present invention from the contents disclosed in this specification. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.

[0029] In addition, the technical features involved in the different embodiments of the present invention described below may be combined with each other as long as they do not conflict.

[0030] An embodiment of the present invention provides a method for implementing priority scheduling of components of a federated learning algorithm. Referring to Figure 1, Figure 1 is a flowchart of a meth...



Abstract

The embodiment of the invention discloses a method, device, and storage medium for implementing priority scheduling of federated learning algorithm components. The method comprises the steps of: obtaining the execution process of the algorithm components, parsing the execution process into a directed acyclic graph, and reversing the directed acyclic graph to generate a reverse directed acyclic graph; performing topological sorting on the reverse directed acyclic graph to generate an execution sequence; traversing the execution sequence and calculating the execution weight of each node, where the execution weight is the sum of the node's self weight and the node's in-degree weight; and arranging the execution sequence in descending order of execution weight to obtain the priority scheduling order of the algorithm components corresponding to the nodes. According to the method, the execution weight is calculated from the generated execution sequence, and the execution order of the algorithm components is defined according to the execution weight, so that automatic scheduling among the algorithm components according to priority is realized.
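
As a rough illustration of the workflow summarized above, the following Python sketch reverses a component DAG, topologically sorts it into an execution sequence, derives an execution weight per node, and orders components by descending weight. The self weight of 1 per component, the interpretation of "in-degree weight" as the summed execution weights of a node's predecessors in the reversed graph, and the example component names are assumptions made for illustration, not details taken from the patent.

```python
# Minimal sketch of the scheduling idea described in the abstract.
# Assumed (not from the patent text): self weight = 1 per component;
# "in-degree weight" = sum of execution weights of a node's
# predecessors in the reversed DAG.
from collections import defaultdict, deque

def reverse_dag(edges):
    """Reverse every edge: u -> v becomes v -> u."""
    return [(v, u) for (u, v) in edges]

def topological_order(nodes, edges):
    """Kahn's algorithm: repeatedly pop nodes whose in-degree is 0."""
    indeg = {n: 0 for n in nodes}
    succ = defaultdict(list)
    for u, v in edges:
        succ[u].append(v)
        indeg[v] += 1
    queue = deque(n for n in nodes if indeg[n] == 0)
    order = []
    while queue:
        u = queue.popleft()
        order.append(u)
        for v in succ[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)
    return order

def priority_schedule(nodes, edges, self_weight=None):
    """Return components sorted by execution weight, highest first."""
    self_weight = self_weight or {n: 1 for n in nodes}
    rev = reverse_dag(edges)
    order = topological_order(nodes, rev)   # execution sequence
    pred = defaultdict(list)                # predecessors in the reversed DAG
    for u, v in rev:
        pred[v].append(u)
    weight = {}
    for n in order:                         # traverse the execution sequence
        indeg_weight = sum(weight[p] for p in pred[n])
        weight[n] = self_weight[n] + indeg_weight
    return sorted(nodes, key=lambda n: weight[n], reverse=True)

if __name__ == "__main__":
    # Hypothetical component flow for a federated modeling job.
    nodes = ["read", "preprocess", "stats", "features", "train", "evaluate"]
    edges = [("read", "preprocess"), ("preprocess", "stats"),
             ("preprocess", "features"), ("features", "train"),
             ("train", "evaluate")]
    print(priority_schedule(nodes, edges))
```

With these assumptions, upstream components accumulate the weights of everything that depends on them, so the descending sort naturally places "read" and "preprocess" ahead of "train" and "evaluate", which matches the intent of scheduling components by priority.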

Description

Technical Field

[0001] The present invention relates to the field of computer technology, and in particular to a method, device, and storage medium for implementing priority scheduling of federated learning algorithm components.

Background

[0002] Today, as data grows day by day and becomes ever more closely interconnected, much of it cannot be exchanged due to user privacy, laws and regulations, and other concerns, resulting in many "data silos". The concept of federated learning (Federated Learning) was proposed by Google in 2017 to solve the problem of joint modeling across devices, and it provides a feasible solution to the above problems. During modeling, modelers usually need to go through processes such as data reading and writing, data preprocessing, statistical analysis, feature engineering, machine learning, prediction, and evaluation. On a federated modeling platform, these operations are...

Claims


Application Information

IPC(8): G06F9/48, G06F9/445, G06N20/00
CPC: G06F9/4881, G06F9/44526, G06N20/00
Inventors: 朱振超, 宋鎏屹, 裴阳
Owner: 蓝象智联(杭州)科技有限公司