
Federated model training method and device, electronic equipment and storage medium

A model training and federated learning technology, applied in multi-programming devices, program-control design, electrical digital data processing, and related fields. It addresses the problem of a federated model failing to converge, with the effects of avoiding convergence failure, improving training performance, and improving network resource utilization.

Active Publication Date: 2021-06-22
SUZHOU INST FOR ADVANCED STUDY USTC

AI Technical Summary

Benefits of technology

This patented technology allows multiple edge working nodes (also called clients) to train models locally and send them to a centralized entity without sharing raw data, additional computing resources, or slowing down processing. It also uses a preset algorithm to decide how many local models participate in each update, rather than aggregating all models at once in every round. Through these techniques, the collaborative system improves efficiency while still maintaining good accuracy on complex tasks such as image recognition.

Problems solved by technology

This patent describes how collaborative systems that use distributed processing techniques such as federated learning (FL) may be vulnerable because they lack the ability to prevent unauthorized access during communication sessions between client devices and service providers. To address this issue, researchers have proposed solutions such as secure multi-party computation protocols, in which encrypted portions of shared secret keys (called shares) are held on both the edge devices and the workers' computers. However, these methods require significant time and resources to perform the complex calculations needed when updating large datasets across multiple machines at once.



Examples


Embodiment 1

[0029] Figure 1 is a flowchart of a federated model training method provided in Embodiment 1 of the present invention. This embodiment is applicable to federated model training in an edge computing network, and the method can be executed by the federated model training device provided in the embodiments of the present invention. The device may be implemented in software and/or hardware and is typically integrated into a server in the federated model training system.

[0030] Referring further to Figure 2, Figure 2 provides a logical architecture diagram of a federated model training system according to an embodiment of the present invention. The federated model training system provided in this embodiment includes at least one parameter server and multiple edge devices (i.e., clients, also called edge working nodes). The parameter server communicates with the clients through the wireless network to transmit model parameters; the clients are used to tr...
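The interaction described above (the server distributes the global model, clients train locally, and the server aggregates the returned models) can be sketched as a toy federated averaging loop. The one-parameter least-squares client update, the synthetic client data, and all names below are illustrative assumptions, not the patent's actual algorithm:

```python
# Minimal sketch of the server/edge-node interaction: each round, the
# server sends the global model to every client, each client takes one
# local gradient step on its own data, and the server averages the
# returned local models. Model = a single weight w for y ≈ w * x.

def local_train(global_w, data, lr=0.1):
    # Hypothetical client-side update: one gradient step of least squares.
    grad = sum(2 * (global_w * x - y) * x for x, y in data) / len(data)
    return global_w - lr * grad

def federated_round(global_w, clients):
    # Server-side aggregation: collect local models, average them.
    local_models = [local_train(global_w, d) for d in clients]
    return sum(local_models) / len(local_models)

# Each client holds a few samples drawn near the line y = 2x.
clients = [[(1.0, 2.0), (2.0, 4.0)],
           [(1.0, 2.1), (3.0, 6.0)],
           [(2.0, 3.9)]]
w = 0.0
for _ in range(50):
    w = federated_round(w, clients)
# w converges close to the shared slope of 2.0
```

This keeps the raw samples on the clients: only the trained weight travels to the server, which matches the data-locality property of federated learning described above.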

Embodiment 2

[0098] Figure 5 is a schematic structural diagram of a federated model training device provided by an embodiment of the present invention; the device is configured in a server. The federated model training device provided in an embodiment of the present invention can execute the federated model training method provided in any embodiment of the present invention, and the device includes:

[0099] A sending module 510, configured to send the global model corresponding to the target task to each edge working node, and to send the updated global model to each edge working node for the next round of local model training;

[0100] A determination module 520, configured to determine, based on a preset algorithm, the specified number of local models participating in the global model update according to the current network resources and the number of target tasks;

[0101] The current network resources include: current network bandwidth and current computing resourc...
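The determination module's role can be sketched as a function from current network resources (bandwidth and compute) and the number of target tasks to a participant count. The concrete "preset algorithm" is not disclosed in this excerpt, so the proportional rule, capacity constants, and function names below are illustrative assumptions only:

```python
# Sketch of determination module 520: pick how many local model updates
# to wait for this round. The scaling rule below (participation limited
# by the scarcer of bandwidth and compute, divided across concurrent
# tasks) is a hypothetical stand-in for the patent's preset algorithm.

def specified_model_count(total_nodes, bandwidth_mbps, compute_units,
                          num_tasks, bandwidth_cap=100.0, compute_cap=16.0):
    # Fraction of full capacity available right now, taken as the
    # minimum over the two resource types mentioned in the text.
    resource_ratio = min(bandwidth_mbps / bandwidth_cap,
                         compute_units / compute_cap)
    # Split that capacity across the concurrent target tasks.
    k = int(total_nodes * resource_ratio / max(num_tasks, 1))
    # Always wait for at least one update, never more than all nodes.
    return max(1, min(k, total_nodes))

print(specified_model_count(10, 50.0, 16.0, num_tasks=1))  # -> 5
```

The key property, whatever the real algorithm, is that the count shrinks as bandwidth or compute tightens, so the server never blocks on more updates than the network can deliver.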

Embodiment 3

[0120] Figure 6 is a schematic structural diagram of an electronic device provided by Embodiment 3 of the present invention. Figure 6 shows a block diagram of an exemplary electronic device 12 suitable for implementing embodiments of the invention. The electronic device 12 shown in Figure 6 is only an example and should not limit the functions and scope of use of the embodiments of the present invention.

[0121] As shown in Figure 6, electronic device 12 takes the form of a general-purpose computing device. Components of electronic device 12 may include, but are not limited to, one or more processors or processing units 16, a system memory 28, and a bus 18 connecting the various system components, including the system memory 28 and the processing unit 16.

[0122] Bus 18 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus ...


Abstract

The invention discloses a federated model training method and device, electronic equipment, and a storage medium. The method is executed by a server in a federated model training system and comprises the following steps: issuing a global model corresponding to a target task to each edge working node; determining, based on a preset algorithm, a specified number of local models participating in the global model update according to the current network resources and the number of target tasks; and, when the specified number of local model updates has been received, performing federated aggregation to obtain an updated global model. According to the technical scheme provided by the embodiments of the invention, the number of local models participating in global model training is dynamically determined by combining the preset algorithm with the network resources of each iteration, effectively improving the network resource utilization of model training in the federated learning process; the failure of the federated model to converge due to limited network resources is avoided, and the training performance of the federated model is greatly improved.
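The aggregation trigger described in the abstract, performing federated aggregation as soon as the specified number of local model updates has arrived rather than waiting for every node, can be sketched as follows. The list-of-floats model representation and the function name are illustrative assumptions:

```python
# Sketch of the abstract's aggregation step: once k local model updates
# have been received, average those k to form the updated global model;
# slower nodes are simply not waited on this round.

def federated_aggregate(updates, k):
    # updates: list of local models, each a list of floats (same length).
    if len(updates) < k:
        raise ValueError("not enough local updates received yet")
    selected = updates[:k]  # the first k updates to arrive
    dim = len(selected[0])
    return [sum(u[i] for u in selected) / k for i in range(dim)]

# Four nodes eventually report, but only k=3 are needed for this round;
# the straggler's update ([9.0, 9.0]) does not delay or enter the average.
updates = [[1.0, 2.0], [3.0, 2.0], [2.0, 2.0], [9.0, 9.0]]
print(federated_aggregate(updates, k=3))  # -> [2.0, 2.0]
```

Waiting for only k of the nodes is what ties convergence to available network resources: when bandwidth is scarce, a smaller k keeps rounds completing instead of stalling on updates that never arrive.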


