Federated learning method, storage medium, terminal, server and federated learning system

A federated learning method and storage medium technology, applied in ensemble learning and related directions, which can solve problems such as the inability to effectively balance terminal power consumption and overall training time and the inability to analyze data that remains on mobile terminals, with the effects of reducing the overall training time, reducing the per-round training time, and improving training efficiency.

Pending Publication Date: 2021-05-28
SHENZHEN INST OF ADVANCED TECH

AI Technical Summary

Problems solved by technology

However, model training is usually a computationally intensive task, which typically incurs significant power consumption on mobile devices. To address this problem, Google's existing solution is to train the model only while the mobile device is charging and not in active use.

Examples

Example Embodiment

[0065] Example One

[0066] As shown in Figure 1, the federated learning system of the present invention includes a server 10 and a plurality of terminals 20, and the server 10 can communicate with the terminals 20. The basic workflow of the federated learning system is as follows: each terminal 20 transmits its training information to the server 10; the server 10 obtains the constraint time according to the training information of each terminal 20; each terminal 20 obtains its training parameters according to the constraint time and completes model training according to those training parameters. The constraint time is used to make the terminal complete model training within the constraint time, and the training parameters are used to make the terminal consume the minimum energy while completing model training. This federated learning system thereby reduces both the overall training time and the energy consumption.
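The following is a minimal sketch, in Python, of one round of the workflow just described. The object and method names (collect_training_info, compute_constraint_time, compute_training_parameters, train_local_model, aggregate) are hypothetical placeholders that only mirror the message flow of the system; they are not the patent's concrete algorithms.

```python
# Hypothetical sketch of one training round of the federated learning system:
# terminals report training information, the server returns a constraint time,
# and each terminal trains within that time using energy-minimizing parameters.
def federated_round(server, terminals):
    # 1. Each terminal 20 transmits its training information to the server 10.
    infos = {t.id: t.collect_training_info() for t in terminals}
    # 2. The server 10 obtains the constraint time from all terminals' information.
    t_constraint = server.compute_constraint_time(infos)
    # 3. Each terminal 20 obtains training parameters from the constraint time
    #    (chosen so that model training finishes within it at minimum energy)
    #    and completes local model training with those parameters.
    updates = [t.train_local_model(t.compute_training_parameters(t_constraint))
               for t in terminals]
    # 4. The server aggregates the local updates into the global model.
    return server.aggregate(updates)
```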

[0067] The federated learning system of the present embodiment is described in detail below from the server 10 a...

Example Embodiment

[0068] Example Two

[0069] As shown in Figure 2, the federated learning method of the second embodiment includes the following steps:

[0070] Step S10: The server 10 receives the training information sent from the terminals 20.

[0071] Before training starts, the server 10 needs to collect the training information of each terminal 20 in advance. The training information includes hardware information and a preset training data amount. The hardware information mainly refers to processor information of the terminal, such as CPU frequency and CPU cycles, and the preset training data amount refers to the number of training samples each terminal holds; for example, in an image recognition task, the training data amount is the number of pictures stored on the terminal.
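As a concrete illustration of the training information collected in step S10, the sketch below defines a simple record holding the hardware information and the preset training data amount. The field names are assumptions made for illustration; the patent only names CPU frequency, CPU cycles, and the number of training samples.

```python
from dataclasses import dataclass

@dataclass
class TrainingInfo:
    # Hardware information (processor information of the terminal).
    terminal_id: str
    cpu_frequency_hz: float       # CPU frequency reported by the terminal
    cpu_cycles_per_sample: float  # estimated CPU cycles needed to train on one sample
    # Preset training data amount: the number of local training samples,
    # e.g. how many pictures the terminal holds for an image recognition task.
    num_training_samples: int

# Example report sent from a terminal 20 to the server 10 before training starts.
info = TrainingInfo("terminal-20-a", 2.0e9, 5.0e7, 1200)
```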

[0072] Step S20: The server 10 obtains the constraint time according to the training information; the constraint time is used to make each terminal complete model training within it.

[0073] Specifically, this step S2...
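Since the detailed computation of step S20 is truncated above, the following is only a hedged sketch of one plausible way a server could derive the constraint time from the collected training information, reusing the TrainingInfo record sketched under step S10. The per-terminal time estimate (samples × cycles per sample / CPU frequency) and the choice of the slowest predicted terminal are illustrative assumptions, not the patent's stated rule.

```python
from typing import Iterable

def predicted_training_time(info: "TrainingInfo") -> float:
    """Predicted seconds for one terminal to finish a local training round
    (assumed model: samples * cycles-per-sample / CPU frequency)."""
    return info.num_training_samples * info.cpu_cycles_per_sample / info.cpu_frequency_hz

def constraint_time(infos: Iterable["TrainingInfo"]) -> float:
    """Constraint time sent to the terminals: here, the slowest predicted terminal."""
    return max(predicted_training_time(i) for i in infos)
```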

Example Embodiment

[0096] Example Three

[0097] The third embodiment discloses a computer-readable storage medium that stores a federated learning program; when the federated learning program is executed, it implements the federated learning method described in Example Two.

Abstract

The invention discloses a federated learning method, a storage medium, a server, a terminal and a federated learning system. The federated learning method comprises the following steps: the server obtains a constraint time according to training information of the terminal, wherein the constraint time is used for enabling the terminal to complete model training within the constraint time; the terminal obtains training parameters according to the constraint time; and the terminal completes model training according to the training parameters, the training parameters being used for enabling the terminal to consume the minimum energy when completing model training. According to the federated learning method, on one hand, the completion time of each round of training is controlled according to the real-time state of each terminal, which shortens the overall training time; on the other hand, the terminal calculates the training speed with the lowest energy consumption according to the training completion time predicted by the server, which reduces the power consumption of each round of training.
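To make the terminal-side step of the abstract more concrete, the sketch below picks the lowest CPU frequency (training speed) that still finishes local training within the constraint time predicted by the server. The discrete frequency levels and the assumption that dynamic energy per cycle grows roughly with the square of the frequency (a common DVFS model) are illustrative assumptions, not the patent's prescribed formula.

```python
def pick_training_frequency(total_cycles: float,
                            constraint_time_s: float,
                            available_freqs_hz: list[float]) -> float:
    """Return the lowest available CPU frequency that still meets the deadline.

    Under a common DVFS model, dynamic energy per cycle grows roughly with f^2,
    so the slowest frequency that finishes within the constraint time is also
    the most energy-efficient feasible choice.
    """
    feasible = [f for f in available_freqs_hz if total_cycles / f <= constraint_time_s]
    if not feasible:
        # The deadline cannot be met on this terminal; run at the highest speed.
        return max(available_freqs_hz)
    return min(feasible)

# Example: 6e10 cycles of local training, a 40 s constraint time, three levels.
freq = pick_training_frequency(6.0e10, 40.0, [1.0e9, 1.6e9, 2.0e9])  # -> 1.6e9 Hz
```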

Description

Technical field

[0001] The invention belongs to the technical field of machine learning, and specifically relates to a federated learning method, a storage medium, a terminal, a server, and a federated learning system.

Background technique

[0002] In recent years, mobile devices have developed rapidly. Not only has their computing power improved quickly, but they are also equipped with various sensors that collect information from different dimensions and record people's lives. Making effective use of these data for machine learning model training would greatly advance the intelligent development of mobile devices and also greatly improve the user experience. The traditional training method first collects the users' raw data from the mobile devices onto a server and then trains the relevant deep learning model there. When the application needs inference, it uploads the raw data to the server, the inference process is performed on the server, and then the server sends t...

Application Information

IPC(8): G06N20/20
CPC: G06N20/20
Inventors: 栗力 (Li Li), 须成忠 (Cheng-Zhong Xu)
Owner: SHENZHEN INST OF ADVANCED TECH