
Federated learning method, storage medium, terminal, server and federated learning system

A federated learning method and storage-medium technology, applied in the field of ensemble learning. It addresses problems such as existing solutions being unable to effectively balance terminal power consumption against overall training time, and being unable to analyze data on mobile terminals in a timely and extensive manner, which defeats the original purpose of on-device data analysis. The effects include reducing the overall training time, reducing per-round training time, and improving training efficiency.

Pending Publication Date: 2021-05-28
SHENZHEN INST OF ADVANCED TECH
View PDF | Cites: 0 | Cited by: 6

AI Technical Summary

Problems solved by technology

However, model training is usually a computationally intensive task that typically incurs significant power consumption on mobile devices. To address this, Google's existing solution is to train the model only while the mobile device is charging and idle (no user interaction). This, however, seriously undermines the original purpose of using mobile devices for data analysis: mobile data cannot be analyzed in a timely and extensive manner. In short, existing solutions cannot effectively balance terminal power consumption against overall training time.



Examples

Experimental program
Comparison scheme
Effect test

Embodiment 1

[0066] As shown in Figure 1, the federated learning system according to Embodiment 1 of the present invention includes a server 10 and several terminals 20, where the server 10 and the terminals 20 can communicate with each other. The basic workflow of the federated learning system is as follows: each terminal 20 sends its own training information to the server 10; the server 10 derives a constraint time from the training information of all terminals 20; each terminal 20 then derives its training parameters from the constraint time and completes model training according to those parameters. The constraint time is meant to ensure that each terminal completes model training within it, and the training parameters are chosen so that the terminal consumes the minimum energy while completing the training. In this way, the federated learning s...
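The workflow above can be sketched as a small, self-contained simulation. All class names, method names, and numbers below are illustrative assumptions for this page, not an API published by the patent; the constraint-time policy (bound the round by the slowest terminal) is one plausible reading of the text.

```python
class Terminal:
    """Illustrative terminal model: reports hardware info and dataset size."""
    def __init__(self, cpu_freq_hz, cycles_per_sample, num_samples):
        self.cpu_freq_hz = cpu_freq_hz          # maximum CPU frequency
        self.cycles_per_sample = cycles_per_sample
        self.num_samples = num_samples          # local training-data count

    def training_info(self):
        # The training information sent to the server (Embodiment 2).
        return (self.cpu_freq_hz, self.cycles_per_sample, self.num_samples)

    def choose_freq(self, deadline_s):
        # The lowest frequency that still meets the deadline minimizes
        # energy under the usual CMOS model (power grows superlinearly
        # with frequency), matching the "minimum energy" goal above.
        needed = self.num_samples * self.cycles_per_sample / deadline_s
        return min(needed, self.cpu_freq_hz)

def constraint_time(infos):
    # Assumed policy: the round is bounded by the slowest terminal
    # running at its reported maximum frequency.
    return max(n * c / f for (f, c, n) in infos)

terminals = [Terminal(2.0e9, 1.0e6, 500), Terminal(1.5e9, 1.0e6, 200)]
deadline = constraint_time(t.training_info() for t in terminals)
freqs = [t.choose_freq(deadline) for t in terminals]
# The busiest terminal stays at 2 GHz; the other downclocks to 0.8 GHz,
# finishing just in time while spending less energy.
print(deadline, freqs)
```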

Embodiment 2

[0069] As shown in Figure 2, the federated learning method of Embodiment 2 includes the following steps:

[0070] Step S10: the server 10 receives the training information sent from the terminal.

[0071] Before each round of training starts, the server 10 collects the training information of each terminal 20 in advance. The training information includes hardware information and the preset amount of training data. The hardware information mainly refers to processor information of the terminal, such as the CPU frequency and the number of CPU cycles; the preset amount of training data refers to the number of training samples held by the terminal. For example, for an image recognition task, it is the number of pictures stored on the terminal.
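The training-information message described above might look like the following; the field names are assumptions (the patent only lists the contents), and the numbers are made up for illustration.

```python
# Illustrative shape of one terminal's training-information message.
training_info = {
    "cpu_freq_hz": 1.8e9,        # processor frequency (hardware info)
    "cycles_per_sample": 1.2e6,  # CPU cycles to train on one sample
    "num_samples": 350,          # e.g. number of pictures for image recognition
}

# From these fields the server can estimate the terminal's per-round
# training time: samples x cycles-per-sample / frequency.
est_time_s = (training_info["num_samples"]
              * training_info["cycles_per_sample"]
              / training_info["cpu_freq_hz"])
print(round(est_time_s, 3))  # ~0.233 s for this made-up terminal
```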

[0072] Step S20: the server 10 acquires a constraint time according to the training information, and the constraint time is used to enable the term...

Embodiment 3

[0097] Embodiment 3 discloses a computer-readable storage medium, where a federated learning program is stored in the computer-readable storage medium, and when the federated learning program is executed by a processor, the federated learning method as described in Embodiment 2 is implemented.



Abstract

The invention discloses a federated learning method, a storage medium, a server, a terminal, and a federated learning system. The federated learning method comprises the following steps: the server obtains a constraint time according to training information of a terminal, where the constraint time is used to enable the terminal to complete model training within that time; the terminal obtains training parameters according to the constraint time; and the terminal completes model training according to the training parameters, where the training parameters are used to make the terminal consume the minimum energy when completing model training. With this federated learning method, on one hand, the completion time of each training round is controlled according to the real-time state of each terminal, shortening the overall training time; on the other hand, each terminal calculates the training speed with the lowest energy consumption from the completion time predicted by the server, reducing the power consumption of each training round.
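The "lowest energy consumption" claim can be checked numerically under a standard (assumed, not stated in the patent) CMOS power model: dynamic power P ≈ k·f³, so executing C cycles at frequency f takes C/f seconds and costs E = P·(C/f) = k·C·f². Energy therefore falls as frequency falls, and the slowest frequency that still meets the server's deadline is optimal. All constants below are illustrative.

```python
k = 1e-27        # illustrative power coefficient (W / Hz^3), assumed
C = 2.0e8        # CPU cycles needed for one training round, assumed
deadline = 0.25  # seconds, the constraint time from the server

def energy(f_hz):
    # E = P * t = (k * f^3) * (C / f) = k * C * f^2
    return k * C * f_hz * f_hz

f_min = C / deadline  # slowest frequency that still meets the deadline
f_max = 1.5e9         # the terminal's maximum frequency, assumed

# Running at the minimum feasible frequency uses strictly less energy
# than racing at full speed, which is the effect the abstract claims.
print(energy(f_min) < energy(f_max))
```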

Description

technical field [0001] The invention belongs to the technical field of machine learning, and specifically relates to a federated learning method, a storage medium, a terminal, a server, and a federated learning system. Background technique [0002] In recent years, mobile devices have developed rapidly: not only has their computing power improved quickly, but they are also equipped with various sensors that collect information from different dimensions and record people's lives. Making effective use of these data for machine learning model training would greatly advance the intelligent development of mobile devices and also greatly improve the user experience. In the traditional training method, the user's initial data on the mobile device are first collected to the server, where the relevant deep learning model is trained. When an application needs inference, it uploads the initial data to the server, the inference process is performed on the server, and then the server sends t...

Claims


Application Information

IPC(8): G06N20/20
CPC: G06N20/20
Inventor: 栗力, 须成忠
Owner: SHENZHEN INST OF ADVANCED TECH