
Federated learning model training method for large-scale industrial chain privacy computation

A learning-model and industrial-chain technology, applied in computing models, computing, machine learning, etc. It addresses problems such as reduced accuracy of the federated learning model and a degraded training effect, with the effects of improved service quality and operational efficiency, high accuracy, and a reduced degree of weight dispersion.

Pending Publication Date: 2022-03-11
BEIJING UNIV OF POSTS & TELECOMM +1
Cites: 0 · Cited by: 5
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

However, when existing technology applies federated learning to an industrial chain network system, some low-efficiency worker learning nodes degrade the training of the entire federated learning model and reduce its accuracy.



Examples


Embodiment 1

[0085] In this embodiment, for the threat model, the present invention provides an industrial chain-oriented multi-party collaborative federated learning model, as shown in Figure 2. The purpose of the multi-party collaborative computing committee is to help task issuers train a federated learning model through federated learning; each federated learning worker sends its locally trained model to the data requester. On the one hand, to achieve fair and efficient federated learning, the present invention provides an incentive mechanism that distributes rewards based on each worker's data sharing rate and computing resource sharing rate, so as to maximize the profit of model users and optimize the performance of federated learning under specific resource constraints. On the other hand, to achieve reliable and robust federated learning, during the aggregation step the data proxy node eliminates any model with a large difference between its class distribu...

Embodiment 2

[0087] In this embodiment, the present invention also provides a federated learning communication model and computing model. Consider a group of devices with federated computing capabilities. The computing power (that is, the CPU cycle frequency) of worker n is denoted f_n, the number of CPU cycles required to train the local model on one data sample is c_n, and q_n denotes the sample data size, so the computation time of one iteration of worker n is given by the following formula:

[0088] T_n^cmp = (c_n · q_n) / f_n, with corresponding computation energy E_n^cmp = ζ_n · c_n · q_n · f_n²

[0089] where ζ_n is the effective capacitance parameter of worker n's chipset. The transfer rate of the federated learning parameters is expressed as r_n = B · log₂(1 + ρ_n · h_n / N₀), where B is the transmission bandwidth, ρ_n is the transmission power of worker n, h_n is the channel gain of the point-to-point link between the federated learning worker node and the federated learning center server node, and N₀ is the noise power. ε_n denotes the quality of the local model trained by worker n, which mainly depends on the contri...
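The computation and communication model above can be sketched in a few lines. This is a minimal illustration, not the patent's implementation; the energy formula follows the standard effective-capacitance model (ζ_n · c_n · q_n · f_n²) implied by the mention of the chipset capacitance parameter, and all numeric values in the usage are made up.

```python
import math

def iteration_time(c_n: float, q_n: float, f_n: float) -> float:
    """Computation time of one local iteration for worker n:
    c_n CPU cycles per sample, q_n samples, f_n CPU frequency (Hz)."""
    return c_n * q_n / f_n

def iteration_energy(zeta_n: float, c_n: float, q_n: float, f_n: float) -> float:
    """Energy of one iteration under the assumed effective-capacitance
    model: zeta_n * c_n * q_n * f_n^2."""
    return zeta_n * c_n * q_n * f_n ** 2

def transfer_rate(B: float, rho_n: float, h_n: float, N0: float) -> float:
    """Shannon-capacity transfer rate of the model parameters:
    B * log2(1 + rho_n * h_n / N0)."""
    return B * math.log2(1 + rho_n * h_n / N0)
```

For example, a worker needing 20 cycles per sample over 1000 samples on a 1 GHz CPU finishes one iteration in 2e-5 seconds; a link with B = 1, ρ_n · h_n / N₀ = 3 transfers at rate log₂(4) = 2.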

Embodiment 3

[0099] In this embodiment, to perform reliable federated learning tasks and promote data sharing transactions, task publishers need to reward nodes that perform well in training tasks, so that those nodes are more active when future tasks arrive, contribute more data, and train models of good quality. In the model transaction of federated learning collaborative computing, two stakeholders need to be considered: the profit of the task issuer (also the model user) and the utility of the federated learning workers (the model providers).

[0100] The task issuer needs to design a personalized contract for each data type of worker, and each task issuer pays the contributing federated learning workers to meet its own needs. Denote the contract signed by the task issuer and worker n as {R_n(q_n), q_n}, where R_n is the reward factor for the worker, comprising a series of reward packages for the worker's computing resource supply rate and data supply rate. It...
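A minimal sketch of such a contract {R_n(q_n), q_n} and the resulting worker utility. The linear reward and cost structure below is an assumption for illustration only (the patent's reward packages are truncated in this excerpt); `Contract` and `worker_utility` are hypothetical names.

```python
from dataclasses import dataclass

@dataclass
class Contract:
    """Hypothetical contract {R_n(q_n), q_n}: the reward a task issuer
    pays worker n for supplying q_n samples and compute share s_n."""
    reward_per_sample: float     # assumed linear data-supply component
    reward_per_cpu_share: float  # assumed linear compute-supply component

    def reward(self, q_n: float, s_n: float) -> float:
        return self.reward_per_sample * q_n + self.reward_per_cpu_share * s_n

def worker_utility(contract: Contract, q_n: float, s_n: float,
                   cost_per_sample: float, cost_per_cpu_share: float) -> float:
    """Worker utility = reward received minus (assumed linear) training cost."""
    return contract.reward(q_n, s_n) - (
        cost_per_sample * q_n + cost_per_cpu_share * s_n)
```

Under this sketch a worker accepts the contract only when its utility is non-negative, which is the individual-rationality constraint a contract-based incentive mechanism must satisfy.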


Abstract

The invention provides a federated learning model training method for large-scale industrial chain privacy computation. The method comprises the following steps: a federated learning center server node partitions industrial chain business training data sets that follow a non-independent, non-identically-distributed (non-IID) distribution and distributes them to a plurality of federated learning worker nodes; each federated learning worker node iteratively trains its local model based on a target profit function; after aggregating the updated local model weights sent by each federated learning worker node, the aggregation server node calculates an earth mover's distance for each federated learning worker node from that node's local model weight distribution and the overall local model weight distribution, and removes federated learning worker nodes exceeding a preset distance threshold; the remaining federated learning worker nodes then continue model training. By eliminating data distributions whose divergence is too large during federated learning model training, the method reduces the precision loss caused by heterogeneous data and improves the reliability of applying the traditional algorithm in an industrial chain.
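The filtering step described in the abstract can be sketched as follows: compute a 1-D earth mover's distance between each worker's weight distribution (here represented as a histogram over shared bins) and the overall distribution, then drop workers above a threshold. The histogram representation and the mean-as-overall-distribution choice are illustrative assumptions, not the patent's exact construction.

```python
import numpy as np

def emd_1d(p: np.ndarray, q: np.ndarray) -> float:
    """Earth mover's distance between two 1-D histograms over identical,
    unit-spaced bins: the L1 distance between their normalized CDFs."""
    p = p / p.sum()
    q = q / q.sum()
    return float(np.abs(np.cumsum(p) - np.cumsum(q)).sum())

def filter_workers(weight_hists: dict, threshold: float) -> list:
    """Keep workers whose weight-histogram EMD to the overall (mean)
    histogram stays within `threshold` (illustrative criterion)."""
    overall = np.mean(list(weight_hists.values()), axis=0)
    return [w for w, h in weight_hists.items() if emd_1d(h, overall) <= threshold]
```

With two uniform workers and one sharply skewed worker, the skewed one sits roughly twice as far (in EMD) from the overall distribution and is the one removed, which mirrors the abstract's removal of overly divergent distributions.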

Description

Technical Field

[0001] The invention relates to the technical field of federated learning, and in particular to a federated learning model training method for large-scale industrial chain privacy computation.

Background

[0002] The industrial chain is a concept that includes four dimensions: the value chain, the enterprise chain, the supply-and-demand chain, and the space chain; the industrial chain is formed in the process of balancing these four dimensions against each other. An industrial chain contains a large number of upstream-downstream relationships and exchanges of mutual value: an upstream link transmits products or services to a downstream link, and the downstream link feeds information back upstream to adjust strategy. The information interaction requirements in the industrial chain form a large-scale network topology, which faces severe privacy barriers and security challenges, and also means that the distributed privacy computing tasks of the indu...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62; G06N20/00
CPC: G06N20/00; G06F18/214
Inventors: 郭少勇, 陈浩, 黄建平, 颜拥, 陈洁蔚, 黄徐川, 韩嘉佳, 孙歆, 姚影, 杨国铭
Owner BEIJING UNIV OF POSTS & TELECOMM