
Training method and device for a vertical federated learning model, and computer equipment

A technology relating to learning models and computer equipment, applied in the fields of computer science, big data, and deep learning. It addresses problems such as attacks on participant nodes degrading the performance of the global model and existing identification methods failing in federated settings, with the effect of improving resistance to attacks and improving security.

Pending Publication Date: 2022-03-25
BEIJING BAIDU NETCOM SCI & TECH CO LTD

AI Technical Summary

Problems solved by technology

However, in actual application scenarios the participating nodes are likely to be attacked, which greatly degrades the performance of the global model.
Moreover, the common approach of reducing the dimensionality of the updated intermediate data to identify malicious users is suited to non-federated learning frameworks and does not work well for identifying malicious users in a federated learning setting.

Method used



Examples


Embodiment Construction

[0030] Exemplary embodiments of the present application are described below in conjunction with the accompanying drawings. The description includes various details of the embodiments to facilitate understanding, and these details should be regarded as exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the application. Likewise, descriptions of well-known functions and constructions are omitted below for clarity and conciseness.

[0031] It can be understood that federated learning is suitable for large-scale distributed deep learning model training; it provides privacy protection while building a joint model over local data sets. In the vertical federated learning scenario, the dataset is split vertically and owned by different parties, that is, each party owns a disjoint subset of attributes, and the goal is to collaboratively learn a machine learning model without any party transferring its raw data to another.
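To make the vertical split concrete, the following minimal sketch shows two parties holding disjoint attribute subsets of the same user population; the column names and values are purely hypothetical, and pandas is assumed only for illustration.

```python
# Hypothetical illustration of a vertically partitioned dataset:
# both parties index the same users but hold disjoint attribute subsets,
# and only party A (the "first participant") holds the labels.
import pandas as pd

party_a = pd.DataFrame({
    "user_id": [1, 2, 3],
    "age":     [34, 52, 28],
    "income":  [48_000, 91_000, 37_000],
    "label":   [0, 1, 0],            # label data held only by party A
})

party_b = pd.DataFrame({
    "user_id":       [1, 2, 3],
    "clicks_30d":    [12, 3, 41],
    "purchases_30d": [1, 0, 5],
})

# The attribute sets are disjoint apart from the join key; raw rows never
# cross the party boundary -- only intermediate model outputs are exchanged.
assert set(party_a.columns) & set(party_b.columns) == {"user_id"}
```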



Abstract

The invention discloses a training method and device for a vertical federated learning model, relating to the field of big data and to computer technologies such as deep learning. In the specific implementation scheme, the training method is applied at a first participant device, which holds the label data. The vertical federated learning model comprises a first bottom-layer sub-model in the first participant device, an interaction-layer sub-model, a top-layer sub-model based on a Lipschitz neural network, and a second bottom-layer sub-model in the second participant device. First bottom-layer output data of the first participant device and second bottom-layer output data sent by the second participant device are obtained and input to the interaction-layer sub-model to obtain interaction-layer output data; top-layer output data is obtained from the interaction-layer output data and the top-layer sub-model and is sent to the interaction-layer sub-model. The vertical federated learning model is then trained according to the top-layer output data and the label data. The invention improves the anti-attack capability of the federated learning system.
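As a rough illustration only, not the patented implementation, the forward pass described above might look like the PyTorch sketch below. The module names, layer sizes, and the use of spectral normalization as the Lipschitz constraint on the top sub-model are all assumptions, since the abstract does not say how the Lipschitz property is enforced.

```python
# Minimal sketch of the forward pass described in the abstract, assuming PyTorch.
# Names, dimensions, and spectral normalization as the Lipschitz mechanism are
# illustrative assumptions, not details taken from the patent.
import torch
import torch.nn as nn
from torch.nn.utils import spectral_norm

bottom_a = nn.Sequential(nn.Linear(8, 16), nn.ReLU())   # first participant's bottom-layer sub-model
bottom_b = nn.Sequential(nn.Linear(4, 16), nn.ReLU())   # second participant's bottom-layer sub-model
interaction = nn.Linear(32, 16)                         # interaction-layer sub-model over both bottom outputs
# Top sub-model with spectral-normalized layers as one possible Lipschitz-constrained network.
top = nn.Sequential(
    spectral_norm(nn.Linear(16, 16)), nn.ReLU(),
    spectral_norm(nn.Linear(16, 1)),
)

x_a = torch.randn(32, 8)                         # first participant's local features
x_b = torch.randn(32, 4)                         # second participant's local features (only its bottom output is shared)
labels = torch.randint(0, 2, (32, 1)).float()    # label data held by the first participant

# Forward pass: bottom-layer outputs -> interaction layer -> Lipschitz top model.
z = interaction(torch.cat([bottom_a(x_a), bottom_b(x_b)], dim=1))
top_out = top(z)

# The first participant computes the loss against its labels and backpropagates,
# i.e. trains the model according to the top-layer output data and the label data.
loss = nn.functional.binary_cross_entropy_with_logits(top_out, labels)
loss.backward()
```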

Description

Technical field
[0001] The present application relates to the field of computer technology, especially to big data and deep learning, and in particular to a training method, device, computer equipment, and storage medium for a vertical federated learning model.
Background technique
[0002] Federated learning is suitable for large-scale distributed deep learning model training; it provides privacy protection while building a joint model over local data sets. In the vertical federated learning scenario, the dataset is split vertically and owned by different parties, that is, each party owns a disjoint subset of attributes, and the goal is to collaboratively learn a machine learning model without any party transferring its raw data to another.
[0003] In related technologies, most of the security and privacy protection schemes in the vertical federated learning scenario focus on solving the privacy leakage prob...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06N3/08; G06N3/04; G06F21/60; G06F21/62
CPC: G06N3/084; G06F21/602; G06F21/6245; G06N3/044; G06N3/045; G06N3/098; G06N3/09; G06N3/048; G06N3/04
Inventor: 刘吉余孙婕周吉文周瑞璞窦德景
Owner: BEIJING BAIDU NETCOM SCI & TECH CO LTD