
Federated learning data poisoning attack defense method and device

A federated-learning and data-security technology, applied in the field of network security, that addresses problems such as malicious clients destroying the availability of shared models, and achieves high accuracy.

Active Publication Date: 2022-01-21
HARBIN INST OF TECH SHENZHEN GRADUATE SCHOOL
Cites: 3 · Cited by: 3
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0005] The main purpose of the present invention is to overcome the defects of the prior art by providing a defense method and device for federated learning data poisoning attacks, solving the security problem of malicious clients destroying the availability of shared models by tampering with their local data (data poisoning attacks).



Examples

Experimental program
Comparison scheme
Effect test

Embodiment Construction

[0055] In order to enable those skilled in the art to better understand the solutions of the present application, the technical solutions in the embodiments of the present application are described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of this application. All other embodiments obtained by those skilled in the art on the basis of the embodiments in this application without creative effort fall within the scope of protection of this application.

[0056] The defense method for federated learning data poisoning attacks in this application improves on a common federated learning process so that the improved learning process can resist data poisoning attacks from malicious clients. See Figure 1 and Figure 2; this embodiment provides a defense method for federated learning data poisoning attacks, including the fol...



Abstract

The invention discloses a defense method and device for federated learning data poisoning attacks. The method comprises the following steps: each client trains model parameters using its local data; each client uploads its local model parameters to the server, and the server receives all model parameters; the server calculates a reference basis u for comparison and, for any two local models wa and wb, calculates their similarity relative to the reference basis u; whether a local model is malicious is judged by an internal voting method; the credibility of each local model is calculated from the votes it receives; and credibility-based weighted aggregation is carried out to obtain the final global model, through which defense against data poisoning attacks is achieved. With this method, the model of a malicious client is assigned a lower weight, weakening its influence on the global model during weighted aggregation and thereby realizing defense against data poisoning attacks.
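The aggregation pipeline described in the abstract can be sketched as follows. Note the concrete choices below, namely the coordinate-wise median as the reference basis u, cosine similarity, and pairwise voting on closeness to u, are illustrative assumptions and not details disclosed in this excerpt:

```python
# Hedged sketch of credibility-weighted aggregation, assuming each local
# model is a flat NumPy parameter vector.
import numpy as np

def cosine(a, b):
    """Cosine similarity between two parameter vectors."""
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

def defend_aggregate(local_models):
    """Judge models by internal voting, weight them by credibility, aggregate."""
    W = np.stack(local_models)                  # shape: (n_clients, n_params)
    u = np.median(W, axis=0)                    # reference basis u (assumed choice)
    sims = np.array([cosine(w, u) for w in W])  # similarity of each model to u
    # Internal voting: for every pair (wa, wb), the model whose similarity
    # to u is higher receives one vote.
    n = len(W)
    votes = np.zeros(n)
    for a in range(n):
        for b in range(a + 1, n):
            winner = a if sims[a] >= sims[b] else b
            votes[winner] += 1
    cred = votes / votes.sum()                  # credibility from votes received
    return cred @ W                             # credibility-weighted global model

# Four honest clients near the true model, one poisoned client far away:
honest = [np.array([1.0, 1.0]) + 0.01 * i for i in range(4)]
poisoned = np.array([-5.0, -5.0])
global_model = defend_aggregate(honest + [poisoned])
# The poisoned model is least similar to u, collects no votes, receives zero
# credibility, and so barely perturbs the global model.
```

With this weighting, a malicious update that points away from the consensus direction loses every pairwise vote, which is exactly the "lower weight for the malicious client's model" effect the abstract claims.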

Description

Technical field

[0001] The invention belongs to the technical field of network security, and in particular relates to a defense method and device for federated learning data poisoning attacks.

Background technique

[0002] The development and application of machine learning is based on the collection and analysis of big data. However, data from a single source often suffers from small data volume and few features, and is therefore insufficient to train a well-performing model. Federated learning, proposed by Google, is a special distributed machine-learning framework. The framework sends the training model to different clients; each client uses its local data to train a local model and then uploads the trained local model to the central server, and the central server aggregates the models to generate the final global model. This training method allows joint training of a shared model on multi-source data while keeping the data local to each client to ensure...
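The federated training loop described in the background, where clients fit the current global model locally and a central server aggregates the results, can be sketched in the FedAvg style. All names here are illustrative, and a single least-squares gradient step stands in for arbitrary local training:

```python
# Minimal sketch of federated averaging over plain NumPy parameter vectors.
import numpy as np

def local_train(global_params, local_data, lr=0.1, epochs=1):
    """Client side: start from the global model, fit local data
    (one least-squares gradient step per epoch as a stand-in)."""
    w = global_params.copy()
    X, y = local_data
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def fedavg(local_models, client_sizes):
    """Server side: aggregate local models, weighted by local data volume."""
    weights = np.asarray(client_sizes, dtype=float)
    weights /= weights.sum()
    return sum(w_i * m for w_i, m in zip(weights, local_models))

# Three clients, each with a private shard of data from the same true model:
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ true_w))

global_w = np.zeros(2)
for _ in range(100):  # communication rounds
    local_models = [local_train(global_w, c) for c in clients]
    global_w = fedavg(local_models, [len(c[1]) for c in clients])
```

The raw data never leaves a client; only the trained parameter vectors travel to the server, which is the privacy property the background paragraph highlights. The poisoning threat arises precisely because the server must trust these uploaded vectors.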

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04L9/40; G06N20/00
CPC: H04L63/1466; H04L63/1416; G06N20/00
Inventor: 刘洋田宇琛张伟哲王轩漆舒汉夏文唐琳琳张加佳吴宇琳
Owner: HARBIN INST OF TECH SHENZHEN GRADUATE SCHOOL