
1,590 results about "Communication with homomorphic encryption" patented technology

A combined deep learning training method based on a privacy protection technology

The invention belongs to the technical field of artificial intelligence and relates to a combined deep learning training method based on privacy-protection technology. Each participant first trains a local model on its private data set to obtain a local gradient, perturbs the gradient with Laplace noise, encrypts it, and sends the ciphertext to a cloud server. The cloud server aggregates all received local gradients together with the ciphertext parameters of the previous round and broadcasts the resulting ciphertext parameters. Finally, each participant decrypts the received ciphertext parameters and updates its local model for the next round of training. By combining a homomorphic encryption scheme with differential privacy, the method provides secure and efficient deep learning training: it preserves the accuracy of the trained model while preventing the server, as well as internal attackers, from inferring model parameters or private training data.
Owner:UNIV OF ELECTRONICS SCI & TECH OF CHINA
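As a rough illustration of one training round, the sketch below combines Laplace perturbation with additively homomorphic aggregation. It assumes the python-paillier (`phe`) package as a stand-in for the patent's encryption scheme; the function names, noise scale, and gradient sizes are illustrative rather than taken from the patent.

```python
# Minimal sketch of one round: each participant perturbs its local gradient
# with Laplace noise (differential privacy) and encrypts it with an additively
# homomorphic scheme; the server aggregates ciphertexts without seeing any
# plaintext gradient. python-paillier (`phe`) stands in for the patent's scheme.
import numpy as np
from phe import paillier

# In this sketch the keypair is shared by the participants; the server only
# ever uses the public side.
public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

def participant_update(local_gradient, epsilon=1.0, sensitivity=1.0):
    """Perturb a local gradient with Laplace noise, then encrypt each entry."""
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon,
                              size=local_gradient.shape)
    noisy = local_gradient + noise
    return [public_key.encrypt(float(v)) for v in noisy]

def server_aggregate(encrypted_gradients):
    """Sum the encrypted gradients entry-wise without decrypting them."""
    total = encrypted_gradients[0]
    for enc_grad in encrypted_gradients[1:]:
        total = [a + b for a, b in zip(total, enc_grad)]
    return total

# Three participants, each with a 4-dimensional local gradient.
gradients = [np.random.randn(4) for _ in range(3)]
ciphertexts = [participant_update(g) for g in gradients]
aggregated = server_aggregate(ciphertexts)

# Each participant decrypts the broadcast result and updates its local model.
averaged = np.array([private_key.decrypt(c) for c in aggregated]) / len(gradients)
print("noisy averaged gradient:", averaged)
```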

Multi-center block chain transaction privacy protection system and method

The invention discloses a multi-center blockchain transaction privacy protection system and method. The system comprises an alliance control module, an amount verification module, a range verification module, an encryption module, a decryption module and a blockchain system transaction module. The alliance control module generates alliance parameters jointly among multiple participants; the amount verification module verifies that the encrypted ciphertext amounts input to and output from a transaction are equal; the range verification module verifies that each encrypted ciphertext amount in the transaction lies within a specific interval and is always positive; the encryption and decryption modules homomorphically encrypt and decrypt the amounts during sending and receiving; and the blockchain system transaction module implements a complete bitcoin-like digital currency transaction system, covering sending, receiving, broadcasting and block confirmation. Under a multi-center supervision mode, the system strengthens the general structure of blockchain transaction privacy, protects the trapdoor parameters under the joint control of multiple parties as well as the transaction metadata during the transaction process, and effectively improves the security of plaintext amounts in multi-center blockchain transactions.
Owner:BEIHANG UNIV
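The amount-verification idea, checking that hidden inputs and outputs of a transaction balance, can be illustrated with additively homomorphic Pedersen commitments. This is only a stand-in for the patent's encryption and verification modules: the parameters are toy-sized, the range proof is omitted, and all names are illustrative.

```python
# Toy illustration of verifying that hidden transaction amounts balance, using
# additively homomorphic Pedersen commitments over a small prime group. The
# parameters are deliberately tiny and NOT secure; they only show the algebra
# that lets a verifier check "inputs == outputs" without seeing any amount.
import random

P = 2**127 - 1          # Mersenne prime used as a toy modulus
G = 3                   # toy generator
H = 7                   # second base; unknown discrete log is assumed here

def commit(amount, blinding):
    """Pedersen commitment: C = g^amount * h^blinding (mod P)."""
    return (pow(G, amount, P) * pow(H, blinding, P)) % P

# Sender commits to two inputs and two outputs of a transaction.
r_in  = [random.randrange(P - 1) for _ in range(2)]
r_out = [random.randrange(P - 1) for _ in range(2)]
inputs  = [60, 40]      # hidden input amounts
outputs = [75, 25]      # hidden output amounts

c_in  = [commit(v, r) for v, r in zip(inputs, r_in)]
c_out = [commit(v, r) for v, r in zip(outputs, r_out)]

# Homomorphic property: the product of commitments commits to the sum.
prod_in  = (c_in[0] * c_in[1]) % P
prod_out = (c_out[0] * c_out[1]) % P

# Revealing only the aggregate blinding difference lets a verifier confirm
# that total input equals total output without learning any single amount.
delta_r = (sum(r_in) - sum(r_out)) % (P - 1)
balanced = prod_in == (prod_out * pow(H, delta_r, P)) % P
print("transaction balances:", balanced)
```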

Federated learning training data privacy enhancement method and system

The invention discloses a federated learning training data privacy enhancement method and system. A first server generates a public parameter and a master key and sends the public parameter to a second server; a plurality of clients participating in federated learning generate their own public/private key pairs based on the public parameter. The federated learning process is as follows: each client encrypts the model parameters obtained by local training with its own public key and sends the encrypted model parameters, together with the corresponding public key, to the first server via the second server; the first server decrypts based on the master key, obtains global model parameters by weighted averaging, encrypts them with each client's public key, and sends them to the clients via the second server; each client decrypts with its own private key to obtain the global model parameters and updates its local model, and the process is repeated until the local models of the clients converge. By combining a dual-server mode with multi-key homomorphic encryption, the method ensures the security of both the data and the model parameters.
Owner:UNIV OF JINAN
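A rough sketch of the dual-server round is given below. The multi-key homomorphic scheme is simulated with one python-paillier (`phe`) keypair per client, and the first server's use of the clients' private keys stands in for its master key, so the sketch mirrors only the message flow, not the actual multi-key cryptography; all names are illustrative.

```python
# Sketch of one dual-server federated round. Message flow:
#   client -> second server -> first server -> second server -> client.
# The multi-key scheme is simulated with per-client Paillier keypairs; the
# first server using the clients' private keys stands in for its master key.
import numpy as np
from phe import paillier

class Client:
    def __init__(self, weight):
        self.public_key, self._private_key = paillier.generate_paillier_keypair(n_length=1024)
        self.weight = weight                      # weighted-average coefficient
        self.model = np.random.randn(3)           # local model parameters

    def upload(self):
        """Encrypt local parameters under this client's own public key."""
        return [self.public_key.encrypt(float(v)) for v in self.model], self.public_key

    def download(self, encrypted_global):
        """Decrypt the global parameters and replace the local model."""
        self.model = np.array([self._private_key.decrypt(c) for c in encrypted_global])

clients = [Client(w) for w in (0.5, 0.3, 0.2)]

# Second server: relays (ciphertext, public key) pairs, never sees plaintext.
relayed = [c.upload() for c in clients]

# First server: decrypts (master-key stand-in), then takes a weighted average.
decrypted = [np.array([c._private_key.decrypt(x) for x in ct])
             for c, (ct, _pk) in zip(clients, relayed)]
global_model = sum(c.weight * m for c, m in zip(clients, decrypted))

# Re-encrypt under each client's public key and relay back via the second server.
for client, (_ct, pk) in zip(clients, relayed):
    client.download([pk.encrypt(float(v)) for v in global_model])

print("clients now share the global model:", clients[0].model)
```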

Privacy-protecting multi-party deep learning computation proxy method under cloud environment

The invention belongs to the technical field of cloud computing and aims to achieve data sharing, and deep learning applications built on that sharing, while protecting privacy. The adopted technical scheme is a privacy-preserving multi-party deep learning computation proxy method for a cloud environment. Each participant runs a deep learning algorithm on its own data set to compute gradient parameter values and uploads the gradients, encrypted with a multiplicatively homomorphic ElGamal encryption scheme, to a server. When uploading a gradient parameter to the cloud server, the participant also generates a signature of the parameter, and the signatures support aggregation, meaning the cloud server can combine both the gradient parameters and the signatures. The cloud computing server computes the sum of all users' gradient parameters on the ciphertext and returns the result to the users; each user decrypts to obtain the final gradient sum and verifies its validity by checking whether the result and the aggregated signature form a valid message/signature pair. The method is mainly applied to cloud computing scenarios.
Owner:QUFU NORMAL UNIV
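The sum-on-ciphertext step can be illustrated with exponential ("lifted") ElGamal, a variant of the multiplicatively homomorphic scheme in which ciphertexts multiply to an encryption of the sum of the exponents and small sums are recovered by a bounded search. The aggregate-signature check is omitted, and the toy parameters below are illustrative, not taken from the patent.

```python
# Toy exponential ("lifted") ElGamal: each participant encrypts g^gradient, the
# proxy multiplies ciphertexts component-wise, and the product decrypts to
# g^(sum of gradients); a bounded search recovers the small, quantised sum.
# The aggregate-signature verification from the patent is omitted, and the
# parameters are tiny and insecure -- this only shows the homomorphic algebra.
import random

P = 0xFFFFFFFFFFFFFFC5          # largest prime below 2**64 (toy modulus)
G = 5                           # toy base element

def keygen():
    x = random.randrange(2, P - 1)          # private key
    return x, pow(G, x, P)                  # (private, public)

def encrypt(m, pub):
    """Encrypt an integer m as an ElGamal ciphertext of g^m."""
    k = random.randrange(2, P - 1)
    return pow(G, k, P), (pow(G, m, P) * pow(pub, k, P)) % P

def multiply(ct_a, ct_b):
    """Homomorphic step: a component-wise product encrypts the sum of exponents."""
    return (ct_a[0] * ct_b[0]) % P, (ct_a[1] * ct_b[1]) % P

def decrypt_small_sum(ct, priv, bound=10_000):
    """Decrypt to g^sum, then recover the sum by a bounded discrete-log search."""
    c1, c2 = ct
    g_sum = (c2 * pow(c1, P - 1 - priv, P)) % P     # c2 * c1^(-x)
    acc = 1
    for s in range(bound + 1):
        if acc == g_sum:
            return s
        acc = (acc * G) % P
    raise ValueError("sum outside search bound")

priv, pub = keygen()
quantised_gradients = [17, 42, 9]           # integer-scaled gradient components
ciphertext = encrypt(quantised_gradients[0], pub)
for g in quantised_gradients[1:]:
    ciphertext = multiply(ciphertext, encrypt(g, pub))

print("recovered sum:", decrypt_small_sum(ciphertext, priv))   # prints 68
```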

Model parameter training method, device, equipment and medium based on federated learning

The invention discloses a model parameter training method, device, equipment and medium based on federated learning. The method comprises the following steps: when a first terminal receives encrypted second data sent by a second terminal, it obtains the corresponding loss encryption value and first gradient encryption value; it randomly generates a random vector with the same dimension as the first gradient encryption value, blinds the first gradient encryption value with the random vector, and sends the blinded first gradient encryption value and the loss encryption value to the second terminal; when the decrypted first gradient value and loss value returned by the second terminal are received, it detects, according to the decrypted loss value, whether the model to be trained has converged; if so, it obtains a second gradient value from the random vector and the decrypted first gradient value, and determines the sample parameters corresponding to the second gradient value as the model parameters. With this method, model training can be carried out using only the data of the two federated parties, without a trusted third party, so limitations on its application are avoided.
Owner:WEBANK (CHINA)
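The blinding exchange, masking the encrypted gradient with a random vector so the other party can decrypt it without learning the true gradient, can be sketched with an additively homomorphic scheme; python-paillier (`phe`) is assumed here, and the terminal roles, key ownership and values are illustrative.

```python
# Sketch of the gradient-blinding exchange: terminal A holds a gradient only in
# encrypted form (under terminal B's key), masks it with a random vector, lets
# B decrypt the masked values, and then removes the mask locally. B never sees
# the true gradient; A never needs B's private key. python-paillier (`phe`)
# stands in for the additively homomorphic scheme; names are illustrative.
import numpy as np
from phe import paillier

# Terminal B owns the keypair and, in the protocol, produced the encrypted
# first-gradient values that terminal A ends up holding.
pub_b, priv_b = paillier.generate_paillier_keypair(n_length=1024)
true_gradient = np.array([0.12, -0.73, 0.05])
encrypted_gradient = [pub_b.encrypt(float(v)) for v in true_gradient]

# --- Terminal A: blind the ciphertexts with a random vector of equal size ---
mask = np.random.uniform(-1.0, 1.0, size=true_gradient.shape)
blinded = [c + float(m) for c, m in zip(encrypted_gradient, mask)]  # homomorphic add

# --- Terminal B: decrypts, but only ever sees gradient + mask ---------------
decrypted_blinded = np.array([priv_b.decrypt(c) for c in blinded])

# --- Terminal A: removes the mask to recover the plaintext gradient ---------
recovered = decrypted_blinded - mask
print("recovered gradient:", recovered,
      "max error:", np.max(np.abs(recovered - true_gradient)))
```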

Efficient Implementation Of Fully Homomorphic Encryption

In one exemplary embodiment of the invention, a method for homomorphic decryption includes: providing a ciphertext with an element c, where there exists a big set B having N elements zi, so B = {z1, z2, ..., zN}; there exists a small set S having n elements sj, so S = {s1, s2, ..., sn}; the small set is a subset of the big set, and summing the elements of the small set yields the private key; there exists a bit vector σ having N bits σi, so σ = (σ1, σ2, ..., σN), with σi = 1 if zi ∈ S and σi = 0 otherwise; and there exists an encrypted vector d having N ciphertexts di, so d = (d1, d2, ..., dN), where di is an encryption of σi. The method post-processes c by multiplying it by every zi to obtain an intermediate vector y = (y1, y2, ..., yN) with yi = c × zi; homomorphically multiplies yi by di to obtain a ciphertext vector x = (x1, x2, ..., xN), where xi is an encryption of the product yi·σi; and homomorphically sums all xi to obtain a resulting ciphertext that is an encryption of the at least one bit. The big set is partitioned into n parts, each part having a plurality of different elements from the big set, and the elements of the small set are one element from each part.
Owner:IBM CORP
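The arithmetic behind the claim, that selecting the big-set products c·zi with the bit vector and summing them reproduces c times the private key, can be checked in the clear. The sketch below replaces the homomorphic multiplications by the encrypted bits di with the plaintext bits σi and ignores the partition constraint, so it validates only the sparse-subset-sum identity, not the encryption scheme; parameters are toy-sized.

```python
# Plaintext check of the sparse-subset-sum identity in the claimed decryption:
#   sum_i sigma_i * (c * z_i)  ==  c * sum_{s in S} s  ==  c * private_key.
# The homomorphic multiplication by the encrypted bits d_i is replaced by the
# plaintext bits sigma_i, so only the arithmetic is validated. Toy parameters.
import random

N, n = 32, 4                                                   # big/small set sizes
big_set = [random.randrange(1, 10**6) for _ in range(N)]       # z_1 .. z_N
small_positions = random.sample(range(N), n)                   # indices of S
private_key = sum(big_set[i] for i in small_positions)         # sum of S

sigma = [1 if i in small_positions else 0 for i in range(N)]   # bit vector
c = random.randrange(1, 10**6)                                 # a ciphertext element

# Post-processing: y_i = c * z_i for every element of the big set.
y = [c * z for z in big_set]

# Selection and summation (homomorphic in the patent, plaintext here).
x = [y_i * s_i for y_i, s_i in zip(y, sigma)]
print("sum_i sigma_i * c * z_i == c * private_key:", sum(x) == c * private_key)
```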