
Model training method and device based on blockchain and federated learning, and blockchain

A model training method applying blockchain technology, in the field of blockchain. It addresses the problems that existing approaches lack information security or cannot meet the individual needs of each device, achieving the effects of improving accuracy, realizing information traceability and tamper resistance, and ensuring data privacy.

Pending Publication Date: 2022-04-19
海南火链科技有限公司
Cites: 0 · Cited by: 0

AI Technical Summary

Problems solved by technology

This privacy problem can be avoided by using federated learning, but a model trained by federated learning alone cannot meet the individual needs of each device. Therefore, in the speech recognition scenario, there is a lack of a model training method that both safeguards information security and adapts to users' individual needs.




Embodiment Construction

[0054] In order to make the purpose, technical solution and advantages of the present application clearer, the technical solution of the present application will be described clearly and completely below in conjunction with specific embodiments of the present application and the corresponding drawings. Clearly, the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by persons of ordinary skill in the art, based on the embodiments in this application and without creative effort, fall within the scope of protection of this application.

[0055] The technical solutions provided by various embodiments of the present application will be described in detail below in conjunction with the accompanying drawings.

[0056] The idea of this application is to provide a model training method based on blockchain and federated learning, which enables multiple participants to jointly train and deploy a persona...



Abstract

The invention discloses a model training method and device based on a blockchain and federated learning, and the blockchain. The blockchain comprises a plurality of idle edge nodes; one of the idle edge nodes serves as an aggregation node and the others serve as training nodes. The method comprises: the aggregation node generates an initial global model; each training node downloads it, trains on local voice data to obtain a local speech recognition model, and transmits that model to the aggregation node, which aggregates the local models according to an aggregation algorithm into a global speech recognition model used as the next global initial model; when the global speech recognition model meets a preset iteration condition, each training node downloads it; each idle edge node then calculates the parameters required for secondary training based on its local voice data and uploads those parameters to the blockchain, performs secondary training according to the local voice data and the parameters, and deploys the resulting speech recognition model. The method ensures the security of users' private data, personalizes the model obtained by federated learning training, and improves the accuracy of the locally deployed model.
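The abstract above describes a federated round (training nodes fit local models on private voice data; the aggregation node combines them) without naming the aggregation algorithm. The following is a minimal Python sketch of one such round, assuming FedAvg-style weighted averaging; all function and variable names are illustrative, not taken from the patent.

```python
# Hypothetical sketch of one federated-learning round as described in the
# abstract: training nodes update the global model on their private local
# data, and the aggregation node combines the results. The patent excerpt
# does not name its aggregation algorithm; FedAvg-style weighted averaging
# is assumed here, and all identifiers are illustrative.

def aggregate(local_weights, sample_counts):
    """Aggregation node: average local model weights, weighted by the
    number of local samples each training node used (FedAvg-style)."""
    total = sum(sample_counts)
    return [
        sum(w[i] * n for w, n in zip(local_weights, sample_counts)) / total
        for i in range(len(local_weights[0]))
    ]

def federated_round(global_weights, node_datasets, local_update):
    """One round: each training node starts from the global model, trains
    on its own voice data (which never leaves the node), and uploads only
    the resulting weights for aggregation."""
    local_weights = [local_update(list(global_weights), d) for d in node_datasets]
    sample_counts = [len(d) for d in node_datasets]
    return aggregate(local_weights, sample_counts)

# Toy usage: a "training" step that nudges each weight toward the node's
# local data mean, standing in for real speech-model training.
def toy_update(weights, data, lr=0.5):
    mean = sum(data) / len(data)
    return [w + lr * (mean - w) for w in weights]

new_global = federated_round([0.0, 0.0], [[1.0, 1.0], [3.0]], toy_update)
```

Because only model weights (never raw voice data) reach the aggregation node, this structure matches the abstract's privacy claim; the secondary, personalization training would then fine-tune `new_global` on each node's local data.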

Description

Technical Field

[0001] This application relates to the field of computer technology, and in particular to a model training method and device based on blockchain and federated learning, and a blockchain.

Background

[0002] Speech recognition technology focuses on converting the lexical content of human speech into computer-readable input. The long-term development of speech recognition technology is inseparable from the support of speech data.

[0003] At this stage, there are many public speech datasets. However, as times progress and people's thinking changes, new words are continuously created and some old words acquire new meanings. Therefore, a large amount of fresh speech data is needed to continuously support the development of speech recognition technology.

[0004] When collecting and updating voice data, grabbing data directly from a user's local data set involves privacy violations. This probl...

Claims


Application Information

IPC(8): G06K9/62; G06F16/27; G06N20/20
CPC: G06F16/27; G06N20/20; G06F18/214
Inventors: 李慧, 王旭蕾, 廖德强
Owner: 海南火链科技有限公司