
DNN model training method and device

A model-training and data-set technology in the field of machine learning, addressing problems such as low prediction accuracy, a long contracting process, and obstacles to business promotion.

Inactive Publication Date: 2019-11-26
SUNING CLOUD COMPUTING CO LTD
Cites: 2 · Cited by: 0

AI Technical Summary

Problems solved by technology

[0003] The effectiveness of a DNN model depends on the quality of the training data set, including the quantity and quality of labels for each user as well as the number of users. A single company's own data is usually limited, and it is difficult to form a high-quality training set; as a result, the model is not effective enough and its prediction accuracy is low.
If the data sets of two companies are to be used for DNN model training, there are currently two approaches. The first is to pool the data with one of the parties for training; this inevitably exposes one party's data to the other, which many companies resist. The second is to gather both parties' data in a neutral third-party environment for training; this requires sufficient trust that the third party will not leak either party's data, together with a signed agreement and contract, making the process long and unfavorable for business promotion.



Examples


Embodiment 1

[0051] Referring to figure 1, this embodiment provides a DNN model training method, comprising: obtaining a first data set and a second data set, and performing unified dimension expansion on the dimension vectors in each; setting the structural parameters of the DNN model, and initializing the linear relationship coefficient matrix W_L and bias vector b_L of each level; using the W_L and b_L corresponding to the second level to convert the data in the first data set and the second data set respectively; superimposing the converted data of the two data sets, traversing each element in it, performing forward propagation calculations, and outputting the result of each neuron in the second level; based on the results of each neuron in the second level, performing the forward propagation calculation and outputting the results of each neuron in the third ...
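The steps above can be sketched in Python. This is an illustrative reconstruction only, not the patent's actual implementation: the layer sizes, the zero-padding used for "unified dimension expansion", the element-wise superposition of the two converted data sets, and the sigmoid activation are all assumptions.

```python
import numpy as np

def init_params(layer_sizes, seed=0):
    # Initialize a linear relationship coefficient matrix W_L and
    # bias vector b_L for each level of the DNN (shapes assumed).
    rng = np.random.default_rng(seed)
    params = []
    for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]):
        params.append((rng.normal(scale=0.1, size=(n_in, n_out)), np.zeros(n_out)))
    return params

def forward(x, params):
    # Level-by-level forward propagation; sigmoid activation is assumed.
    a = x
    for W, b in params:
        a = 1.0 / (1.0 + np.exp(-(a @ W + b)))
    return a

rng = np.random.default_rng(1)
d = 8  # unified dimension after expansion (assumed)

# Two parties' data, zero-padded into disjoint columns of the unified
# dimension, then superimposed element-wise.
x1 = np.zeros((4, d)); x1[:, :5] = rng.random((4, 5))  # first data set
x2 = np.zeros((4, d)); x2[:, 5:] = rng.random((4, 3))  # second data set
x = x1 + x2  # superposition after dimension expansion

out = forward(x, init_params([d, 6, 1]))
```

The superposition works here only because the expanded dimensions of the two data sets do not overlap; the patent text does not specify how overlapping dimensions would be combined.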

Embodiment 2

[0081] This embodiment provides a DNN model training device, including:

[0082] A data acquisition unit, configured to acquire a first data set and a second data set, and respectively perform unified dimension expansion on the dimension vectors in the first data set and the second data set;

[0083] An initialization unit, configured to set the structural parameters of the DNN model and to initialize the linear relationship coefficient matrix W_L and bias vector b_L of each level;

[0084] A data conversion unit, configured to use the linear relationship coefficient matrix W_L and bias vector b_L corresponding to the above-mentioned second level to respectively convert the data in the first data set and the second data set;

[0085] A first calculation unit, configured to continue traversing the superimposed elements in the next level based on the results of each neuron in the second level, and correspondingly output the results of each neuron in the...
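The units of this embodiment can be mapped onto a small class, one method per unit. This is a hypothetical sketch under the same assumptions as before (zero-padding for dimension expansion, sigmoid activation, assumed shapes); the patent does not disclose these details.

```python
import numpy as np

class DNNTrainingDevice:
    # Hypothetical sketch: each claimed unit becomes one method.
    def __init__(self, target_dim, layer_sizes, seed=0):
        self.target_dim = target_dim      # unified dimension (assumed)
        self.layer_sizes = layer_sizes    # DNN structural parameters
        self.rng = np.random.default_rng(seed)
        self.params = None

    def data_acquisition_unit(self, d1, d2):
        # Unified dimension expansion: zero-pad both data sets.
        def expand(x):
            out = np.zeros((x.shape[0], self.target_dim))
            out[:, : x.shape[1]] = x
            return out
        return expand(d1), expand(d2)

    def initialization_unit(self):
        # Initialize W_L and b_L for each level.
        self.params = [
            (self.rng.normal(scale=0.1, size=(n_in, n_out)), np.zeros(n_out))
            for n_in, n_out in zip(self.layer_sizes[:-1], self.layer_sizes[1:])
        ]

    def data_conversion_unit(self, d1, d2):
        # Convert both data sets with the second level's W_L and b_L.
        W, b = self.params[0]
        return d1 @ W + b, d2 @ W + b

    def first_calculation_unit(self, z1, z2):
        # Superimpose the converted data and propagate through the
        # remaining levels (sigmoid activation assumed).
        a = 1.0 / (1.0 + np.exp(-(z1 + z2)))
        for W, b in self.params[1:]:
            a = 1.0 / (1.0 + np.exp(-(a @ W + b)))
        return a

rng = np.random.default_rng(42)
device = DNNTrainingDevice(target_dim=6, layer_sizes=[6, 4, 1])
d1, d2 = device.data_acquisition_unit(rng.random((3, 4)), rng.random((3, 2)))
device.initialization_unit()
z1, z2 = device.data_conversion_unit(d1, d2)
y = device.first_calculation_unit(z1, z2)
```

Note that in this sketch only the converted values z1 and z2 need to be brought together, which is consistent with the stated goal of joint training without exchanging raw data sets.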

Embodiment 3

[0094] This embodiment provides a computer-readable storage medium. A computer program is stored on the computer-readable storage medium. When the computer program is run by a processor, the steps of the above DNN model training method are executed.

[0095] Compared with the prior art, the beneficial effect of the computer-readable storage medium provided by this embodiment is the same as that of the DNN model training method provided by the above technical solution, and will not be repeated here.

[0096] Those of ordinary skill in the art will understand that all or part of the steps of the above inventive method can be completed by a program instructing the relevant hardware. The program can be stored in a computer-readable storage medium and, when executed, performs the steps of the method in the above embodiments. The storage medium may be ROM/RAM, a magnetic disk, an optical disk, a memory card, or the like.



Abstract

The invention discloses a DNN model training method and device, relating to the technical field of machine learning, which enables joint training on the original data sets of two companies without either company's original data set being leaked. The method comprises the following steps: respectively performing unified dimension expansion on the dimension vectors in a first data set and a second data set; setting the structural parameters of the DNN model; respectively converting the data in the first data set and the second data set using the linear relationship coefficient matrix W_L and bias vector b_L corresponding to the second level; performing traversal calculation on each neuron result from the second level to the L-th level through a loss function and a back propagation function; and correspondingly outputting the result when the change of the linear relationship coefficient matrix W_L and bias vector b_L of each level in the current iteration, relative to the last iteration, is smaller than the iteration-stopping threshold; otherwise returning to the data conversion step to re-execute the iterative computation until the number of iterations is greater than the maximum number of iterations.
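The stopping criterion described in the abstract can be illustrated as follows. This is a minimal sketch assuming a max-absolute-change convergence test over every W_L and b_L and a generic per-iteration update step; the patent does not specify the norm, the threshold value, or the update rule.

```python
import numpy as np

def converged(params_prev, params_curr, stop_threshold):
    # Stop when the change of every W_L and b_L, relative to the last
    # iteration, is smaller than the iteration-stopping threshold.
    for (W0, b0), (W1, b1) in zip(params_prev, params_curr):
        if np.max(np.abs(W1 - W0)) >= stop_threshold:
            return False
        if np.max(np.abs(b1 - b0)) >= stop_threshold:
            return False
    return True

def train(update_step, params, stop_threshold=1e-4, max_iters=100):
    # Iterate: apply one update (data conversion + forward/backward
    # propagation in the patent's method), then test for convergence;
    # stop early on convergence, or after max_iters iterations.
    for _ in range(max_iters):
        new_params = update_step(params)
        if converged(params, new_params, stop_threshold):
            return new_params
        params = new_params
    return params

# Toy update that shrinks all parameters toward zero, so the change
# between iterations eventually drops below the threshold.
params0 = [(np.ones((2, 2)), np.ones(2))]
shrink = lambda p: [(0.5 * W, 0.5 * b) for W, b in p]
final = train(shrink, params0)
```

In the actual method, `update_step` would stand for one pass of data conversion, forward propagation, and back propagation through the loss function.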

Description

Technical field

[0001] The invention relates to the technical field of machine learning, and in particular to a DNN model training method and device.

Background technique

[0002] Enterprises usually hold a large amount of data, and each company protects its own data as a core asset to prevent leakage. When a company uses its own user data, it usually needs to predict whether a user will perform a certain behavior based on the user's label data; for example, predicting from a user's profile whether the user will click on a promotional advertisement. The industry uses various prediction models for user behavior, among which DNN is a widely applied prediction model.

[0003] The effectiveness of the DNN model depends on the quality of the training data set, including the quantity and quality of labels for each user as well as the number of users. A single company's own data is usually limited, and it is difficult to form a high-quality trai...

Claims


Application Information

IPC(8): G06N3/04, G06N3/08
CPC: G06N3/08, G06N3/045
Inventors: 姚平, 韩松江, 徐杰, 李蒙
Owner SUNING CLOUD COMPUTING CO LTD