
Quantum optimization parameter adjustment method for distributed deep learning under a Spark framework

A deep learning and distributed computing technology, applied to neural learning methods, biological models, instruments, etc., which can solve problems such as lack of prior knowledge, insufficient data samples, and lack of truly valuable data, achieving the effect of avoiding the need for prior knowledge.

Active Publication Date: 2019-06-11
ZHEJIANG UNIV OF TECH

AI Technical Summary

Problems solved by technology

In practical applications, we often face problems such as insufficient data samples, lack of truly valuable data, and lack of prior knowledge.



Examples


Embodiment Construction

[0051] Referring to the accompanying drawings, this embodiment describes the present invention with cardiovascular and cerebrovascular diseases as the specific application field. The traditional deep learning workflow includes three links: data collection, deep learning training, and model evaluation. On this basis, the present invention reduces parameter tuning to the problem of finding optimal parameters and adds a quantum optimization tuning link, as shown in Figure 1. The process flow of the quantum optimization tuning method for distributed deep learning under the Spark framework is shown in Figure 2, and the detailed steps are as follows:

[0052] Step1: Collect data and perform preprocessing and grouping, which specifically includes the following four steps:

[0053] Step1.1: Extract the data of patients related to cardiovascular and cerebrovascular diseases from massive medical data, and perform distributed storage based on the distributed file system (H...
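
To make this step concrete, here is a minimal PySpark sketch of how such extraction, preprocessing, and grouping could look, assuming HDFS is the distributed file system referred to above. The path hdfs:///medical/records.csv and the column names diagnosis and patient_id are hypothetical placeholders, not taken from the patent.

```python
# A hedged sketch of Step 1.1: read raw medical records stored on HDFS,
# keep the cardiovascular/cerebrovascular patients, preprocess, and group.
# Paths and column names are illustrative assumptions.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("cardio-data-prep").getOrCreate()

# Read the raw medical records that were stored distributedly on HDFS.
records = spark.read.csv("hdfs:///medical/records.csv", header=True, inferSchema=True)

# Keep only patients related to cardiovascular and cerebrovascular diseases.
cardio = records.filter(
    F.col("diagnosis").rlike("(?i)cardiovascular|cerebrovascular")
)

# Preprocess (drop incomplete rows) and group the data, e.g. by patient,
# then persist it back to HDFS for the later distributed training stage.
grouped = cardio.dropna().repartition("patient_id")
grouped.write.mode("overwrite").parquet("hdfs:///medical/cardio_grouped")
```

Writing the grouped result back to HDFS in a columnar format keeps it co-located with the Spark executors that perform the later data-parallel training.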



Abstract

The invention discloses a quantum optimization parameter adjustment method for distributed deep learning under a Spark framework. The method comprises the following steps: data is collected, preprocessed and grouped, and the structure parameters of a generative adversarial network are determined; a deep neural network is preliminarily constructed, including the number of layers, the number of nodes in each layer, the weights and the learning rate, and the Spark master node broadcasts this parameter information to the slave nodes; the generative adversarial network model is trained in a data-parallel mode; the parameter information of the Spark-based distributed quantum particle swarm optimization algorithm is initialized; and each slave node performs distributed independent evolution, the evolutions are summarized through the master node, the fitness function value corresponding to each particle is calculated, the connection weights are updated according to the individual best and global best of the previous iteration, and the performance of the deep learning model is evaluated. The invention provides a reference method for finding the optimal parameters of a distributed deep learning model, and avoids the problems of the prior knowledge required by manual parameter adjustment of deep learning models and its low efficiency.
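
As a concrete illustration of the quantum particle swarm step summarized above, the following is a minimal PySpark sketch of one way such a loop can be organized: the master node holds the swarm and broadcasts the global best and mean best positions, each worker evolves and evaluates its particles independently, and the master collects the results and updates the individual and global bests. It uses the standard QPSO update rule (local attractor with a contraction-expansion coefficient); the sphere fitness function, swarm size, dimensionality, and coefficient value are illustrative assumptions rather than values from the patent.

```python
# A hedged sketch of Spark-based distributed quantum particle swarm
# optimization (QPSO); problem size, coefficient, and fitness are assumptions.

import numpy as np
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("distributed-qpso").getOrCreate()
sc = spark.sparkContext

DIM, SWARM, ITERS, BETA = 4, 32, 30, 0.75   # assumed size and contraction coefficient

def fitness(x):
    """Placeholder fitness: in the patent this would be the evaluation metric of
    the deep learning model encoded by the parameter vector x; a sphere function
    stands in here."""
    return float(np.sum(np.square(x)))

# Master node: initialize particle positions and individual (personal) bests.
positions = [np.random.uniform(-1.0, 1.0, DIM) for _ in range(SWARM)]
pbest = [p.copy() for p in positions]
pbest_val = [fitness(p) for p in pbest]
gbest = pbest[int(np.argmin(pbest_val))].copy()

for _ in range(ITERS):
    mbest = np.mean(pbest, axis=0)                      # mean of all personal bests
    b_gbest, b_mbest = sc.broadcast(gbest), sc.broadcast(mbest)

    def evolve(args):
        """Runs on the slave nodes: QPSO position update plus fitness evaluation."""
        x, pb = args
        phi = np.random.uniform(size=DIM)
        attractor = phi * pb + (1.0 - phi) * b_gbest.value          # local attractor
        u = np.random.uniform(1e-12, 1.0, size=DIM)
        sign = np.where(np.random.uniform(size=DIM) < 0.5, 1.0, -1.0)
        new_x = attractor + sign * BETA * np.abs(b_mbest.value - x) * np.log(1.0 / u)
        return new_x, fitness(new_x)

    # Distributed independent evolution on the workers, summarized by the master.
    results = sc.parallelize(list(zip(positions, pbest)), numSlices=4).map(evolve).collect()

    for i, (new_x, val) in enumerate(results):
        positions[i] = new_x
        if val < pbest_val[i]:                          # update individual best
            pbest[i], pbest_val[i] = new_x, val
    gbest = pbest[int(np.argmin(pbest_val))].copy()     # update global best

print("best parameter vector:", gbest, "fitness:", min(pbest_val))
```

In the patented method each fitness evaluation would correspond to training or evaluating the deep learning model whose connection weights or hyperparameters the particle encodes, so each call on a worker is a model run rather than a cheap analytic function.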

Description

Technical field

[0001] The present invention relates to a digital computing device, or a data processing device or method suitable for a specific function, and in particular to a quantum optimization parameter adjustment method for distributed deep learning under the Spark framework.

Background technique

[0002] With the advent of the era of big data, artificial intelligence has developed rapidly. Machine learning, represented by deep learning, is an important branch of artificial intelligence and has attracted widespread attention. In practical applications, problems such as insufficient data samples, lack of truly valuable data, and lack of prior knowledge are often faced. Deep learning has therefore gradually shown its limitations, especially in its reliance on large-scale labeled data and a large amount of prior knowledge for parameter adjustment. How to adjust parameters to improve the performance of deep learning models has become the current deep lea...


Application Information

IPC(8): G06Q10/04, G06Q50/06, G06N3/00, G06N3/08
Inventors: 王万良, 张兆娟, 郑建炜, 高楠, 赵燕伟, 吴菲, 骆俊锦
Owner: ZHEJIANG UNIV OF TECH