
Storage system self-adaptive parameter tuning method based on deep learning

An adaptive-parameter storage-system technology, applied in the fields of neural learning methods, electrical digital data processing, and program startup/switching, with the effect of improving storage-system performance and quality of service.

Pending Publication Date: 2020-10-09
HANGZHOU DIANZI UNIV

AI Technical Summary

Problems solved by technology

Many optimization techniques struggle to handle discrete and categorical parameters effectively, which ultimately leads to unsatisfactory performance-optimization results.
[0006] Current parameter configuration and tuning methods cannot distinguish the load characteristics of specific applications and do not transfer across different operating environments; their poor generality makes it impossible to guarantee the quality of service of applications running on the storage system.




Embodiment Construction

[0029] The present invention is further described below with reference to the accompanying drawings; please refer to Figure 1. Figure 1 shows the architecture of the I/O bandwidth scheduling model based on the WFQ algorithm proposed by the present invention. After the I/O load is extracted from an application, it is classified by application and sent to the corresponding I/O request queue to wait for service; the scheduler then reorders the I/O requests queued there and finally dispatches them to the underlying storage entities such as disks or disk arrays. The WFQ scheduling algorithm adopted by the scheduler consists of three parts: load I/O flow isolation and allocation, I/O request identification, and I/O request dispatching. Load I/O flow isolation and allocation uses a dynamic mechanism to distinguish normal I/O flows from abnormal I/O flows while carrying out weighted and fair dis...
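The paragraph above outlines per-application request queues served in weighted fair queuing (WFQ) order. The sketch below is a minimal Python illustration of that general idea, assuming a virtual-finish-tag formulation of WFQ; the class names, application weights, and request sizes are hypothetical, and it is not the patented implementation.

import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class IORequest:
    finish_tag: float                   # virtual finish time used for ordering
    app_id: str = field(compare=False)  # application the request belongs to
    size: int = field(compare=False)    # request size in bytes

class WFQScheduler:
    """Per-application weighted fair queuing of I/O requests (illustrative only)."""

    def __init__(self, weights):
        self.weights = weights                          # app_id -> bandwidth share
        self.last_finish = {a: 0.0 for a in weights}    # last finish tag per flow
        self.virtual_time = 0.0                         # system virtual time
        self.queue = []                                 # min-heap ordered by finish_tag

    def enqueue(self, app_id, size):
        # Finish tag = max(system virtual time, flow's last finish) + size / weight
        start = max(self.virtual_time, self.last_finish[app_id])
        finish = start + size / self.weights[app_id]
        self.last_finish[app_id] = finish
        heapq.heappush(self.queue, IORequest(finish, app_id, size))

    def dispatch(self):
        # Serve the queued request with the smallest virtual finish tag
        if not self.queue:
            return None
        req = heapq.heappop(self.queue)
        self.virtual_time = req.finish_tag
        return req

# Usage: application "a" is given twice the bandwidth share of application "b"
sched = WFQScheduler({"a": 2.0, "b": 1.0})
sched.enqueue("a", 4096)
sched.enqueue("b", 4096)
sched.enqueue("a", 8192)
while (req := sched.dispatch()) is not None:
    print(req.app_id, req.size)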



Abstract

The invention discloses a storage-system self-adaptive parameter tuning method based on deep learning. The method comprises three parts: I/O bandwidth management of the storage system, I/O load feature recognition, and self-adaptive adjustment and optimization of system parameters. In I/O bandwidth management, bandwidth is allocated through an enhanced fair queuing scheduling algorithm so that each application program obtains reasonable bandwidth resources. I/O load feature identification performs performance modeling of the storage system by means of multiple linear regression theory and dynamically extracts a performance model of the storage system by detecting load features. In the self-adaptive adjustment and optimization of system parameters, the parameters that most strongly influence storage-system performance are selected through a Latin hypercube sampling method and a greedy parameter selection algorithm; an optimization model is then trained with a deep neural network to obtain the parameter configuration that yields the optimal system performance. The method can reasonably manage I/O bandwidth, detect and identify load characteristics, automatically select important parameters, and optimize the performance of the storage system.
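As a concrete illustration of the sampling-and-surrogate idea summarized above (Latin hypercube sampling, selection of influential parameters, and a neural-network performance model), the following sketch uses NumPy and scikit-learn. It is a simplified stand-in rather than the patented method: the parameter names and bounds are hypothetical, benchmark() is a toy placeholder for measuring real storage performance, and a simple correlation-based influence ranking stands in for the greedy parameter selection algorithm.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
params = ["queue_depth", "readahead_kb", "dirty_ratio", "nr_requests"]  # hypothetical knobs
lo = np.array([1.0, 0.0, 1.0, 4.0])
hi = np.array([64.0, 512.0, 40.0, 256.0])

def latin_hypercube(n, d):
    """n stratified samples in [0, 1)^d, one sample per stratum in every dimension."""
    cut = (np.arange(n) + rng.random(n)) / n
    return np.column_stack([rng.permutation(cut) for _ in range(d)])

def benchmark(x):
    """Toy placeholder for running the real storage benchmark with configuration x."""
    return -np.sum((x - (lo + hi) / 2) ** 2)

# 1) Sample candidate configurations with Latin hypercube sampling and measure them.
X = lo + latin_hypercube(60, len(params)) * (hi - lo)
y = np.array([benchmark(x) for x in X])

# 2) Rank parameters by how strongly their variation relates to the measured score
#    (a simplified stand-in for the greedy parameter selection step), keep the top 2.
scores = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(len(params))]
keep = np.argsort(scores)[::-1][:2]
print("selected parameters:", [params[j] for j in keep])

# 3) Train a neural-network surrogate on the selected knobs and report the
#    candidate configuration with the best predicted performance.
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
model.fit(X[:, keep], y)
grid = lo[keep] + latin_hypercube(2000, len(keep)) * (hi[keep] - lo[keep])
best = grid[np.argmax(model.predict(grid))]
print("recommended configuration:", dict(zip([params[j] for j in keep], best)))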

Description

Technical Field [0001] The invention relates to a method for optimizing the performance of a storage system, and in particular to computer storage systems with large-scale load deployment, such as a method for load-characteristic and parameter-adaptive optimization in a big-data storage system. Background Art [0002] Storage systems are key components of modern computer systems and have a significant impact on application performance and efficiency. Most storage systems expose many configurable parameters that control and affect application performance. However, with the rapid development of computer applications and network technology and the continuous evolution of system architecture, storage systems have become very large in scale and ever more widely distributed. Their software and hardware implementations have also become more complex, containing hundreds or thousands of configurable parameters. Highly complex and hig...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F9/50; G06F9/48; G06N3/063; G06N3/04; G06N3/08
CPC: G06F9/5011; G06F9/4881; G06N3/063; G06N3/08; G06N3/045
Inventors: 陈圣蕾, 蒋从锋, 欧东阳, 殷昱煜, 张纪林, 闫龙川, 黄震, 赵子岩, 李妍
Owner: HANGZHOU DIANZI UNIV