Cloud computing application memory management method based on real-time content prediction and historical resource occupation

A technology for memory management based on historical resource occupation, applied in the field of cloud computing application memory management

Pending Publication Date: 2022-05-10
北京广通优云科技股份有限公司 +1

AI Technical Summary

Problems solved by technology

The present invention addresses the problem of application memory management in the cloud environment. Based on the memory occupation sequence data of a fixed time window of an application program, it predicts the memory usage at the next moment in real time and, combined with the statistical records of historical resource usage over the entire life cycle of the application program, realizes an integrated automatic memory reclamation method for cloud computing applications through a reinforcement learning model.
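
As an illustration of the "historical resource usage statistics over the entire life cycle" mentioned above, the Python sketch below shows one way such per-application statistics could be accumulated incrementally; the class name and the choice of statistics (sample count, running mean, peak memory) are assumptions for illustration, not the patent's specification.

```python
from dataclasses import dataclass

@dataclass
class LifecycleMemoryStats:
    """Running memory-usage statistics over an application's full life cycle.

    Hypothetical helper: the patent only states that such historical records
    are kept; count / mean / peak are illustrative choices of statistic.
    """
    count: int = 0
    mean_mb: float = 0.0
    peak_mb: float = 0.0

    def update(self, usage_mb: float) -> None:
        # Incremental mean update plus a running peak, one call per checkpoint.
        self.count += 1
        self.mean_mb += (usage_mb - self.mean_mb) / self.count
        self.peak_mb = max(self.peak_mb, usage_mb)

# Example: feed one sample per checkpoint as the application runs.
stats = LifecycleMemoryStats()
for sample_mb in (512.0, 640.0, 600.0):
    stats.update(sample_mb)
print(stats.count, round(stats.mean_mb, 1), stats.peak_mb)  # 3 584.0 640.0
```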

Method used



Examples


Embodiment Construction

[0022] The present invention will be described in detail below in conjunction with the accompanying drawings:

[0023] As shown in Figure 1, the present invention proposes a cloud computing application memory management method based on real-time content prediction and historical resource occupancy, which comprises the following steps:

[0024] a) For an application in a given cloud computing environment, a checkpoint is taken at a fixed time interval T, and the application's memory usage over the past n checkpoints is logged. The application's memory usage record at the t-th checkpoint (hereinafter referred to as time t) is:

[0025]
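
Assuming the memory usage record at time t is simply the readings of the last n checkpoints (the exact formula is not shown here), the following sketch illustrates step a); the interval T, the window size n, and the sampling helper are placeholder assumptions, not values from the patent.

```python
import time
from collections import deque

def sample_memory_mb() -> float:
    """Placeholder for however the application's memory usage is actually
    measured in the cloud environment (e.g. container/cgroup accounting)."""
    return 0.0

T_SECONDS = 60   # fixed checkpoint interval T (assumed value)
N_WINDOW = 12    # number of past checkpoints n in the record (assumed value)

# Fixed-size window: appending the newest checkpoint silently drops the oldest.
window: deque = deque(maxlen=N_WINDOW)

def checkpoint_once() -> list:
    """Take one checkpoint and return the memory usage record at the current time t."""
    window.append(sample_memory_mb())
    return list(window)

# Sketch of the checkpoint loop:
# while True:
#     record_t = checkpoint_once()   # fed to the predictor in step b)
#     time.sleep(T_SECONDS)
```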

[0026] b) A long short-term memory network (LSTM, a known algorithm) is used: the input is the application's memory usage record at time t, and the output is the predicted memory usage of the application at time t+1:

[0027]
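
As a rough illustration of step b), the sketch below uses a small PyTorch LSTM to map the last n memory readings to a single next-step estimate; the layer sizes, tensor shapes, and normalization are assumptions, and the training procedure is omitted.

```python
import torch
import torch.nn as nn

class MemoryLSTM(nn.Module):
    """Predicts the memory usage at time t+1 from the last n checkpoint readings."""

    def __init__(self, hidden_size: int = 32):
        super().__init__()
        # One scalar feature per time step: the memory reading at that checkpoint.
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, records: torch.Tensor) -> torch.Tensor:
        # records: (batch, n, 1) -> prediction: (batch, 1)
        _, (h_n, _) = self.lstm(records)
        return self.head(h_n[-1])

# Example: predict from a window of n=12 (normalized) memory readings.
model = MemoryLSTM()
window = torch.rand(1, 12, 1)     # stand-in for a scaled memory usage record
predicted_next = model(window)    # estimate of usage at time t+1
print(predicted_next.shape)       # torch.Size([1, 1])
```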

[0028] In each subsequent iterative operation, at a given time t, based on the memory usage records of the previous n chec...



Abstract

The invention relates to the field of cloud environments, and in particular to a cloud computing application memory management method based on real-time content prediction and historical resource occupancy. Based on the memory occupancy time-series data of a fixed time window of an application program, the method predicts the memory usage at the next moment in real time and combines this with the historical resource usage statistics recorded over the full life cycle of the application. An integrated automatic memory recovery method for cloud computing applications is realized through a reinforcement learning model. The method combines real-time memory usage prediction with historical resource usage statistics into an integrated application memory scaling scheme, improving the running efficiency of applications in the cloud computing environment while avoiding memory errors.
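
The abstract does not spell out the reinforcement learning formulation, so the following sketch is only one plausible arrangement of the pieces: the predicted next-step usage and the life-cycle statistics are discretized into a state, and a simple epsilon-greedy tabular policy chooses among reclaim / keep / grow actions. Every name, the action set, and the policy itself are illustrative assumptions, not the patent's design.

```python
import random
from typing import Dict, Tuple

ACTIONS = ("reclaim", "keep", "grow")  # illustrative memory-scaling actions

def make_state(predicted_mb: float, mean_mb: float, peak_mb: float,
               limit_mb: float) -> Tuple[int, int, int]:
    """Discretize the LSTM prediction and the life-cycle statistics into a small state."""
    headroom = (limit_mb - predicted_mb) / max(limit_mb, 1.0)
    above_mean = int(predicted_mb > mean_mb)
    near_peak = int(predicted_mb >= 0.9 * peak_mb)
    return (min(max(int(headroom * 10), 0), 9), above_mean, near_peak)

class EpsilonGreedyPolicy:
    """Toy tabular policy standing in for whatever RL model the patent actually uses."""

    def __init__(self, epsilon: float = 0.1):
        self.epsilon = epsilon
        self.q: Dict[Tuple, float] = {}  # (state, action) -> estimated value

    def act(self, state: Tuple[int, int, int]) -> str:
        if random.random() < self.epsilon:
            return random.choice(ACTIONS)
        return max(ACTIONS, key=lambda a: self.q.get((state, a), 0.0))

    def update(self, state, action: str, reward: float, lr: float = 0.1) -> None:
        key = (state, action)
        self.q[key] = self.q.get(key, 0.0) + lr * (reward - self.q.get(key, 0.0))

# One decision step: predict, build the state from prediction + history, pick an action.
policy = EpsilonGreedyPolicy()
state = make_state(predicted_mb=700.0, mean_mb=600.0, peak_mb=900.0, limit_mb=1024.0)
print(policy.act(state))
```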

Description

Technical field

[0001] The invention relates to the field of cloud computing, and mainly to a cloud computing application memory management method based on real-time content prediction and historical resource occupation.

Background technique

[0002] With the maturity of big data and cloud computing technologies, enterprises' demand for high-performance computing for big data processing continues to increase. Various applications are deployed on the cloud to achieve efficient parallel computing and on-demand allocation of computing resources. But in the face of unlimited growth in data volume, the number of applications, and the complexity of computing tasks, physical resources are ultimately limited.

[0003] In the cloud computing environment, the memory management of applications has always been a difficult problem in the industry. Although the emergence of many big data computing platform systems has solved the technical problems of parallel comput...

Claims


Application Information

Patent Type & Authority: Applications (China)
IPC(8): G06F12/02, G06N3/08
CPC: G06F12/0253, G06N3/08
Inventor: 刘东海, 徐育毅, 庞辉富
Owner: 北京广通优云科技股份有限公司