
Server resource prediction method and device, computer equipment and storage medium

A server resource prediction technology, applied in the field of intelligent decision-making, which addresses the problem that future capacity requirements cannot be accurately judged from experience alone.

Pending Publication Date: 2019-07-26
ONE CONNECT SMART TECH CO LTD SHENZHEN

AI Technical Summary

Problems solved by technology

[0003] Embodiments of the present invention provide a server resource prediction method, device, computer equipment, and storage medium, aiming to solve the problem in the prior art that, even when the current and historical actual usage of a cloud server's storage and CPU are known, future capacity requirements cannot be accurately judged from experience alone.




Embodiment Construction

[0029] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0030] It should be understood that, when used in this specification and the appended claims, the terms "comprising" and "comprises" indicate the presence of the stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof.

[0031] It should also be understood that the terminology used ...



Abstract

The invention discloses a server resource prediction method and device, computer equipment and a storage medium. The method comprises the following steps:
  • taking a user selected from a user list as a target user, and acquiring, at a preset acquisition period, the historical performance parameters consumed by the target user on a cloud server within a preset historical time period, so as to obtain a historical performance parameter set corresponding to the target user;
  • performing model training on a to-be-trained back-propagation neural network according to the historical performance parameter set, to obtain a back-propagation neural network for predicting performance parameter values;
  • obtaining a current input sequence according to the historical performance parameter set and a received to-be-predicted time point; and
  • inputting the current input sequence into the back-propagation neural network to obtain a predicted value corresponding to the to-be-predicted time point.
According to the method, a server resource prediction model is established from historical data, and the future usage of cloud server resources is predicted.
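The pipeline described in the abstract can be sketched in code. The following is a minimal, hypothetical illustration, not the patented implementation: the window size, network shape, learning rate, and the simulated CPU-usage series are all assumptions introduced for the example. It builds sliding-window input sequences from a historical performance series, trains a small network by plain back-propagation, then feeds the most recent window in as the "current input sequence" to predict the next time point.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated historical CPU usage (%) sampled at a fixed acquisition period
# (assumption: a daily-cycle sine plus noise stands in for real telemetry).
t = np.arange(200)
usage = 50 + 10 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 1, t.size)

WINDOW = 8  # assumption: each input sequence is the last 8 observations

# Build (input window -> next value) training pairs, scaled to [0, 1].
lo, hi = usage.min(), usage.max()
s = (usage - lo) / (hi - lo)
X = np.array([s[i:i + WINDOW] for i in range(len(s) - WINDOW)])
y = s[WINDOW:].reshape(-1, 1)

# One hidden layer with sigmoid activations, trained by back-propagation.
W1 = rng.normal(0, 0.5, (WINDOW, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1));      b2 = np.zeros(1)
sig = lambda z: 1 / (1 + np.exp(-z))

lr = 0.5
for _ in range(2000):
    h = sig(X @ W1 + b1)             # hidden activations
    out = sig(h @ W2 + b2)           # predicted next value
    d2 = (out - y) * out * (1 - out)     # delta at the output layer
    d1 = (d2 @ W2.T) * h * (1 - h)       # delta back-propagated to hidden
    W2 -= lr * h.T @ d2 / len(X); b2 -= lr * d2.mean(0)
    W1 -= lr * X.T @ d1 / len(X); b1 -= lr * d1.mean(0)

# "Current input sequence": the most recent window, used to predict
# the value at the next (to-be-predicted) time point.
last_window = s[-WINDOW:]
pred = sig(sig(last_window @ W1 + b1) @ W2 + b2)[0]
predicted_usage = lo + pred * (hi - lo)
print(round(float(predicted_usage), 1))
```

Because the network output is squashed to [0, 1] and rescaled, the prediction always lands inside the observed historical range; a production system would likely add per-user normalization and retraining as new samples arrive at each acquisition period.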

Description

Technical field
[0001] The invention relates to the technical field of intelligent decision-making, and in particular to a server resource prediction method, device, computer equipment and storage medium.
Background technique
[0002] At present, an enterprise may build a cloud server, and the cloud server can dynamically allocate a certain amount of storage and CPU to a team of the enterprise according to users' actual needs (such as the team's internal R&D and testing use). The storage and CPU of the cloud server (i.e., database and application hosts) are charged according to actual usage. Relying on the work experience of a team's project leader alone, future capacity demand cannot be accurately judged; and given the large number of resources, it is impractical for every project leader to check them one by one, which not only affects project progress but is also not conducive to planning cloud server procurement. Contents of the in...

Claims


Application Information

IPC(8): G06Q10/04, G06Q10/06, G06N3/08, G06N3/04
CPC: G06Q10/04, G06Q10/0639, G06N3/084, G06N3/045
Inventor: 余海燕
Owner: ONE CONNECT SMART TECH CO LTD SHENZHEN