A multi-layer LSTM RNN-based communication capability open platform load prediction method and device
A load forecasting technology for communication capability opening, applied in the field of communication, which solves the problem that no accurate load forecasting scheme exists and achieves the effect of accurate load forecasting
Active Publication Date: 2019-06-25
CHINA MOBILE GROUP ZHEJIANG +1
AI-Extracted Technical Summary
Problems solved by technology
[0004] Through prediction and analysis of the load on the communication capability open platform, normal load surges or abnormal malicious attacks can be detected early, warned of early, and responded to early, so that the b...
Method used
It can be understood that normalizing the load data means scaling the load data proportionally so that it falls within the interval [0,1]; after normalization, the convergence speed of the model and the accuracy of the model are improved.
The multi-layer LSTM RNN-based open load forecasting device provided by the embodiment of the present invention obtains the load data of the communication capability open platform in a preset time period before the time period to be predicted, normalizes the load data, transforms the normalized load data into a shape suitable for the LSTM neural network, inputs the transformed load data into the pre-built and trained multi-layer LSTM RNN load forecasting model, and performs inverse normalization on the output result to obtain the load prediction result of the time...
Abstract
The embodiment of the invention discloses a communication capability open platform load prediction method and device based on a multi-layer LSTM RNN, which can accurately predict the load of a communication capability open platform. The method comprises the following steps: obtaining load data of the communication capability open platform in a preset time period before the time period to be predicted, normalizing the load data, and converting the normalized load data into a shape suitable for the LSTM neural network, the load data comprising the API call volume per hour and the peak TPS per hour at hourly granularity; inputting the reshaped load data into a pre-constructed and trained multi-layer LSTM RNN load prediction model; and performing inverse normalization on the output result to obtain the load prediction result of the time period to be predicted, the multi-layer LSTM RNN load prediction model comprising an input layer, an output layer, and at least two stacked LSTM hidden layers.
Application Domain
Neural architectures; Data switching networks +1
Technology Topic
Data subject; Peak value +7
Examples
- Experimental program (1)
Example Embodiment
[0021] In order to make the objectives, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly described below in conjunction with the accompanying drawings. Obviously, the described embodiments are some, but not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative work shall fall within the protection scope of the embodiments of the present invention.
[0022] Referring to Figure 1, this embodiment discloses a communication capability open platform load forecasting method based on a multi-layer LSTM RNN, including:
[0023] S1. Obtain the load data of the communication capability open platform in a preset time period before the time period to be predicted, where the load data includes the hourly API call volume and the hourly peak TPS at hourly granularity;
[0024] S2. Input the load data into the pre-built and trained multi-layer LSTM RNN load prediction model to obtain the load prediction result of the time period to be predicted, wherein the multi-layer LSTM RNN load prediction model includes an input layer, an output layer, and at least two stacked LSTM hidden layers.
[0025] It can be understood that normalizing the load data means scaling the load data so that it falls within a small specific interval. Since the LSTM is sensitive to the magnitude of its input data, the load data needs to be uniformly mapped to the range [0,1]. The function MinMaxScaler(feature_range=(0,1)) can be used for normalization. After normalization, the convergence speed and accuracy of the model are improved.
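As a hedged illustration of this step, the following Python sketch applies scikit-learn's MinMaxScaler to a small, hypothetical series of hourly load values; the variable names and sample numbers are illustrative, not from the patent:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Hypothetical hourly API call volumes collected from the open platform.
hourly_api_calls = np.array([1200, 1350, 980, 1500, 1720, 1640, 1100], dtype=float)

scaler = MinMaxScaler(feature_range=(0, 1))
# MinMaxScaler expects a 2-D array of shape (n_samples, n_features).
normalized = scaler.fit_transform(hourly_api_calls.reshape(-1, 1))

# After prediction, values in [0, 1] can be mapped back to real load values
# with scaler.inverse_transform(...).
```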
[0026] Such as figure 2 Shown is a schematic diagram of an embodiment of a multi-layer LSTMRNN load prediction model, figure 2 The described model must return a sequence in the LSTM layer before each LSTM layer, that is, the return_sequences parameter is set to True. The return_sequences parameter means whether or not to output at each point in time. The default is false, and now it is defined as true. If it is equal to false, only one value is output at the last time point. The model uses the activation function sigmoid to input three load data X1, X2, and X3 each time (the number of load data input each time is the number of historical load data that affects the current load data) to obtain an output value Y predicted value. In order to obtain the real load forecast results, the Y forecast value needs to be denormalized.
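A minimal Keras sketch of such a stacked LSTM model is shown below, assuming the configuration described in this embodiment (three inputs per sample, sigmoid activation, return_sequences=True on all but the last LSTM layer, and the three hidden layers of 64 neurons mentioned later); the optimizer and loss are illustrative choices, not specified by the patent:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

def build_model(timesteps=3, features=1):
    """Stacked LSTM RNN: several hidden LSTM layers and a single-neuron output layer."""
    model = Sequential([
        # Every LSTM layer except the last returns the full sequence
        # (return_sequences=True), so the next LSTM layer receives one
        # output per time step rather than only the final output.
        LSTM(64, activation='sigmoid', return_sequences=True,
             input_shape=(timesteps, features)),
        LSTM(64, activation='sigmoid', return_sequences=True),
        LSTM(64, activation='sigmoid'),  # last LSTM layer: output at the final time step only
        Dense(1),                        # output layer with one neuron: the predicted load Y
    ])
    model.compile(optimizer='adam', loss='mse')
    return model
```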
[0027] In addition, it should be noted that the normalized load data is transformed into a shape suitable for the LSTM neural network. For example, assume the normalized load data, arranged in chronological order, is A, B, C, D, E, F, ..., that three historical load data values affect the current load data value, and that the step size is 1. The reshaped load data is then ((A, B, C), (B, C, D), (C, D, E), (D, E, F), ...): starting from the leftmost data value, every three consecutive values form one input to the multi-layer LSTM RNN load forecasting model, and each subsequent input is the previous input shifted to the right by one step. For example, (B, C, D) is obtained by shifting (A, B, C) one unit to the right, (C, D, E) by shifting (B, C, D) one unit to the right, and (D, E, F) by shifting (C, D, E) one unit to the right.
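A short Python sketch of this shape transformation, assuming a 1-D sequence of normalized load values and the window length of three described above (the helper name make_windows is illustrative):

```python
import numpy as np

def make_windows(series, look_back=3):
    """Turn a sequence A, B, C, D, ... into inputs (A,B,C), (B,C,D), ...,
    each paired with the value that follows the window as the target."""
    X, y = [], []
    for i in range(len(series) - look_back):
        X.append(series[i:i + look_back])
        y.append(series[i + look_back])
    # Reshape to (samples, timesteps, features), the 3-D input the LSTM expects.
    return np.array(X).reshape(-1, look_back, 1), np.array(y)
```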
[0028] The multi-layer LSTM RNN-based communication capability open platform load forecasting method provided by the embodiment of the present invention obtains the load data of the communication capability open platform in a preset time period before the time period to be predicted, normalizes the load data, transforms the normalized load data into a shape suitable for the LSTM neural network, inputs the transformed load data into the pre-built and trained multi-layer LSTM RNN load prediction model, and performs inverse normalization on the output result to obtain the load prediction result of the time period to be predicted. This solution can accurately predict the load of the communication capability open platform.
[0029] On the basis of the foregoing method embodiment, before inputting the transformed load data into the pre-built multi-layer LSTM RNN load prediction model, the method may further include:
[0030] Training the pre-built multi-layer LSTM RNN load prediction model using the hourly API call volume and hourly peak TPS over a historical period at hourly granularity, to obtain a trained multi-layer LSTM RNN load forecasting model for hourly API call volume prediction and a trained multi-layer LSTM RNN load forecasting model for hourly peak TPS prediction;
[0031] Wherein, inputting the transformed load data into the pre-built multi-layer LSTM RNN load prediction model includes:
[0032] Inputting the reshaped hourly API call volume into the trained multi-layer LSTM RNN load prediction model for hourly API call volume prediction, and inputting the reshaped hourly peak TPS into the trained multi-layer LSTM RNN load forecasting model for hourly peak TPS prediction.
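As an illustrative sketch of this prediction step, the function below feeds the most recent values to an already-trained model and inverse-normalizes the output to obtain the real load forecast; the model and fitted scaler are assumed to come from a training flow such as the one sketched after the next paragraph:

```python
import numpy as np

def predict_next_hour(model, scaler, recent_values, look_back=3):
    """Predict the next hourly load value from the most recent observations,
    then map the model output back from [0, 1] to the real load scale."""
    normalized = scaler.transform(np.asarray(recent_values, dtype=float).reshape(-1, 1))
    x = normalized[-look_back:].reshape(1, look_back, 1)   # (samples, timesteps, features)
    y_normalized = model.predict(x, verbose=0)             # predicted value Y in [0, 1]
    return float(scaler.inverse_transform(y_normalized)[0, 0])
```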
[0033] In the embodiment of the present invention, the process of training the pre-built multi-layer LSTM RNN load prediction model using the hourly API call volume over a historical period at hourly granularity, to obtain a trained multi-layer LSTM RNN load forecasting model for hourly API call volume prediction, is as follows: normalize the hourly API call volume over the historical period, transform the normalized hourly API call volume into a shape suitable for the LSTM neural network, and use the reshaped hourly API call volume to train the pre-built multi-layer LSTM RNN load prediction model, thereby obtaining a trained multi-layer LSTM RNN load prediction model for hourly API call volume prediction. The process of obtaining the trained multi-layer LSTM RNN load prediction model for hourly peak TPS prediction is the same as that for hourly API call volume prediction and will not be repeated here.
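Putting the earlier sketches together, the following hedged Python sketch trains one model per metric in the way this paragraph describes; the epoch count, batch size, and input series names are illustrative assumptions, and make_windows and build_model refer to the helpers sketched above:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

def train_load_model(history, look_back=3, epochs=100, batch_size=32):
    """Normalize an hourly load series, reshape it into windows, and fit one model."""
    scaler = MinMaxScaler(feature_range=(0, 1))
    normalized = scaler.fit_transform(np.asarray(history, dtype=float).reshape(-1, 1)).ravel()
    X, y = make_windows(normalized, look_back)   # sliding windows, see earlier sketch
    model = build_model(timesteps=look_back)     # stacked LSTM, see earlier sketch
    model.fit(X, y, epochs=epochs, batch_size=batch_size, verbose=0)
    return model, scaler

# One trained model per metric, as the embodiment describes
# (the history arrays are placeholders for data loaded from the platform):
# api_model, api_scaler = train_load_model(hourly_api_call_history)
# tps_model, tps_scaler = train_load_model(hourly_peak_tps_history)
```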
[0034] On the basis of the foregoing method embodiment, the multi-layer LSTM RNN load prediction model includes 1 input layer, 3 LSTM hidden layers, and 1 output layer. Each LSTM hidden layer contains 64 LSTM neurons, and the output layer contains 1 neuron.
[0035] On the basis of the foregoing method embodiment, the method may further include:
[0036] Comparing the load prediction result of the time period to be predicted with the corresponding threshold; according to the comparison result, if the load prediction result is less than or equal to the corresponding threshold, the user is prompted that the predicted value is normal; otherwise, an early warning is given.
[0037] In the embodiment of the present invention, the comparison process is specifically as follows: the obtained API call volume prediction result is compared with a preset API call volume threshold, and if it exceeds the threshold, an early warning is given to prompt the user to promptly confirm whether calls are abnormal or failing in large numbers. Similarly, the peak TPS prediction result is compared with a preset peak TPS threshold; if the threshold is exceeded, an early warning is given to prompt the user to temporarily rate-limit the relevant northbound applications, and capacity expansion as soon as possible is recommended, otherwise platform performance will be affected. This improves the risk-prevention level of the operator's capability open platform and meets its need to respond to complex, changeable, and volatile mobile Internet applications, so that the behavior of mobile Internet applications is no longer uncontrollable and unpredictable.
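A small Python sketch of this comparison and early-warning step; the threshold values and message wording are illustrative assumptions rather than values specified by the patent:

```python
def check_forecast(predicted_api_calls, predicted_peak_tps,
                   api_threshold, tps_threshold):
    """Compare forecasts with their thresholds and collect early-warning messages."""
    warnings = []
    if predicted_api_calls > api_threshold:
        warnings.append("API call volume forecast exceeds the threshold: "
                        "confirm promptly whether calls are abnormal or failing in large numbers.")
    if predicted_peak_tps > tps_threshold:
        warnings.append("Peak TPS forecast exceeds the threshold: "
                        "consider rate-limiting the relevant northbound applications "
                        "and expanding capacity as soon as possible.")
    return warnings or ["Predicted values are normal."]
```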
[0038] Referring to Figure 3, this embodiment discloses a communication capability open load forecasting device based on a multi-layer LSTM RNN, including:
[0039] A preprocessing unit 1, used to obtain load data of the communication capability open platform in a preset time period before the time period to be predicted, normalize the load data, and transform the normalized load data into a shape suitable for the LSTM neural network, wherein the load data includes the hourly API call volume and the hourly peak TPS at hourly granularity;
[0040] A prediction unit 2, used to input the transformed load data into the pre-built and trained multi-layer LSTM RNN load prediction model and perform denormalization on the output result to obtain the load prediction result of the time period to be predicted, wherein the multi-layer LSTM RNN load prediction model includes an input layer, an output layer, and at least two stacked LSTM hidden layers.
[0041] Specifically, the preprocessing unit 1 obtains the load data of the communication capability open platform in a preset time period before the time period to be predicted, normalizes the load data, and transforms the normalized load data into a shape suitable for the LSTM neural network, wherein the load data includes the hourly API call volume and the hourly peak TPS at hourly granularity; the prediction unit 2 inputs the transformed load data into the pre-built and trained multi-layer LSTM RNN load prediction model and denormalizes the output result to obtain the load prediction result of the time period to be predicted, wherein the multi-layer LSTM RNN load prediction model includes an input layer, an output layer, and at least two stacked LSTM hidden layers.
[0042] It is understandable that normalizing the load data means scaling the load data so that it falls within the interval [0,1]; normalization increases the convergence speed of the model and improves its accuracy.
[0043] The multi-layer LSTM RNN-based communication capability open platform load forecasting device provided by the embodiment of the present invention obtains the load data of the communication capability open platform in a preset time period before the time period to be predicted, normalizes the load data, transforms the normalized load data into a shape suitable for the LSTM neural network, inputs the transformed load data into the pre-built and trained multi-layer LSTM RNN load prediction model, and performs inverse normalization on the output result to obtain the load prediction result of the time period to be predicted. This solution can accurately predict the load of the communication capability open platform.
[0044] On the basis of the foregoing device embodiment, the device may further include:
[0045] A training unit, used to train the pre-built multi-layer LSTM RNN load prediction model using the hourly API call volume and hourly peak TPS over a historical period at hourly granularity before the prediction unit works, to obtain a trained multi-layer LSTM RNN load forecasting model for hourly API call volume prediction and a trained multi-layer LSTM RNN load forecasting model for hourly peak TPS prediction;
[0046] Wherein, the prediction unit is specifically used for:
[0047] Inputting the reshaped hourly API call volume into the trained multi-layer LSTM RNN load prediction model for hourly API call volume prediction, and inputting the reshaped hourly peak TPS into the trained multi-layer LSTM RNN load forecasting model for hourly peak TPS prediction.
[0048] In the embodiment of the present invention, the process by which the training unit trains the pre-built multi-layer LSTM RNN load prediction model using the hourly API call volume over a historical period at hourly granularity, to obtain a trained multi-layer LSTM RNN load forecasting model for hourly API call volume prediction, is as follows: normalize the hourly API call volume over the historical period, transform the normalized hourly API call volume into a shape suitable for the LSTM neural network, and use the reshaped hourly API call volume to train the pre-built multi-layer LSTM RNN load prediction model, thereby obtaining a trained multi-layer LSTM RNN load prediction model for hourly API call volume prediction. The process by which the training unit obtains the trained multi-layer LSTM RNN load forecasting model for hourly peak TPS prediction is the same as that for hourly API call volume prediction and will not be repeated here.
[0049] On the basis of the foregoing device embodiment, the multi-layer LSTM RNN load prediction model includes 1 input layer, 3 LSTM hidden layers, and 1 output layer. Each LSTM hidden layer contains 64 LSTM neurons, and the output layer contains 1 neuron.
[0050] On the basis of the foregoing device embodiment, the device may further include:
[0051] A comparison unit, used to compare the load prediction result of the time period to be predicted with the corresponding threshold; according to the comparison result, if the load prediction result is less than or equal to the corresponding threshold, the user is prompted that the predicted value is normal; otherwise, an early warning is given.
[0052] The multi-layer LSTM RNN-based communication capability open load forecasting apparatus of this embodiment can be used to implement the technical solutions of the foregoing method embodiments; its implementation principles and technical effects are similar and will not be repeated here.
[0053] Figure 4 shows a schematic diagram of the physical structure of an electronic device provided by an embodiment of the present invention. As shown in Figure 4, the electronic device may include: a processor 11, a memory 12, a bus 13, and a computer program stored on the memory 12 and runnable on the processor 11;
[0054] Wherein, the processor 11 and the memory 12 communicate with each other through the bus 13;
[0055] When the processor 11 executes the computer program, the method provided in the foregoing method embodiments is implemented, for example including: obtaining the load data of the communication capability open platform in a preset time period before the time period to be predicted, normalizing the load data, and transforming the normalized load data into a shape suitable for the LSTM neural network, where the load data includes the hourly API call volume and the hourly peak TPS at hourly granularity; inputting the transformed load data into the pre-built and trained multi-layer LSTM RNN load prediction model, and performing denormalization on the output result to obtain the load prediction result of the time period to be predicted, wherein the multi-layer LSTM RNN load prediction model includes an input layer, an output layer, and at least two stacked LSTM hidden layers.
[0056] The embodiment of the present invention provides a non-transitory computer-readable storage medium on which a computer program is stored. When the computer program is executed by a processor, the method provided in the foregoing method embodiments is implemented, for example including: obtaining the load data of the communication capability open platform in a preset time period before the time period to be predicted, normalizing the load data, and transforming the normalized load data into a shape suitable for the LSTM neural network, where the load data includes the hourly API call volume and the hourly peak TPS at hourly granularity; inputting the transformed load data into the pre-built and trained multi-layer LSTM RNN load forecasting model, and performing inverse normalization on the output result to obtain the load prediction result of the time period to be predicted, wherein the multi-layer LSTM RNN load prediction model includes an input layer, an output layer, and at least two stacked LSTM hidden layers.
[0057] Those skilled in the art should understand that the embodiments of the present application can be provided as methods, systems, or computer program products. Therefore, this application may adopt the form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware. Moreover, this application may adopt the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer-usable program codes.
[0058] This application is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to embodiments of this application. It should be understood that each process and/or block in the flowcharts and/or block diagrams, and combinations of processes and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions can be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor, or other programmable data processing equipment to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing equipment produce a device for implementing the functions specified in one or more processes of the flowchart and/or one or more blocks of the block diagram.
[0059] These computer program instructions can also be stored in a computer-readable memory that can direct a computer or other programmable data processing equipment to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device that implements the functions specified in one or more processes of the flowchart and/or one or more blocks of the block diagram.
[0060] These computer program instructions can also be loaded onto a computer or other programmable data processing equipment, so that a series of operating steps are executed on the computer or other programmable equipment to produce computer-implemented processing, such that the instructions executed on the computer or other programmable equipment provide steps for implementing the functions specified in one or more processes of the flowchart and/or one or more blocks of the block diagram.
[0061] It should be noted that, herein, relational terms such as first and second are only used to distinguish one entity or operation from another entity or operation, and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "comprise", "include", or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device. Without further restrictions, an element defined by the phrase "including a..." does not exclude the existence of other identical elements in the process, method, article, or device that includes the element. The orientations or positional relationships indicated by the terms "upper" and "lower" are based on the orientations or positional relationships shown in the drawings, are only for convenience of describing the present invention and simplifying the description, and do not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation; they therefore cannot be understood as limiting the present invention. Unless otherwise clearly specified and limited, the terms "installation", "connection", and "connected" should be interpreted broadly: a connection can be fixed, detachable, or integral; it can be mechanical or electrical; it can be direct, indirect through an intermediate medium, or an internal communication between two components. For those of ordinary skill in the art, the specific meanings of the above terms in the present invention can be understood according to the specific circumstances.
[0062] In the specification of the present invention, a large number of specific details are described. However, it can be understood that the embodiments of the present invention can be practiced without these specific details. In some instances, well-known methods, structures, and technologies are not shown in detail so as not to obscure the understanding of this specification. Similarly, it should be understood that, in order to simplify the disclosure and to aid understanding of one or more of the various inventive aspects, in the above description of exemplary embodiments of the present invention, various features of the present invention are sometimes grouped together into a single embodiment, figure, or description thereof. However, this method of disclosure should not be interpreted as reflecting an intention that the claimed invention requires more features than those explicitly recited in each claim. Rather, as reflected in the claims, the inventive aspect lies in less than all the features of a single previously disclosed embodiment. Therefore, the claims following the detailed description are hereby explicitly incorporated into the detailed description, with each claim standing on its own as a separate embodiment of the present invention. It should be noted that the embodiments of the application and the features of the embodiments can be combined with each other as long as there is no conflict. The present invention is not limited to any single aspect or any single embodiment, nor to any combination and/or replacement of these aspects and/or embodiments. Moreover, each aspect and/or embodiment of the present invention may be used alone or in combination with one or more other aspects and/or embodiments.
[0063] Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they can still modify the technical solutions described in the foregoing embodiments, or equivalently replace some or all of the technical features; these modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present invention, and shall be included within the scope of the claims and specification of the present invention.
Similar technology patents
Abnormal load data detection and correction method and system based on model optimization
Active, CN112733417A, Accurate Load Forecasting, Planned Power Management
Owner: NANJING UNIV OF POSTS & TELECOMM