Managing a succession of deployments of an application programming interface (API) server configuration in the software development lifecycle

A pre-start predictor technology, applied in the areas of program loading/starting, instrumentation, and computation, that can solve problems such as insufficient responsiveness for users, failure to meet expected performance, and limited effectiveness of existing solutions.

Active Publication Date: 2016-03-23
MICROSOFT TECH LICENSING LLC

AI Technical Summary

Problems solved by technology

While these solutions improve the startup time of the application, they may not be sufficient to give the user sufficient responsiveness or desired performance, and may not be effective when cache space is limited.



Examples


Embodiment 1

[0091] Example 1: Overall Rate Predictor

[0092] If the predictor has observed the user for a duration d and seen the event occur n times, a natural estimate of the rate is n/d. If t_0 is the time at which the predictor starts observing the user, t_i is the time of the i-th observation of the target event, and t is the current time, then the predictor can take:

[0093] n(t) = the number of observed event times t_i with t_i ≤ t, and d(t) = t − t_0,

[0094] giving

[0095] r(t) = n(t) / d(t).
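
As a concrete illustration, a minimal Python sketch of such an overall rate predictor could look as follows; the class and method names are illustrative assumptions rather than terms from the patent.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class OverallRatePredictor:
    """Overall rate estimate r(t) = n(t) / d(t), where n(t) is the number of
    target events observed up to time t and d(t) = t - t0 is the span over
    which the user has been observed (illustrative sketch)."""
    t0: float                                      # time observation of the user began
    event_times: List[float] = field(default_factory=list)

    def observe(self, t_i: float) -> None:
        """Record one occurrence of the event of interest at time t_i."""
        self.event_times.append(t_i)

    def rate(self, t: float) -> float:
        """Estimated events per unit time at the current time t."""
        n = sum(1 for t_i in self.event_times if t_i <= t)   # n(t)
        d = t - self.t0                                       # d(t)
        return n / d if d > 0 else 0.0
```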

Embodiment 2

[0096] Example 2: Per-Context Rate Predictor

[0097] Assume the predictor knows that the rate can differ across contexts. For example, suppose the rate varies by day of the week. Then, if t = "13:21.02, Friday, May 31, 2013", one might use:

[0098] r(t) = n_Friday(t) / d_Friday(t).

[0099] If the rate of the events of interest differs within the first minute after login, one might use:

[0100] r(t) = n_first-minute(t) / d_first-minute(t).

[0101] In general, if the context at time t (e.g., the day of the week, the time of day, or whether t falls within or after a certain time since login) is c(t), then it is possible to compute, for each context c (e.g., Sunday, Monday, ..., Saturday), the number of times the event of interest was observed in that context, denoted n_c(t), and the total duration over which the user was observed in that context, denoted d_c(t), and to use:

[0102] r(t) = n_{c(t)}(t) / d_{c(t)}(t).

[0103] If I_c(t) is an indicator function equal to 1 when the context at time t is c, and 0 otherwise, one might use:

[0104] ...
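
A per-context variant can be sketched along the same lines. Here the context function c(t), the class name, and the simplifying assumption that each observed span lies entirely within one context are all assumptions made for illustration.

```python
import datetime
from collections import defaultdict

class PerContextRatePredictor:
    """Keeps a separate count n_c and observed duration d_c for each context c
    and estimates r(t) = n_{c(t)}(t) / d_{c(t)}(t) (illustrative sketch)."""

    def __init__(self, context_of):
        self.context_of = context_of          # c(t): maps a time to its context
        self.counts = defaultdict(int)        # n_c
        self.durations = defaultdict(float)   # d_c

    def observe_event(self, t):
        """Record an occurrence of the event of interest at time t."""
        self.counts[self.context_of(t)] += 1

    def observe_span(self, start, end):
        """Credit the observed span [start, end] to its context; assumes the
        span does not straddle a context boundary (a fuller implementation
        would split it at boundaries)."""
        self.durations[self.context_of(start)] += end - start

    def rate(self, t):
        c = self.context_of(t)
        d = self.durations[c]
        return self.counts[c] / d if d > 0 else 0.0

# e.g. contexts keyed by day of the week, as in the "Friday" example above
day_of_week = lambda t: datetime.datetime.fromtimestamp(t).strftime("%A")
predictor = PerContextRatePredictor(day_of_week)
```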

Embodiment 3

[0105] Example 3: Decayed Rate Predictor

[0106] As t increases, the counts n_c(t) and durations d_c(t) grow ever larger and become dominated by increasingly outdated user behavior. To overcome this problem, recency weighting may be introduced so that more recent behavior is weighted more heavily in the counts and durations. One option is exponential weighting, in which, when estimating the counts and durations at time t, behavior that occurred a time Δ earlier is given weight e^(−λΔ) for some decay rate λ. In this case it is possible to use:

[0107] n_c(t) = Σ_i I_c(t_i) · e^(−λ(t − t_i)), summed over observed event times t_i ≤ t,

[0108] d_c(t) = ∫ from t_0 to t of I_c(s) · e^(−λ(t − s)) ds,

[0109] and, as before, r(t) = n_{c(t)}(t) / d_{c(t)}(t).
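
Under the same simplifying assumptions, the exponentially decayed counts and durations above could be computed roughly as in the following sketch; the decay rate lam and all identifiers are illustrative, not values from the patent.

```python
import math

class DecayedRatePredictor:
    """Per-context rate estimate with exponential recency weighting: behavior
    that happened a time delta earlier contributes weight exp(-lam * delta)
    to both the count n_c(t) and the duration d_c(t) (illustrative sketch)."""

    def __init__(self, context_of, lam):
        self.context_of = context_of   # c(t)
        self.lam = lam                 # decay rate (assumed parameter)
        self.event_times = []          # observed event times t_i
        self.spans = []                # (start, end) spans the user was observed

    def observe_event(self, t):
        self.event_times.append(t)

    def observe_span(self, start, end):
        self.spans.append((start, end))

    def rate(self, t):
        c = self.context_of(t)
        # decayed count: sum of exp(-lam * (t - t_i)) over events in context c
        n = sum(math.exp(-self.lam * (t - ti))
                for ti in self.event_times
                if ti <= t and self.context_of(ti) == c)
        # decayed duration: integral of exp(-lam * (t - s)) over observed time
        # in context c, assuming each span lies wholly within one context
        d = 0.0
        for start, end in self.spans:
            end = min(end, t)
            if end <= start or self.context_of(start) != c:
                continue
            d += (math.exp(-self.lam * (t - end)) -
                  math.exp(-self.lam * (t - start))) / self.lam
        return n / d if d > 0 else 0.0
```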

[0110] As events (in this case, app switches), context changes (e.g., a change of day), and queries (i.e., the PLM querying the predictor) occur moving forward in time, these counts, durations, and rates can be computed, as illustrated in flowchart form in Figure 7 and repeated here as follows:

[0111] 1. For each application (app) and context (context): set duration[app, context]=0, set count[app, con...
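
In the spirit of that flowchart, the bookkeeping might look roughly like the sketch below. The table names duration[app, context] and count[app, context] follow the step quoted above; the update functions themselves are assumptions, and the non-decayed form is used for brevity.

```python
from collections import defaultdict

duration = defaultdict(float)   # duration[(app, context)]
count = defaultdict(int)        # count[(app, context)]

def on_time_elapsed(app, context, dt):
    """Credit dt units of observed time to the currently foreground app/context."""
    duration[(app, context)] += dt

def on_app_switch(app, context):
    """Record one occurrence of the event of interest (a switch to `app`)."""
    count[(app, context)] += 1

def query_rate(app, context):
    """Rate estimate returned when the PLM queries the predictor."""
    d = duration[(app, context)]
    return count[(app, context)] / d if d > 0 else 0.0
```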



Abstract

Example methods, systems, and techniques of managing a succession of deployments of an application programming interface (API) server configuration are provided. An example method includes defining a first configuration of the API server. The first configuration includes a deployment package that encodes at least policy, listener and external connection components of the defined first configuration together with environment settings particular to operation of the API server in a development environment deployment thereof. The method also includes preparing a second configuration of the API server. The second configuration includes (i) a policy package derived from the first configuration and (ii) a separable environment package particular to a testing environment deployment of the API server. The method further includes preparing a third configuration of the API server. The third configuration includes (i) the derived policy package and (ii) a separable environment package particular to a production environment deployment of the API server.
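
Although the abstract describes packaging rather than an algorithm, the relationship between the three configurations can be sketched as plain data structures; every class and field name below is an illustrative assumption, not terminology defined by the patent.

```python
from dataclasses import dataclass

@dataclass
class DeploymentPackage:
    """First (development) configuration: policy, listener and external
    connection components encoded together with development settings."""
    policy: str
    listeners: list
    external_connections: list
    environment_settings: dict

@dataclass
class PolicyPackage:
    """Environment-independent part derived from the development package."""
    policy: str
    listeners: list
    external_connections: list

@dataclass
class EnvironmentPackage:
    """Separable settings particular to one deployment environment."""
    environment: str
    settings: dict

def derive_policy_package(pkg: DeploymentPackage) -> PolicyPackage:
    return PolicyPackage(pkg.policy, pkg.listeners, pkg.external_connections)

# Development -> testing -> production succession: the derived policy package
# is reused unchanged, paired with a separable environment package each time.
dev = DeploymentPackage("default-policy", ["http-listener"], ["backend-db"],
                        {"debug": True})
policy = derive_policy_package(dev)
testing = (policy, EnvironmentPackage("testing", {"debug": False}))
production = (policy, EnvironmentPackage("production", {"debug": False}))
```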

Description

Background technique

[0001] Starting an application is an expensive operation due to the resources used during initialization. This problem may be further exacerbated by the proliferation of low-cost devices, which tend to have more hardware constraints than traditional desktops. In the past, solutions such as caching have been used to speed up application startup. While these solutions improve the startup time of the application, they may not be sufficient to give the user sufficient responsiveness or desired performance, and may not be effective when cache space is limited.

Contents of the invention

[0002] The following presents a simplified overview of the invention in order to provide a basic understanding of some aspects described herein. This summary is not an extensive overview of the claimed subject matter. It is intended neither to identify key or essential elements of the claimed subject matter nor to delineate the scope of the subject invention. Its sole purpose i...


Application Information

IPC(8): G06F9/445
CPC: G06F9/445; G06N5/02; G06F9/44578; G06F9/485
Inventor: A.比拉尔, M.伊根, M.克拉尔, C.克利恩汉斯, H.普拉帕卡, A.基尚, A.古纳瓦达纳, P.科赫, C.米克, E.霍尔维奇, R.卡鲁亚纳, M.富丁
Owner MICROSOFT TECH LICENSING LLC