Application method of resource process state management based on a state-driven engine

A process state and driving engine technology, applied to special data processing applications, electrical digital data processing, instruments, etc., to achieve flexible support for application scenarios

Inactive Publication Date: 2016-08-03
INSPUR TIANYUAN COMM INFORMATION SYST CO LTD
Cites: 2 | Cited by: 2

AI-Extracted Technical Summary

Problems solved by technology

[0004] The present invention provides an application method of resource process state management based on ...

Abstract

The invention discloses an application method of resource process state management based on a state-driven engine, and relates to the management of states and resource life cycles handled by a network management system in a mobile communication network. According to the application method, when the life cycle reaches a major state node, a resource exists in two business states simultaneously through the triggering of a transient-state condition; resource process state management is carried out with a management table and mainly comprises two steps, cache penetration and cache concurrency; flexible application-field support is realized by judging the scenario and the life-node states.

Application Domain

Special data processing applications; Memory systems

Technology Topic

Process state; Network management; +5

Examples

  • Experimental program (1)

Example Embodiment

[0015] Example:
[0016] The application method for resource process state management based on a state-driven engine in this embodiment uses a management table for resource process state management and comprises two steps, cache penetration and cache concurrency. The cache penetration step works as follows: the cache is checked first for the primary state of the resource; if it is present, the cached content is returned directly; otherwise the database is queried, the transition-state scenario is then queried, and the result is returned. If a queried primary-node resource does not exist, nothing is fed back to the cache, so every call falls through to the database to query the transition-scene state.
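A minimal sketch of the cache penetration step in Python follows; the cache/db interfaces (cache.get, cache.set, db.query_primary_state, db.query_transition_states) are hypothetical names, not from the patent. A hit returns the cached content directly; a miss queries the database and then the transition-state scenario; a nonexistent resource is recorded with a negative entry so that repeated calls stop falling through to the database.

    _NEGATIVE = object()  # sentinel meaning "known to be absent from the database"

    def get_primary_state(cache, db, resource_id):
        # Sketch only: the cache/db interfaces are assumed, not from the patent.
        cached = cache.get(resource_id)
        if cached is _NEGATIVE:
            return None                        # known miss: skip the database
        if cached is not None:
            return cached                      # primary state cached: return it directly
        primary = db.query_primary_state(resource_id)
        if primary is None:
            cache.set(resource_id, _NEGATIVE)  # remember the miss to stop penetration
            return None
        # query the transition-state scenario for this primary node, then cache
        primary["transitions"] = set(db.query_transition_states(resource_id))
        cache.set(resource_id, primary)
        return primary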
[0017] The application method for resource process state management described in this embodiment then handles cache concurrency. If the cache is invalid, multiple threads concurrently call the database to query the main state of the resource and place it in the cache, so the same or different main states of a resource recur in the cache. In addition, the subordinate transition states of each main state form a set.
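Because the paragraph above notes that concurrent threads may each query the database and insert duplicates, one common mitigation is a per-resource lock so that only one thread rebuilds an expired entry while the others wait and reread the cache. A sketch, with the same assumed cache/db interfaces:

    import threading

    _locks_guard = threading.Lock()
    _locks = {}  # per-resource locks; an unbounded map is acceptable only in a sketch

    def get_state_single_flight(cache, db, resource_id):
        state = cache.get(resource_id)
        if state is not None:
            return state
        with _locks_guard:
            lock = _locks.setdefault(resource_id, threading.Lock())
        with lock:
            state = cache.get(resource_id)     # re-check: another thread may have filled it
            if state is None:
                state = db.query_primary_state(resource_id)
                # keep the subordinate transition states as a set, per the text
                state["transitions"] = set(db.query_transition_states(resource_id))
                cache.set(resource_id, state)  # only one thread writes: no duplicates
        return state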
[0018] The management table includes four linked lists. The first two are straightforward: one is the LRU linked list, which represents the most recently used pages; the other is the LFU linked list, which represents the most frequently used pages. The other two linked lists store information about recently evicted pages and are called ghost linked lists: the LRU-ghost linked list stores information on pages recently evicted from the most-recently-used list, and the LFU-ghost linked list stores information on pages recently evicted from the most-frequently-used list.
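The four lists can be modelled as follows; the names t1/t2/b1/b2 and the adaptive target p come from the usual ARC formulation and are assumptions beyond the patent text. OrderedDict stands in for a doubly linked list with O(1) head/tail operations.

    from collections import OrderedDict

    class ManagementTable:
        def __init__(self, capacity):
            self.capacity = capacity
            self.p = 0               # adaptive target size of the LRU side
            self.t1 = OrderedDict()  # LRU list: pages used once recently
            self.t2 = OrderedDict()  # LFU list: pages used at least twice
            self.b1 = OrderedDict()  # LRU-ghost: keys recently evicted from t1
            self.b2 = OrderedDict()  # LFU-ghost: keys recently evicted from t2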
[0019] With the application method of resource process state management described in this embodiment, a resource in the cache that is accessed at least twice is moved from the LRU linked list into the LFU linked list. If a page in the LFU list is accessed again, it is placed at the head of the LFU list (most frequently used), so pages that really are accessed frequently stay in the cache, while pages accessed infrequently drift toward the tail of the list and are eventually evicted. As time goes on, the two lists fill up and the cache fills with them; when the cache is full and an uncached page is read, a page must be evicted to make room for the new one. A page just evicted from the cache is not referenced by any non-ghost linked list. If the LRU linked list is full, its least recently used page is evicted and placed into the LRU-ghost linked list.
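Continuing the sketch, eviction follows the paragraph above: the least recently used page of the over-target list is dropped and its key is remembered in the corresponding ghost list. The rule for choosing the side using p is the standard ARC one, an assumption here.

        def evict(self):
            # Drop one page to make room; remember its key in a ghost list.
            if self.t1 and len(self.t1) > self.p:
                key, _ = self.t1.popitem(last=False)  # least recently used page
                self.b1[key] = None                   # into the LRU-ghost list
            elif self.t2:
                key, _ = self.t2.popitem(last=False)  # least valuable LFU page
                self.b2[key] = None                   # into the LFU-ghost list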
[0020] If a hit occurs in the LFU-ghost linked list, the length of the LRU linked list is reduced and a free slot is added to the LFU linked list. In this way the ARC algorithm adapts to the workload: if the workload tends to access recently used files, more hits occur in the LRU-ghost list and the LRU cache space grows; if the workload tends to access frequently used files, more hits occur in the LFU-ghost list and the LFU cache space grows.
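The adaptation itself can be sketched as a shift of the target p on a ghost hit; the step sizes follow the classic ARC algorithm and are assumptions beyond the patent text.

        def on_ghost_hit(self, key):
            if key in self.b1:    # recency-leaning workload: grow the LRU space
                self.p = min(self.capacity, self.p + max(1, len(self.b2) // len(self.b1)))
                del self.b1[key]
            elif key in self.b2:  # frequency-leaning workload: grow the LFU space
                self.p = max(0, self.p - max(1, len(self.b1) // len(self.b2)))
                del self.b2[key]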
[0021] In the application method of resource process state management described in this embodiment, the subordinate transition states of each main state form a set, and the set sits between the LRU and the LFU. To improve the effect, the set consists of two LRUs: the first, called L1, contains entries that have recently been used only once, while the second, called L2, contains entries that have recently been used twice; that is, L1 holds new objects and L2 holds commonly used objects. Matching is performed against the cached object model, and the unique transition-state matching result object is output to make a procedure call.
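A sketch of the per-main-state set built from the two LRUs described above; the shared capacity and the promotion rule (a second access moves an entry from L1 to L2) are assumptions where the text leaves details open.

    from collections import OrderedDict

    class TransitionStateSet:
        def __init__(self, capacity):
            self.capacity = capacity
            self.l1 = OrderedDict()  # new objects: used once recently
            self.l2 = OrderedDict()  # common objects: used at least twice

        def access(self, key, value=None):
            if key in self.l2:            # already common: refresh its recency
                self.l2.move_to_end(key)
                return self.l2[key]
            if key in self.l1:            # second use: promote from L1 to L2
                promoted = self.l1.pop(key)
                self.l2[key] = promoted
                if len(self.l2) > self.capacity:
                    self.l2.popitem(last=False)
                return promoted
            self.l1[key] = value          # first use: enters L1
            if len(self.l1) > self.capacity:
                self.l1.popitem(last=False)
            return value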

