Transparent computing server cache optimization method and system based on association pattern

A technology combining transparent computing and association-pattern mining, applied in memory systems, computing, instruments, etc. It addresses problems such as the load placed on the network and other service resources, and achieves the effect of avoiding unnecessary cost.

Active Publication Date: 2017-12-15
CENT SOUTH UNIV

AI Technical Summary

Problems solved by technology

In the case of serving multiple clients, when a large number of users access the transparent computing server ...



Example Embodiment

[0059] Example 1

[0060] This embodiment discloses a method for optimizing the cache of a transparent computing server based on an association pattern, relying on an improved FP-Stream. The method includes:

[0061] Step S1: Process the data stream accessed by users in batches, scan the data set corresponding to each batch, record the transaction items in each batch's data set that meet the screening conditions, filter the data blocks whose support count is greater than or equal to τ·(σ−ε)·|B_i|, and construct an FP-tree for the data stream B_n of each batch n ≥ 2; where σ is the minimum support, ε is the maximum support error, and |B_i| denotes the width of the data stream of batch i.

[0062] In this step, the construction of the FP-tree for the data stream B_n of batches n ≥ 2 also uses the support coefficient τ to filter the original data stream, avoiding the time and space cost of processing infrequently accessed data blocks. Parameters such as σ, ε and |B_i| above c...
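The batch-filtering step above can be sketched in Python. This is a minimal illustration, not the patent's implementation: the helper name `filter_batch` and the list-of-transactions data layout are assumptions; only the threshold formula τ·(σ−ε)·|B_i| follows the text.

```python
from collections import Counter

def filter_batch(batch, tau, sigma, epsilon):
    """Keep only data blocks whose support count in this batch is at least
    tau * (sigma - epsilon) * |B_i|, per the patent's Step S1 (illustrative
    helper; parameter names follow the patent's notation)."""
    width = len(batch)                         # |B_i|: number of transactions in batch i
    threshold = tau * (sigma - epsilon) * width
    # Count in how many transactions each data block appears
    counts = Counter(block for txn in batch for block in set(txn))
    keep = {b for b, c in counts.items() if c >= threshold}
    # Drop infrequent blocks from each transaction before FP-tree construction
    return [[b for b in txn if b in keep] for txn in batch]

batch = [["a", "b", "c"], ["a", "b"], ["a", "d"], ["b", "c"]]
# With tau=1.0, sigma=0.6, epsilon=0.1: threshold = 0.5 * 4 = 2, so "d" (count 1) is dropped
print(filter_batch(batch, tau=1.0, sigma=0.6, epsilon=0.1))
# → [['a', 'b', 'c'], ['a', 'b'], ['a'], ['b', 'c']]
```

The filtered transactions would then be inserted into the FP-tree for that batch.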

Example Embodiment

[0087] Example 2

[0088] Corresponding to the above-mentioned Embodiment 1, this embodiment discloses a transparent computing server cache optimization system based on an association pattern, including:

[0089] The first processing unit is used to process the data stream accessed by users in batches, scan the data set corresponding to each batch, record the transaction items in each batch's data set that meet the filtering conditions, filter the data blocks whose support count is greater than or equal to τ·(σ−ε)·|B_i|, and construct an FP-tree for the data stream B_n of each batch n ≥ 2; where σ is the minimum support, ε is the maximum support error, and |B_i| denotes the width of the data stream of batch i;

[0090] The second processing unit is used to mine the frequent patterns and support count information of each batch of the data stream using the FP-growth method. If a single prefix path appears in any conditional pattern base, and the frequencies of the node elements on the path are identical, mining of the combined frequent pattern subsets of those equally frequent node elements is stopped...
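The single-prefix-path shortcut described above can be illustrated as follows. This is a sketch of the pruning idea only, not the full FP-growth algorithm or the patent's code; the function name and the `(item, count)` path representation are assumptions.

```python
from itertools import combinations

def mine_single_prefix(path):
    """path: list of (item, count) pairs along a single prefix path of a
    conditional pattern base. If all counts are equal, stop enumerating the
    2^n - 1 item combinations and keep only the maximal itemset, since every
    subset shares the same support (sketch of the patent's pruning idea)."""
    counts = {c for _, c in path}
    if len(counts) == 1:
        items = tuple(i for i, _ in path)
        return [(items, counts.pop())]     # one maximal pattern instead of 2^n - 1
    # Otherwise fall back to full subset enumeration; a subset's support is
    # the minimum count among its elements on the path
    patterns = []
    for r in range(1, len(path) + 1):
        for combo in combinations(path, r):
            patterns.append((tuple(i for i, _ in combo),
                             min(c for _, c in combo)))
    return patterns

print(mine_single_prefix([("a", 3), ("b", 3)]))   # equal counts: one pattern
print(mine_single_prefix([("a", 3), ("b", 2)]))   # unequal: full enumeration
```

The pruning matters because a path of n equally frequent nodes would otherwise yield 2^n − 1 subsets, all with the same support count.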



Abstract

The invention relates to the field of big data mining in transparent computing technology, and discloses a transparent computing server cache optimization method and system based on an association pattern. The purpose is to reduce disk I/O overhead, improve the cache hit rate, and improve the quality of the transparent computing service. The method comprises the following steps: a support threshold is introduced to screen each batch of the data stream and construct an FP-tree; while mining the frequent patterns and support count information of the data stream, if a single prefix path occurs in any conditional pattern base and the frequencies of the node elements on the path are identical, mining of the combined frequent pattern subsets of those equally frequent node elements is stopped; based on the frequent patterns and support count information of the data stream, an FP-Stream structure is created and updated; and when any data block is read into the cache, the data blocks corresponding to the other frequent items of the frequent patterns in the FP-Stream structure associated with that block are read into the cache together.
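The final prefetching step can be sketched briefly. This is an illustration under assumptions: `assoc` stands in for a lookup over the mined FP-Stream frequent patterns, and the names are hypothetical, not the patent's interfaces.

```python
def read_with_prefetch(block, cache, assoc):
    """When `block` is read into the cache, also read the other frequent
    items of the mined patterns that contain it (illustrative sketch:
    `assoc` maps a block to the blocks it co-occurs with frequently)."""
    cache.add(block)
    for other in assoc.get(block, ()):
        cache.add(other)                 # prefetch the associated data blocks
    return cache

# Pattern {a, b, c} was mined as frequent, so reading "a" pulls in "b" and "c"
cache = read_with_prefetch("a", set(), {"a": ["b", "c"]})
print(sorted(cache))  # → ['a', 'b', 'c']
```

The intent is that a subsequent request for "b" or "c" hits the cache instead of the disk, which is where the claimed I/O reduction comes from.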

Description

Technical field

[0001] The invention relates to the field of big data mining in transparent computing technology, in particular to a method and system for optimizing the cache of a transparent computing server based on an association pattern.

Background technique

[0002] Ubiquitous computing is a computing model that has been widely studied and applied since the 21st century. It emphasizes providing users with timely and effective services through contextual awareness of the digital environment. Transparent computing is a new ubiquitous computing model. Its main idea is to separate computing and storage, that is, user private data, applications, and operating systems are stored on remote servers instead of local machines. In this mode of separation of computing and storage, all user data, applications and operating systems can be combined on demand like software resources.

[0003] The transparent service platform consists of a transparent client with a lightweight microkernel ...


Application Information

IPC(8): G06F12/0842, G06F12/0862
CPC: G06F12/0842, G06F12/0862
Inventor: 王斌, 陈琳, 李伟民, 盛津芳
Owner: CENT SOUTH UNIV