Distributed cache automatic management system and distributed cache automatic management method

A distributed cache automatic management technology, applied in transmission systems, electrical components, etc. It addresses problems such as complex cache management methods, potential program defects, and complex cache management protocols, with the effects of simplifying the cache management protocol, reducing the programmer's burden, and reducing the possibility of errors.

Inactive Publication Date: 2013-09-11
NEC (CHINA) CO LTD

AI Technical Summary

Problems solved by technology

On the one hand, this places an additional burden on the programmer; on the other hand, it can easily introduce potential program defects.
[0006] In addition, this relatively complex cache management method also adds complexity to the cache management protocol.




Detailed Description of Embodiments

[0025] Preferred embodiments of the present invention are described in detail below with reference to the accompanying drawings; details and functions unnecessary to the present invention are omitted from the description to avoid obscuring its understanding.

[0026] Figure 2 is a schematic diagram illustrating the distributed cache automatic management system 200 according to the present invention.

[0027] As shown in Figure 2, the distributed cache automatic management system 200 comprises three parts: a client (Client) 210, a master control terminal (Master) 220, and a cache server cluster (Cache Servers) 230. (For simplicity of description, Figure 2 shows only two cache servers, 230_1 and 230_2, but the present invention is not limited to a specific number of cache servers; any number of cache servers 230_1 to 230_N can be arranged as required.) The programs running on the client 210 are written by ap...
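The three-part architecture above can be sketched in code. This is a minimal, illustrative model only: the class names, and the hash-based placement used by the master, are assumptions for the sketch and are not specified by the patent.

```python
# Illustrative sketch of the three-part system: Client 210, Master 220,
# and a cluster of cache servers 230_1..230_N.
# The hash-based key placement is an assumed policy, not from the patent.

class CacheServer:
    """One member of the cache server cluster (230_i)."""
    def __init__(self, server_id):
        self.server_id = server_id
        self.store = {}

class Master:
    """Master control terminal: decides which server holds each key."""
    def __init__(self, servers):
        self.servers = servers

    def locate(self, key):
        # Simple placement: hash the key onto one of the N servers.
        return self.servers[hash(key) % len(self.servers)]

class Client:
    """Client-side programs read and write via the master's placement."""
    def __init__(self, master):
        self.master = master

    def put(self, key, value):
        self.master.locate(key).store[key] = value

    def get(self, key):
        return self.master.locate(key).store.get(key)

servers = [CacheServer(i) for i in range(2)]  # any number 1..N is allowed
client = Client(Master(servers))
client.put("page:1", "cached html")
assert client.get("page:1") == "cached html"
```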



Abstract

The invention provides a distributed cache automatic management scheme for data-parallel computing. The lifetime of a data set object on the client side is bound to the lifetime of the corresponding distributed data set: when the data set object is created or destroyed on the client side, the corresponding distributed data set is created or destroyed accordingly in the cache server cluster. This lightens the programmer's burden, lowers the possibility of errors, and simplifies the cache management protocol.
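The lifetime-binding idea in the abstract can be illustrated with a short sketch. All names here (`CacheCluster`, `DistributedDataset`) are hypothetical stand-ins, and the context-manager mechanism is one possible way to tie the remote data set's lifetime to the client object's; the patent does not prescribe this implementation.

```python
# Hypothetical sketch: the client-side data set object's lifetime drives
# creation and destruction of the distributed data set in the cluster.

class CacheCluster:
    """Stands in for the cache server cluster; tracks live data sets."""
    def __init__(self):
        self.datasets = {}

    def create(self, name):
        self.datasets[name] = {}       # allocate the distributed data set

    def destroy(self, name):
        self.datasets.pop(name, None)  # reclaim it

class DistributedDataset:
    """Client-side object; creating it creates the remote data set,
    destroying it destroys the remote data set."""
    def __init__(self, cluster, name):
        self.cluster, self.name = cluster, name
        cluster.create(name)           # bound creation

    def __enter__(self):
        return self

    def __exit__(self, *exc):
        self.cluster.destroy(self.name)  # bound destruction on scope exit

cluster = CacheCluster()
with DistributedDataset(cluster, "web_logs"):
    assert "web_logs" in cluster.datasets      # alive while the object lives
assert "web_logs" not in cluster.datasets      # gone when the object is gone
```

Because the programmer never issues explicit create/destroy calls to the cluster, the class of bugs where the two lifetimes drift apart is avoided, which is the stated benefit of the scheme.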

Description

technical field

[0001] The present invention relates to distributed cache automatic management, and in particular to a distributed cache automatic management scheme for data-parallel computing.

background technique

[0002] With the rapid development of the Internet, Internet data has grown explosively. Analyzing, processing, and mining these data is of great significance to Internet service providers and to traditional industries in related fields. However, because of the huge scale of these data, processing them effectively is a great challenge.

[0003] To process huge volumes of web data, a parallel computing platform named "MAP-REDUCE" was developed. The MAP-REDUCE platform can efficiently handle data-parallel computing services. After the MAP-REDUCE system was widely adopted and achieved great success, HADOOP, an open-source computing platform based on the design of MAP-REDUCE, was released and quickly achieved ...

Claims


Application Information

IPC IPC(8): H04L29/08
Inventor 黄权, 罗彦林
Owner NEC (CHINA) CO LTD