Distributed type dynamic cache expanding method and system supporting load balancing

A distributed cache and load balancing technology, applied in transmission systems, digital transmission systems, electrical components, etc. It addresses the overhead of data migration during dynamic scaling, and achieves the effects of reducing network overhead, improving performance, and shortening response time.

Inactive Publication Date: 2011-11-16
济南君安泰投资集团有限公司


Problems solved by technology

Aiming at the problem of data balancing during dynamic scaling, the method of the present invention takes into account the impact of hot data on system availability and proposes a load balancing method suitable for heterogeneous environments.
Aiming at the problem of guaranteeing data consistency during dynamic scaling, the present invention implements a data access protocol based on three-stage requests. At the same time, to eliminate as far as possible the impact of data migration on system availability, the present invention adopts a controlled data migration method that effectively controls migration progress and reduces migration overhead.
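The summary names "controlled" migration without detailing it here. As a loose sketch of the idea of bounding migration progress, the batched move loop below transfers entries a few at a time so migration traffic can be throttled between rounds; the class name, batch-size parameter, and in-memory maps are illustrative assumptions, not the patent's implementation:

```java
import java.util.*;

/** Hypothetical sketch of controlled data migration: entries are moved in
 *  small batches so migration traffic can be bounded. Batch size and the
 *  per-round yield point are illustrative, not taken from the patent. */
public class ThrottledMigration {
    public static int migrate(Map<String, String> source, Map<String, String> target,
                              int batchSize) {
        int rounds = 0;
        Iterator<Map.Entry<String, String>> it = source.entrySet().iterator();
        while (it.hasNext()) {
            // Move at most batchSize entries per round; a real system would
            // sleep here or wait for a go-ahead from the cluster manager.
            for (int i = 0; i < batchSize && it.hasNext(); i++) {
                Map.Entry<String, String> e = it.next();
                target.put(e.getKey(), e.getValue());
                it.remove();
            }
            rounds++;
        }
        return rounds; // number of throttled rounds the migration took
    }
}
```

Pausing between rounds is what lets an administrator slow or suspend migration without losing already-moved entries.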


Embodiment Construction

[0052] The present invention will be further described below in conjunction with specific embodiments and accompanying drawings.

[0053] The entire distributed cache system consists of three parts: the cache server (Cache Server), the cache client (Cache Client) and the cache cluster manager (Cache Admin), connected to one another through the network. Each cache server runs independently and is uniformly monitored and managed by the cache cluster manager through a management agent. The management agent is located on the same physical node as the cache server and is responsible for generating JMX management MBeans. After receiving a control command from the cache cluster manager, the management agent adapts the command and directs the cache service process to perform the corresponding operation.
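As a minimal illustration of how a management agent can expose a cache service process through standard JMX MBeans, the sketch below registers a hypothetical `CacheServerControl` bean that a cluster manager could then query and invoke; the bean name, attribute, and operation are assumptions for illustration, not the patent's actual management interface:

```java
import java.lang.management.ManagementFactory;
import javax.management.MBeanServer;
import javax.management.ObjectName;

// Standard MBean pattern: the interface name must end in "MBean".
interface CacheServerControlMBean {
    long getHitCount();
    void flush();
}

public class CacheServerControl implements CacheServerControlMBean {
    private long hitCount = 42; // placeholder metric for the sketch
    public long getHitCount() { return hitCount; }
    public void flush() { hitCount = 0; }

    public static void main(String[] args) throws Exception {
        // Agent side: register the bean with the platform MBean server.
        MBeanServer mbs = ManagementFactory.getPlatformMBeanServer();
        ObjectName name = new ObjectName("cache:type=CacheServerControl");
        mbs.registerMBean(new CacheServerControl(), name);

        // Manager side: read an attribute, then invoke a control operation.
        System.out.println(mbs.getAttribute(name, "HitCount")); // prints 42
        mbs.invoke(name, "flush", null, null);
        System.out.println(mbs.getAttribute(name, "HitCount")); // prints 0
    }
}
```

In the patent's architecture the manager would reach this bean remotely (e.g. via a JMX connector) rather than in-process as shown here.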

[0054] In the cache cluster manager, the topology monitor uses JGroups-based multicast technology to monitor the topology changes of server nodes, and obtains the performance...
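When a membership view changes, the monitor must work out which nodes joined and which left. The sketch below shows only that view-diff step in plain Java; in the patent's system the successive views would be delivered by JGroups multicast, which is omitted here as an external dependency:

```java
import java.util.*;

/** Illustrative view-diff step of a topology monitor: compare the previous
 *  and current cluster membership views and report joined/left nodes. */
public class TopologyMonitor {
    public static Map<String, Set<String>> diff(Set<String> oldView, Set<String> newView) {
        Set<String> joined = new TreeSet<>(newView);
        joined.removeAll(oldView);           // present now, absent before
        Set<String> left = new TreeSet<>(oldView);
        left.removeAll(newView);             // present before, gone now
        Map<String, Set<String>> result = new LinkedHashMap<>();
        result.put("joined", joined);
        result.put("left", left);
        return result;
    }
}
```

A "joined" node would trigger data redistribution toward it, while a "left" node would trigger recovery of the data it was responsible for.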


Abstract

The invention discloses a distributed dynamic cache scaling method and system supporting load balancing, belonging to the technical field of software. The method comprises the steps of: 1) each cache server monitoring its own resource utilization at regular intervals; 2) each cache server calculating its weighted load value Li from the currently monitored resource utilization and sending Li to the cache cluster manager; 3) the cache cluster manager calculating the current average load value of the distributed cache system from the weighted load values Li, executing an expansion operation when the current average load value is higher than a set threshold thremax, and executing a shrink operation when it is lower than a set threshold thremin. The system comprises the cache servers, a cache client and the cache cluster manager, with the cache servers connected to the cache client and the cache cluster manager through the network. The invention ensures the uniform distribution of network traffic among the cache nodes, optimizes the utilization of system resources, and addresses the problems of guaranteeing data consistency and continuous service availability.
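The decision rule in steps 1)–3) can be sketched directly. The abstract does not specify how the weighted load value Li is formed, so the CPU/memory/network weights below are an assumption for illustration; the threshold comparison against thremax and thremin follows the abstract:

```java
import java.util.*;

/** Sketch of the abstract's scaling decision. The particular weighting of
 *  CPU, memory and network utilization into Li is assumed, not specified
 *  by the patent abstract. */
public class ScalingDecision {
    // Hypothetical weights; chosen to sum to 1.
    static final double W_CPU = 0.5, W_MEM = 0.3, W_NET = 0.2;

    /** Weighted load value Li of one cache server (all inputs in [0,1]). */
    static double weightedLoad(double cpu, double mem, double net) {
        return W_CPU * cpu + W_MEM * mem + W_NET * net;
    }

    /** Cluster manager's rule: average the Li values, compare to thresholds. */
    static String decide(List<Double> loads, double threMin, double threMax) {
        double avg = loads.stream().mapToDouble(Double::doubleValue).average().orElse(0.0);
        if (avg > threMax) return "expand";  // average load too high
        if (avg < threMin) return "shrink";  // average load too low
        return "hold";
    }
}
```

Keeping thremin well below thremax gives the usual hysteresis band, so the cluster does not oscillate between expanding and shrinking.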

Description

technical field [0001] The invention relates to a distributed cache dynamic scaling method and system, in particular to a distributed cache dynamic scaling method and system supporting load balancing, and belongs to the field of software technology. Background technique [0002] In the cloud computing environment, the number of users and the volume of network traffic have grown explosively. How to provide good support for large-capacity, business-critical transaction processing applications on top of cheap, standardized hardware and software platforms has become a problem faced by many enterprises. The server-side bottleneck usually occurs in the database; to relieve it, distributed caching technology is introduced. A distributed cache shortens the distance between clustered object data and applications, and is a key technology for accelerating data access and providing distributed data sharing. This technology plays a very important rol...


Application Information

IPC(8): H04L29/08, H04L12/56, H04L29/06
Inventors: 黄涛, 秦秀磊, 张文博, 魏峻, 钟华, 朱鑫
Owner: 济南君安泰投资集团有限公司