Distributed cache dynamic scaling method and system supporting load balancing

A distributed cache and load balancing technology, applied in transmission systems, digital transmission systems, electrical components, etc. It addresses the problem of data migration overhead and achieves the effects of reducing network overhead, improving performance, and shortening response time.

Publication Date: 2013-09-18 (Inactive)
Assignee: 济南君安泰投资集团有限公司

AI Technical Summary

Problems solved by technology

Aiming at the problem of data balancing during dynamic scaling, the method of the present invention takes into account the impact of hot data on system availability and proposes a load balancing method suitable for heterogeneous environments.
Aiming at the problem of guaranteeing data consistency during dynamic scaling, the present invention implements a data access protocol based on three-stage requests. At the same time, in order to eliminate as much as possible the impact of data migration on system availability, the present invention adopts a controlled data migration method that effectively controls the migration progress and reduces migration overhead.
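
As a concrete illustration of the controlled-migration idea, the minimal sketch below rate-limits how many entries are copied per second so that migration does not crowd out normal cache traffic. The class and parameter names (MigrationThrottler, maxEntriesPerSecond, TargetNode) are assumptions for illustration and are not taken from the patent; the three-stage request protocol itself is not shown.

```java
// Illustrative sketch only: a rate-limited ("controlled") migration loop.
// Names such as MigrationThrottler and maxEntriesPerSecond are assumptions.
import java.util.Iterator;
import java.util.Map;

public class MigrationThrottler {
    private final int maxEntriesPerSecond;   // assumed migration speed cap

    public MigrationThrottler(int maxEntriesPerSecond) {
        this.maxEntriesPerSecond = maxEntriesPerSecond;
    }

    /** Copies entries to the target node in small batches, sleeping between
     *  batches so that normal cache traffic keeps priority. */
    public void migrate(Map<String, byte[]> sourcePartition,
                        TargetNode target) throws InterruptedException {
        int sentThisSecond = 0;
        long windowStart = System.currentTimeMillis();
        Iterator<Map.Entry<String, byte[]>> it = sourcePartition.entrySet().iterator();
        while (it.hasNext()) {
            Map.Entry<String, byte[]> e = it.next();
            target.put(e.getKey(), e.getValue());
            it.remove();
            if (++sentThisSecond >= maxEntriesPerSecond) {
                long elapsed = System.currentTimeMillis() - windowStart;
                if (elapsed < 1000) Thread.sleep(1000 - elapsed);
                sentThisSecond = 0;
                windowStart = System.currentTimeMillis();
            }
        }
    }

    /** Minimal stand-in for the remote cache node receiving migrated entries. */
    public interface TargetNode {
        void put(String key, byte[] value);
    }
}
```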




Embodiment Construction

[0052] The present invention will be further described below in conjunction with specific embodiments and accompanying drawings.

[0053] The entire distributed cache system consists of three parts: the cache server (Cache Server), the cache client (Cache Client), and the cache cluster manager (Cache Admin), which are connected through the network. Each cache server runs independently and is uniformly monitored and managed by the cache cluster manager through a management agent. The management agent is located on the same physical node as the cache server and is responsible for generating the JMX management MBeans. After receiving a control command from the cache cluster manager, the management agent adapts the command and directs the cache service process to perform the corresponding operation.
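
For orientation, a minimal sketch of how a management agent might expose the cache service process through a JMX MBean follows. The interface name CacheServerControlMBean, its operations, and the ObjectName are illustrative assumptions, not the patent's actual management interface.

```java
// Illustrative sketch: a management agent registering a standard JMX MBean
// so the cache cluster manager can send control commands to this node.
import java.lang.management.ManagementFactory;
import javax.management.MBeanServer;
import javax.management.ObjectName;

public class ManagementAgentDemo {

    // Standard MBean convention: interface name = implementation name + "MBean".
    public interface CacheServerControlMBean {
        long getUsedMemoryBytes();
        void startMigration(String targetNode);   // invoked by the cluster manager
        void shutdown();
    }

    public static class CacheServerControl implements CacheServerControlMBean {
        public long getUsedMemoryBytes() {
            Runtime rt = Runtime.getRuntime();
            return rt.totalMemory() - rt.freeMemory();
        }
        public void startMigration(String targetNode) {
            System.out.println("adapting command: migrate partitions to " + targetNode);
        }
        public void shutdown() {
            System.out.println("stopping cache service process");
        }
    }

    public static void main(String[] args) throws Exception {
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();
        ObjectName name = new ObjectName("cache:type=CacheServerControl,node=node1");
        server.registerMBean(new CacheServerControl(), name);
        System.out.println("MBean registered; cluster manager can now send control commands");
        Thread.sleep(Long.MAX_VALUE);   // keep the JVM alive for JMX clients
    }
}
```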

[0054] In the cache cluster manager, the topology monitor uses JGroups-based multicast to monitor topology changes of the server nodes, and obtains the performance...
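
A minimal sketch of JGroups-based topology monitoring, the mechanism named above, is shown here. It assumes JGroups 3.x/4.x on the classpath; the cluster name "cache-cluster" is an assumption for illustration.

```java
// Illustrative sketch: reacting to cluster membership (view) changes with JGroups.
import org.jgroups.JChannel;
import org.jgroups.ReceiverAdapter;
import org.jgroups.View;

public class TopologyMonitor {
    public static void main(String[] args) throws Exception {
        JChannel channel = new JChannel();          // default UDP/multicast stack
        channel.setReceiver(new ReceiverAdapter() {
            @Override
            public void viewAccepted(View newView) {
                // Called whenever a cache server joins or leaves the cluster.
                System.out.println("cluster view changed: " + newView.getMembers());
            }
        });
        channel.connect("cache-cluster");
        Thread.sleep(Long.MAX_VALUE);               // keep monitoring
    }
}
```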



Abstract

The invention discloses a distributed cache dynamic scaling method and system supporting load balancing, belonging to the technical field of software. The method is as follows: 1) each cache server regularly monitors its own resource utilization; 2) each cache server calculates its own weighted load value Li from the currently monitored resource utilization and sends it to the cache cluster manager; 3) the cache cluster manager calculates the current average load value of the distributed cache system from the weighted load values Li; when it is higher than the set threshold thrmax, an expansion operation is performed, and when it is lower than the set threshold thrmin, a contraction operation is performed. The system includes a cache server, a cache client and a cache cluster manager; the cache server is connected to the cache client and to the cache cluster manager through the network. The invention guarantees the balanced distribution of network traffic among the cache nodes, optimizes the utilization of system resources, and solves the problems of guaranteeing data consistency and continuous service availability.
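
The scaling decision in steps 1)-3) can be sketched as follows. The utilization weights and the threshold values used here are illustrative assumptions; the abstract does not fix concrete numbers.

```java
// Minimal sketch of the expand/contract decision driven by weighted load values Li.
// The weights (0.4/0.4/0.2) and thresholds passed in main() are assumptions.
import java.util.List;

public class ScalingDecision {

    /** Weighted load Li from a node's monitored utilization ratios (0..1). */
    static double weightedLoad(double cpu, double memory, double network) {
        final double wCpu = 0.4, wMem = 0.4, wNet = 0.2;   // assumed weights
        return wCpu * cpu + wMem * memory + wNet * network;
    }

    /** Returns +1 to expand, -1 to contract, 0 to keep the current cluster size. */
    static int decide(List<Double> nodeLoads, double thrMax, double thrMin) {
        double avg = nodeLoads.stream().mapToDouble(Double::doubleValue).average().orElse(0.0);
        if (avg > thrMax) return +1;   // add a cache server
        if (avg < thrMin) return -1;   // remove a cache server
        return 0;
    }

    public static void main(String[] args) {
        List<Double> loads = List.of(
                weightedLoad(0.90, 0.80, 0.70),
                weightedLoad(0.85, 0.90, 0.60));
        System.out.println(decide(loads, 0.75, 0.25));   // prints 1: expand
    }
}
```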

Description

Technical field

[0001] The invention relates to a distributed cache dynamic scaling method and system, in particular to a distributed cache dynamic scaling method and system supporting load balancing, and belongs to the field of software technology.

Background technique

[0002] In the cloud computing environment, the number of users and the volume of network traffic have grown explosively. How to support large-capacity, business-critical transaction processing applications on top of cheap, standardized hardware and software platforms has become a problem faced by many enterprises. The server-side bottleneck usually occurs in the database, and distributed caching technology is introduced to relieve it. A distributed cache shortens the distance between clustered object data and applications and is a key technology for accelerating data access and providing distributed data sharing. This technology plays a very important role...


Application Information

Patent Type & Authority Patents(China)
IPC IPC(8): H04L29/08H04L12/757H04L29/06
Inventor 黄涛秦秀磊张文博魏峻钟华朱鑫
Owner 济南君安泰投资集团有限公司