
Data processing method and device used for distributed cache system

A distributed-cache and partial-data-migration technology, applied in the field of memory address allocation/relocation, which solves the problems of unequal and unbalanced data volumes across cache servers and low memory utilization in a distributed cache system, with the effect of improving memory utilization and balancing data volume.

Active Publication Date: 2015-12-23
BEIJING BAIDU NETCOM SCI & TECH CO LTD

AI Technical Summary

Problems solved by technology

[0003] However, in an environment with multiple proxy servers, where each proxy server is connected to at least one cache server, the amount of data held in the memory of different cache servers differs because the number of cache servers connected to each proxy server differs. As a result, the memory usage of some cache servers may approach their total memory capacity while the memory usage of others remains relatively small, resulting in low memory utilization for the distributed cache system as a whole.



Examples


Detailed Description of the Embodiments

[0026] The present application will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are intended only to explain the related invention, not to limit it. It should also be noted that, for convenience of description, only the parts related to the invention are shown in the drawings.

[0027] It should be noted that, where no conflict arises, the embodiments in the present application and the features in those embodiments may be combined with each other. The present application will be described in detail below with reference to the accompanying drawings and embodiments.

[0028] Figure 1 shows an exemplary system architecture 100 to which embodiments of the data processing method or data processing apparatus of the present application can be applied.

[0029] As shown in Figure 1, the system architecture 100 may include terminal dev...



Abstract

The invention discloses a data processing method and device for a distributed cache system. The data processing method specifically includes the following steps: the currently occupied memory capacity and total memory capacity of each cache server in the distributed cache system are obtained; according to the currently occupied memory capacity and total memory capacity, data balance processing is performed on the cache servers. This processing includes the steps that, based on the currently occupied memory capacity and total memory capacity, the cache servers in the distributed cache system are divided into a high-load server set and a low-load server set; one cache server in the high-load server set is selected as a source cache server, one cache server in the low-load server set is selected as a target cache server, and part of the data in the source cache server is transferred to the target cache server. With this implementation, the memory utilization rate of the distributed cache system is increased.
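
The balancing steps described in the abstract can be illustrated with a short sketch. The threshold rule (average utilization as the dividing line), the choice of the most and least loaded servers, the amount of data moved, and the helper migrate_partial_data are assumptions introduced only for illustration, not details taken from the patent.

def balance_cache_servers(servers, migrate_partial_data):
    """servers: list of dicts with 'name', 'used' and 'total' memory in bytes."""
    # Step 1: obtain the currently occupied and total memory capacity of every
    # cache server and compute the average utilization (assumed dividing line).
    avg_util = sum(s['used'] / s['total'] for s in servers) / len(servers)

    # Step 2: divide the servers into a high-load set and a low-load set.
    high_load = [s for s in servers if s['used'] / s['total'] > avg_util]
    low_load = [s for s in servers if s['used'] / s['total'] <= avg_util]
    if not high_load or not low_load:
        return  # already balanced under this rule

    # Step 3: select one source server from the high-load set and one target
    # server from the low-load set (here: the most and the least loaded ones).
    source = max(high_load, key=lambda s: s['used'] / s['total'])
    target = min(low_load, key=lambda s: s['used'] / s['total'])

    # Step 4: transfer part of the data from the source to the target.
    # How much to move is an assumption: enough to bring the source down
    # to the average utilization.
    bytes_to_move = int((source['used'] / source['total'] - avg_util) * source['total'])
    migrate_partial_data(source['name'], target['name'], bytes_to_move)

In practice the migration callback would ask the source cache server to hand selected data to the target and then update the metadata management server so that subsequent lookups are routed to the new location.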

Description

Technical Field

[0001] The present application relates to the field of computer technology, in particular to the field of distributed cache technology, and more specifically to a data processing method and device for a distributed cache system.

Background

[0002] At present, with the continuous expansion of the storage scale of cached (Cache) data in various industries, multiple cache servers (CacheServer) are usually used to store data in order to provide larger data capacity, and proxy servers (Proxy) distribute the data to the individual cache servers according to a predetermined hash algorithm. In addition, a metadata management server is deployed to manage the storage location information of this data.

[0003] However, in an environment with multiple proxy servers, where each proxy server is connected to at least one cache server, the amount of data in the memory of different cache servers differs due to the different numbers of cache servers...
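
For concreteness, the proxy-side distribution described in paragraph [0002] can be sketched as follows. The modulo-hash rule, the class and method names, and the server addresses are assumptions used only for illustration; the patent merely states that a predetermined hash algorithm is used.

import hashlib

class Proxy:
    def __init__(self, cache_servers):
        # Addresses of the cache servers attached to this proxy server.
        self.cache_servers = cache_servers

    def server_for_key(self, key: str) -> str:
        # Predetermined hash algorithm (assumed here to be MD5 modulo the
        # number of attached servers): the key is hashed and mapped to one
        # of the cache servers attached to this proxy.
        digest = hashlib.md5(key.encode("utf-8")).hexdigest()
        index = int(digest, 16) % len(self.cache_servers)
        return self.cache_servers[index]

# Two proxies with different numbers of attached cache servers: every key
# handled by proxy_b lands on the single server cache-4, which is the kind
# of imbalance described in paragraph [0003].
proxy_a = Proxy(["cache-1:11211", "cache-2:11211", "cache-3:11211"])
proxy_b = Proxy(["cache-4:11211"])
print(proxy_a.server_for_key("user:42"), proxy_b.server_for_key("user:42"))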


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F12/08
Inventor: 张东阳
Owner: BEIJING BAIDU NETCOM SCI & TECH CO LTD