
Data storage method, data scheduling method, device and system

The invention relates to data access and data scheduling technology in the field of communication networks. It addresses the problems of a complicated data access process, cache nodes occupying excessive memory, and degraded data access performance, with the effect of simplifying the data access process, improving data processing speed, and improving data access performance.

Active Publication Date: 2014-07-02
CHINA MOBILE COMM GRP CO LTD

AI Technical Summary

Problems solved by technology

It can be seen from this that introducing virtual cache nodes makes the data access process more complicated and reduces data access performance.
[0006] 2. In a multi-client environment, the size of a single data item may differ across clients. As a result, even when each cache node caches the same number of data items, the total data size differs: some cache nodes occupy much more memory than others. The problem of uneven data distribution is therefore not only not avoided, it may even be amplified, which in turn greatly reduces the memory utilization of the cache server nodes, as illustrated in the sketch below.
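A minimal sketch of this second problem, with wholly illustrative item counts and value sizes (the patent gives no concrete numbers): two nodes hold the same number of cached items, yet their memory footprints differ by three orders of magnitude.

```python
# Illustration only: equal item counts per cache node, very unequal memory use.
# The client names, key formats and value sizes below are assumptions.

def bytes_cached(items: dict) -> int:
    """Total size of the cached values, in bytes."""
    return sum(len(v) for v in items.values())

# Node A caches 1000 small items written by one client (~100 B each),
# node B caches 1000 large items written by another client (~100 KB each).
node_a_items = {f"client-a:key{i}": b"x" * 100 for i in range(1000)}
node_b_items = {f"client-b:key{i}": b"x" * 100_000 for i in range(1000)}

print(len(node_a_items), bytes_cached(node_a_items))   # 1000 items, ~100 KB
print(len(node_b_items), bytes_cached(node_b_items))   # 1000 items, ~100 MB
```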

Method used



Examples


Detailed Description of the Embodiments

[0040] Aiming at the above-mentioned problems in the prior art, embodiments of the present invention provide a data access scheme and a data scheduling scheme. Embodiments of the present invention will be described in detail below in conjunction with the accompanying drawings.

[0041] Figure 1 shows the architecture of the distributed cache system provided by an embodiment of the present invention; the cache system supports data access requests from multiple clients. The system architecture may include a client device 11 (there may be several; only one is shown in the figure), a server 12, and at least two cache nodes 13. The client device 11 is used for data access; the server 12 monitors and schedules the data distribution of each cache node 13; each cache node 13 caches data in memory and responds to data access requests from the client device 11. These roles are sketched below.
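A rough sketch of the three roles in Figure 1, anticipating the hash-bucket-to-node mapping kept on the client that the abstract describes. All class and method names are illustrative assumptions; the patent does not prescribe these identifiers.

```python
import zlib


class CacheNode:
    """Caches data in memory and responds to client access requests (node 13)."""

    def __init__(self, node_id: str):
        self.node_id = node_id
        self.store = {}                       # key -> value, held in memory

    def memory_use_info(self) -> int:
        """Report the node's current memory use (rough byte estimate)."""
        return sum(len(v) for v in self.store.values())

    def get(self, key: str):
        return self.store.get(key)

    def put(self, key: str, value: bytes):
        self.store[key] = value


class Server:
    """Monitors and schedules the data distribution of each cache node (server 12)."""

    def __init__(self, cache_nodes):
        self.cache_nodes = cache_nodes

    def collect_memory_use(self) -> dict:
        return {n.node_id: n.memory_use_info() for n in self.cache_nodes}


class ClientDevice:
    """Performs data access via a local hash-bucket -> cache-node mapping (device 11)."""

    def __init__(self, bucket_to_node: dict):
        self.bucket_to_node = bucket_to_node  # bucket id (0..N-1) -> CacheNode

    def _bucket(self, key: str) -> int:
        return zlib.crc32(key.encode()) % len(self.bucket_to_node)

    def read(self, key: str):
        return self.bucket_to_node[self._bucket(key)].get(key)

    def write(self, key: str, value: bytes):
        self.bucket_to_node[self._bucket(key)].put(key, value)
```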

[0042] The distributed cache system of the embodiment of the present i...



Abstract

The invention discloses a data storage method, a data scheduling method, a device and a system. In the data scheduling method, the server of a distributed cache system obtains memory use information from each cache node. According to this memory use information, the server determines a source cache node that needs data migration, and then determines a hash bucket on the source cache node that needs to be migrated and a target cache node capable of accommodating the data in that hash bucket. The server sends a data scheduling instruction to the source cache node, instructing it to migrate the data in the hash bucket to the target cache node. The server also sends a mapping-relation update instruction to the client to which the migrated hash bucket belongs, instructing the client to update the mapping relation between hash buckets and cache nodes according to this data migration.
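The scheduling flow in the abstract can be sketched as a single round on the server. The per-node capacity threshold, the "largest bucket / least-loaded target" selection policy, and both message helpers are illustrative assumptions; the abstract does not fix these details.

```python
# One scheduling round on the server, following the abstract's steps:
# gather memory use, pick a source node, pick a bucket and a target, then
# instruct the source node to migrate and the bucket's client to remap.

def send_scheduling_instruction(source, bucket, target):
    """Stand-in for the data scheduling instruction sent to the source cache node."""
    print(f"migrate hash bucket {bucket}: {source} -> {target}")

def send_mapping_update(client, bucket, target):
    """Stand-in for the mapping-relation update instruction sent to the client."""
    print(f"client {client}: hash bucket {bucket} now maps to {target}")

def schedule_once(memory_use, bucket_sizes, bucket_owner, capacity):
    """memory_use:   {node: bytes in use}, reported by each cache node
    bucket_sizes: {node: {bucket: bytes cached for that bucket}}
    bucket_owner: {bucket: client the bucket belongs to}
    capacity:     per-node memory budget in bytes (assumed policy knob)"""
    # Step 1: the most loaded node above its budget needs data migration.
    source = max(memory_use, key=memory_use.get)
    if memory_use[source] <= capacity:
        return None                                    # nothing to do this round

    # Step 2: choose the source node's largest bucket and a target able to hold it.
    bucket, size = max(bucket_sizes[source].items(), key=lambda kv: kv[1])
    target = min((n for n in memory_use if n != source), key=memory_use.get)
    if memory_use[target] + size > capacity:
        return None                                    # no node can accommodate it

    # Steps 3 and 4: migrate the bucket's data, then update the client's mapping.
    send_scheduling_instruction(source, bucket, target)
    send_mapping_update(bucket_owner[bucket], bucket, target)
    return source, bucket, target

# Example round with made-up numbers: cache-1 is over budget, so its largest
# bucket (7, owned by client-a) is moved to cache-2 and client-a is remapped.
print(schedule_once(
    memory_use={"cache-1": 900, "cache-2": 200},
    bucket_sizes={"cache-1": {7: 500, 3: 400}, "cache-2": {1: 200}},
    bucket_owner={7: "client-a", 3: "client-b", 1: "client-a"},
    capacity=800,
))
```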

Description

Technical Field

[0001] The present invention relates to the technical field of communication networks, and in particular to a data access method, a data scheduling method, a device, and a system.

Background Technique

[0002] In the Web 2.0 era, most Internet applications store data in relational databases, and clients read data from those databases. However, as data volumes and data access volumes grow, a series of problems appear, such as increased database load, performance degradation, slow responses, and delays in displaying web pages. Against this background, memory-based cache servers emerged.

[0003] Existing data access schemes that support multiple clients are usually implemented with a consistent hashing algorithm. The consistent hashing algorithm realizes distributed storage of data by projecting the hash values of each cache node and of each data primary key onto a ring space. When judging which cache node a certain data item falls in...
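For reference, the ring projection described in [0003] can be sketched as follows. The md5-based hash, the node names, and the omission of virtual nodes are assumptions made for illustration only.

```python
# A bare consistent-hash ring: node names and data primary keys are projected
# onto a 2^32 ring space, and a key belongs to the first node point at or
# after its own point, wrapping around the ring.
import hashlib
from bisect import bisect_left

def ring_hash(key: str) -> int:
    """Project a cache-node name or data primary key onto the ring space 0 .. 2^32 - 1."""
    return int(hashlib.md5(key.encode()).hexdigest(), 16) % (2 ** 32)

class HashRing:
    def __init__(self, nodes):
        entries = sorted((ring_hash(n), n) for n in nodes)
        self.points = [p for p, _ in entries]   # sorted ring positions
        self.nodes = [n for _, n in entries]    # node owning each position

    def node_for(self, key: str) -> str:
        """Walk clockwise to the first node point at or after the key's point."""
        idx = bisect_left(self.points, ring_hash(key)) % len(self.points)
        return self.nodes[idx]

ring = HashRing(["cache-1", "cache-2", "cache-3"])
print(ring.node_for("user:42"))   # the cache node this primary key falls on
```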

Claims


Application Information

IPC(8): H04L29/08
Inventor: 梁智超, 钱岭, 周大, 孙少陵
Owner: CHINA MOBILE COMM GRP CO LTD