
Distributed cache architecture with task distribution function and cache method

A distributed cache and task distribution technology in the field of big data processing. It addresses the problem that traditional databases cannot cope with the processing of, and access to, large data volumes, and achieves availability, improved resource utilization, and automatic balancing of data partitions.

Inactive Publication Date: 2015-02-18
XIAN FUTURE INT INFORMATION CO LTD
Cites: 4 · Cited by: 23

AI Technical Summary

Problems solved by technology

[0015] The purpose of the present invention is to provide a distributed cache architecture with a task distribution function, which solves the problem that traditional databases cannot cope with the processing and access of large amounts of data.




Embodiment Construction

[0045] The present invention will be described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0046] The present invention is a distributed cache architecture with a task distribution function. As shown in Figure 1, it includes a relational database MySQL 2, a Redis distributed cache system 3, and a distributed task scheduling system 1, all connected to the client through the network.
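The patent gives no code for how the cache front-ends the database, but the division of labor it describes, a Redis cache in front of a MySQL store, is commonly realized as a cache-aside read path. A minimal sketch, using in-memory dictionaries as stand-ins for the Redis cache and the MySQL database (all names here are illustrative, not from the patent):

```python
# Cache-aside sketch: try the cache first, fall back to the backing
# store on a miss, then populate the cache for later reads.
cache = {}                       # stand-in for the Redis cache
database = {"user:1": "alice"}   # stand-in for the MySQL store

def read(key):
    if key in cache:             # cache hit: served from memory
        return cache[key], "hit"
    value = database.get(key)    # cache miss: go to the database
    if value is not None:
        cache[key] = value       # populate the cache for next time
    return value, "miss"

value, source = read("user:1")   # first read misses, fills the cache
value, source = read("user:1")   # second read is served by the cache
```

In a deployment matching the patent's architecture, the dictionary lookups would be replaced by Redis GET/SET and MySQL queries, but the hit/miss/populate control flow stays the same.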

[0047] As shown in Figure 2, the Redis distributed cache system 3 includes a cache management module 4, a distributed cache module 5, a data distribution module 6, a replacement algorithm module 7, a cache synchronization module 8, a cache communication module 9, and a reliability service module 10, connected in sequence.
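The text names a "replacement algorithm module" without specifying the policy. As one hedged illustration only (the patent does not say which policy is used), an LRU (least-recently-used) eviction policy, a common choice for such a module, can be sketched with Python's `collections.OrderedDict`; the class name and capacity are illustrative:

```python
from collections import OrderedDict

class LRUCache:
    """Illustrative replacement-algorithm module: evicts the
    least-recently-used entry once capacity is exceeded."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = OrderedDict()

    def get(self, key):
        if key not in self.entries:
            return None
        self.entries.move_to_end(key)         # mark as most recently used
        return self.entries[key]

    def put(self, key, value):
        if key in self.entries:
            self.entries.move_to_end(key)
        self.entries[key] = value
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)  # evict the LRU entry

lru = LRUCache(capacity=2)
lru.put("a", 1)
lru.put("b", 2)
lru.get("a")          # "a" is now most recently used
lru.put("c", 3)       # evicts "b", the least recently used
```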

[0048] The working principle of the cache architecture of the present invention is as follows: first, start the server end of the distributed task distribution system; second, the user writes a specific task processing module as a ...
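The paragraph above is truncated and describes the flow only in outline: the distribution server starts, users supply task processing modules, and tasks are then distributed with the fault tolerance the abstract claims. A hedged in-memory sketch of that distribute-and-retry idea; every name and the round-robin/retry scheme here is an assumption for illustration, not the patent's API:

```python
from collections import deque

def distribute(tasks, workers, max_retries=2):
    """Round-robin tasks across workers; requeue a task when a
    worker raises, giving up after max_retries extra attempts."""
    queue = deque((task, 0) for task in tasks)
    results, i = {}, 0
    while queue:
        task, attempts = queue.popleft()
        worker = workers[i % len(workers)]   # round-robin selection
        i += 1
        try:
            results[task] = worker(task)
        except Exception:
            if attempts < max_retries:       # fault tolerance: retry
                queue.append((task, attempts + 1))
    return results

calls = []
def flaky_square(x):
    calls.append(x)
    if len(calls) == 1:          # fail on the very first call only
        raise RuntimeError("transient failure")
    return x * x

out = distribute([2, 3], [flaky_square])     # task 2 fails once, then succeeds
```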



Abstract

The invention discloses a distributed cache architecture with a task distribution function, which comprises a distributed task scheduling system, a relational database MySQL, and a Redis distributed cache system. The dynamic distributed cache architecture solves the problem that a traditional database cannot cope with the processing of, and access to, large amounts of data. The invention further discloses a cache method adopting the distributed cache architecture, which achieves automatic task distribution and fault tolerance for scenarios that require computation, and guarantees reliability during processing.

Description

technical field

[0001] The invention belongs to the technical field of big data processing, and relates to a distributed cache architecture method with a task distribution function.

Background technique

[0002] With more and more unstructured data, the performance requirements on databases for data storage and processing are getting higher and higher, and the demand for read and write operations is also increasing. Traditional relational databases struggle to cope with massive data processing. For example, relational databases do not handle well the combination of small-volume data storage with high-speed access, massive data storage, distributed system support, data consistency assurance, and convenient addition or removal of cluster nodes.

[0003] In view of the above problems, a memory object caching system such as memcache is used in a typical application architecture. However, with the continuous increase of business volume and access volume, many problems w...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F17/30; H04L29/08
CPC: G06F16/2471; G06F16/24552; G06F16/284; H04L67/1097
Inventor: 王茜, 葛新, 李安颖, 史晨昱, 梁小江
Owner: XIAN FUTURE INT INFORMATION CO LTD