
Cache data sharing method and equipment

A cache data sharing technology, applied in the computer field, which can solve problems such as increased processing overhead, increased server load, and increased delay, and achieves the effects of reducing the number of connections to the server, saving cache and computing resources, and reducing server load.

Active Publication Date: 2017-02-22
ALIBABA GRP HLDG LTD

AI Technical Summary

Problems solved by technology

[0004] Therefore, the data subscription behavior also grows with the number of processes, which leads to the following problems: a) Since each process's subscription behavior is attached to the independent connection that the process establishes to the server, the number of connections rises as the number of processes rises, increasing the load on the server and posing a serious challenge to server performance.
b) As the server load grows, the delay in obtaining feedback for a single subscription increases, which in turn harms the timeliness with which a process obtains the latest content of the subscribed data.
c) Since each process holds an independent cache, an increase in the number of subscriptions on a single computing node means greater consumption of cache resources and more operations for processing subscription feedback, which in turn consumes the computing resources of the computing node.
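
As a rough illustration of this baseline (all names, message formats, and APIs below are assumptions for clarity, not taken from the patent), each worker process would own its own server connection, its own cache, and its own subscriptions, so connections and cache consumption grow linearly with the process count:

```python
# Hypothetical illustration of the baseline criticized above: every process on a
# computing node opens its own connection to the server, keeps its own cache,
# and registers its own subscriptions. Names and the wire format are assumptions.
import json
import socket


class PerProcessClient:
    """One instance lives inside each worker process."""

    def __init__(self, server_addr):
        # Each process establishes an independent connection, so N processes on
        # a node mean N connections and N subscription streams at the server.
        self.conn = socket.create_connection(server_addr)
        self.reader = self.conn.makefile("r")
        self.cache = {}            # independent per-process cache
        self.subscriptions = set()

    def subscribe(self, data_file):
        # Subscription feedback for this file arrives only over this
        # process's own connection and must be processed by this process.
        self.subscriptions.add(data_file)
        self._send({"op": "subscribe", "file": data_file})

    def read(self, data_file):
        # A miss fetches the data file from the server and caches it locally;
        # other processes asking for the same file fetch and cache it again.
        if data_file not in self.cache:
            self._send({"op": "get", "file": data_file})
            self.cache[data_file] = self.reader.readline().strip()
        return self.cache[data_file]

    def _send(self, msg):
        self.conn.sendall((json.dumps(msg) + "\n").encode())
```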

Method used




Embodiment Construction

[0062] The present application will be described in further detail below in conjunction with the accompanying drawings.

[0063] In a typical configuration of this application, the terminal, the devices of the service network, and the trusted party each include one or more processors (CPUs), input/output interfaces, network interfaces, and memory.

[0064] The memory may include non-permanent storage in computer-readable media, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.

[0065] Computer-readable media include permanent and non-permanent, removable and non-removable media, and can implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random ...



Abstract

The objective of the invention is to provide a cache data sharing method and equipment. Specifically, a data access request for a data file, sent by an agented process, is acquired at the cache agent node side, and the data file is returned to the agented process according to the data access request. Processes on one or more computing nodes are managed through the cache agent node, and the data files that the agented processes need are kept in the cache of the cache agent node, so the processes on the computing nodes do not need to maintain independent cache spaces; a given data file in the cache of the cache agent node can be shared by multiple processes, saving cache and computing resources. Meanwhile, an agented process does not need to establish a connection with the server directly: only one connection to the server is established for the multiple processes managed by the same cache agent node, so the large number of connections to the server caused by subscription behavior is reduced and the load on the server is lowered.
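
A minimal sketch of the scheme described in the abstract, again with assumed names, transports, and message formats (the patent does not prescribe these): the cache agent node holds the single shared connection to the server and the shared cache, while agented processes send their data access requests to it over local IPC:

```python
# Hypothetical sketch of the cache agent node described in the abstract.
# One shared connection to the server and one shared cache serve all agented
# processes on the node; identifiers and wire formats are illustrative only.
import json
import socket
import socketserver
import threading


class CacheAgentNode:
    def __init__(self, server_addr, local_addr=("127.0.0.1", 9500)):
        # A single connection to the server is shared by every agented process.
        self.server_conn = socket.create_connection(server_addr)
        self.server_reader = self.server_conn.makefile("r")
        self.cache = {}                 # shared cache of data files
        self.lock = threading.Lock()
        agent = self

        class Handler(socketserver.StreamRequestHandler):
            # Each agented process sends its data access requests over local
            # IPC instead of opening its own connection to the remote server.
            def handle(self):
                for line in self.rfile:
                    request = json.loads(line)
                    data = agent.get(request["file"])
                    reply = json.dumps({"file": request["file"], "data": data})
                    self.wfile.write((reply + "\n").encode())

        self.listener = socketserver.ThreadingTCPServer(local_addr, Handler)

    def get(self, data_file):
        # The lookup is serialized for simplicity: a hit is served from the
        # shared cache, a miss is fetched once over the shared connection and
        # then reused by every agented process that asks for the same file.
        with self.lock:
            if data_file not in self.cache:
                self._send({"op": "get", "file": data_file})
                self.cache[data_file] = self.server_reader.readline().strip()
            return self.cache[data_file]

    def _send(self, msg):
        self.server_conn.sendall((json.dumps(msg) + "\n").encode())

    def serve_forever(self):
        self.listener.serve_forever()
```

With this arrangement, N agented processes requesting the same data file produce one server fetch and one cached copy instead of N of each, and the node presents a single connection to the server regardless of how many processes it hosts.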

Description

Technical field

[0001] This application relates to the computer field, and in particular to a method and device for sharing cached data.

Background technique

[0002] In large-scale distributed computing systems, in order to speed up access to server-side data, processes on computing nodes usually introduce caches to manage frequently accessed data. In a distributed scenario, server-side data may change at any time, so to ensure the timeliness of the cache maintained by each process, the cache managed by the process needs to be updated from time to time. However, whether each computing node regularly pulls (Pull) data from the server or the server occasionally pushes (Push) data to the caches of each computing node's processes, this brings access pressure to the distributed computing system.

[0003] In the distributed coordination system, the process on the computing node can use "cache + subscription" to obtain the latest content of the subscribed data file, and t...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): H04L29/08
CPCH04L67/1001H04L67/568H04L65/40
Inventor 朱云锋成柱石陶云峰
Owner ALIBABA GRP HLDG LTD