
A cache implementation method for interface calls

A technique for interface calling, applicable to memory systems, instruments, and electronic digital data processing. It addresses cache-server performance problems and waste of system resources for interfaces with a low update frequency, with the effects of reducing network transmission overhead, reducing pressure on the centralized cache, and avoiding waste of system resources.

Active Publication Date: 2016-09-21
重庆天极云服科技有限公司

AI Technical Summary

Problems solved by technology

However, in a centralized caching solution, every call must reach the cache server over the network. With a particularly large number of calls, this causes network congestion and can even degrade the performance of the cache server itself.
At the same time, for interfaces that are not updated in real time (that is, whose update frequency is low), such as basic-data interfaces, this per-call network overhead is unnecessary and therefore wastes system resources.

Method used




Description of the Embodiments

[0015] To make the purpose, technical solution, and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings and specific embodiments.

[0016] The core idea of the present invention is to effectively combine the centralized cache and the stand-alone cache: it ensures the data consistency of the stand-alone cache while making full use of it, thereby effectively reducing the pressure on the centralized cache, reducing network transmission overhead, and avoiding waste of system resources.

[0017] Figure 1 is a schematic flow chart of Embodiment 1 of the present invention. As shown in Figure 1, this embodiment mainly includes:

[0018] Step 101: when the caller application program executes an interface call to an external application program, it queries the Ehcache cache of the stand-alone machine where it is located and obtains the versio...
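The local-first, version-checked lookup that Step 101 begins can be sketched as follows. This is a minimal illustration only: `ConcurrentHashMap` stands in both for the stand-alone Ehcache cache and for the centralized cache that would normally be reached over the network, and all class and method names are assumptions, not taken from the patent.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative sketch: serve an interface-call result from the local
// (stand-alone) cache when its stored version number still matches the
// centralized cache's version; otherwise refresh from the centralized cache.
public class VersionedInterfaceCache {
    // local stand-alone cache: interface key -> cached call result
    private final Map<String, String> localResults = new ConcurrentHashMap<>();
    // local copy of each interface's version number
    private final Map<String, Long> localVersions = new ConcurrentHashMap<>();
    // stand-ins for the centralized cache (normally a network service)
    private final Map<String, String> centralResults = new ConcurrentHashMap<>();
    private final Map<String, Long> centralVersions = new ConcurrentHashMap<>();

    public String call(String interfaceKey) {
        Long localVer = localVersions.get(interfaceKey);
        Long centralVer = centralVersions.get(interfaceKey);
        // Local hit with a matching version: serve from the stand-alone
        // cache and skip the (simulated) network round trip for the result.
        if (localVer != null && localVer.equals(centralVer)
                && localResults.containsKey(interfaceKey)) {
            return localResults.get(interfaceKey);
        }
        // Version mismatch or local miss: fetch from the centralized cache
        // and refresh the local copy along with its version number.
        String result = centralResults.get(interfaceKey);
        if (result != null && centralVer != null) {
            localResults.put(interfaceKey, result);
            localVersions.put(interfaceKey, centralVer);
        }
        return result;
    }

    // Simulates an updated result being published to the centralized cache.
    public void publish(String key, String value, long version) {
        centralResults.put(key, value);
        centralVersions.put(key, version);
    }
}
```

In a real deployment the version comparison would not require a synchronous network read on every call; the patent's "timeliness monitoring" suggests the local version information is kept fresh separately, so the per-call path can stay entirely local for low-update-frequency interfaces.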



Abstract

This application discloses a cache implementation method for interface calls that effectively combines a centralized cache and a stand-alone cache. It uses interface version-number information stored in the stand-alone Ehcache cache to monitor the timeliness of interface information in the stand-alone memory cache, and when an interface is called, the return result is preferentially obtained from the local stand-alone cache. In this way, the local stand-alone cache can be fully utilized when the interface update frequency is low, while the centralized cache is used to obtain the return result when the update frequency is high. This solves the data-consistency problem of the stand-alone cache while making full use of it, effectively reduces the pressure on the centralized cache, reduces network transmission overhead, and avoids waste of system resources.

Description

Technical field

[0001] The invention relates to caching technology, and in particular to a cache implementation method for interface calls.

Background technique

[0002] At present, large-scale, high-concurrency, high-load WEB systems on the Internet and in other fields generally use caching technology to improve performance. Without the support of caching technology, the high concurrency of such a WEB system cannot be handled, or can be handled only at great cost. An excellent caching solution is therefore very important for large-scale high-concurrency systems.

[0003] Depending on system scale, there are two existing caching technologies: stand-alone caching and centralized caching. Stand-alone caching means that the WEB application and the cache belong to the same application, which is the simplest caching strategy; a HashMap or List used in a Java-based program already falls into the category of caching. Ce...
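The stand-alone caching the background describes, where a plain in-process `HashMap` memoizes expensive lookups, can be illustrated with a few lines of Java. The class name and the loader parameter are illustrative assumptions, not from the patent.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Minimal stand-alone (in-process) cache of the kind mentioned in the
// background: a plain HashMap that memoizes results of an expensive lookup.
// Note: HashMap is not thread-safe; a real WEB system would need a
// concurrent structure or a library such as Ehcache.
public class SimpleLocalCache {
    private final Map<String, String> store = new HashMap<>();

    // computeIfAbsent invokes the loader only on a cache miss,
    // so repeated calls with the same key hit the in-memory copy.
    public String getOrLoad(String key, Function<String, String> loader) {
        return store.computeIfAbsent(key, loader);
    }
}
```

Such a cache is fast because every access stays in process memory, but, as the rest of the description argues, each machine's copy can silently go stale, which is exactly the consistency problem the combined scheme addresses.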

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06F9/44, G06F12/0813, G06F3/06
Inventors: 李鹏涛, 王进思, 李杰, 赵玉勇
Owner: 重庆天极云服科技有限公司