Method, system, and device for multi-level caching in which tasks in a thread pool share the main thread's cache

A thread pool and main thread technology, applied in special data processing applications, instruments, and electrical and digital data processing. It solves the problems that asynchronous tasks cannot share the data of main-thread tasks and that system processing capacity becomes low, and achieves the effects of fast processing speed and reduced pressure.

Status: Inactive | Publication Date: 2021-10-22
广州嘉为科技有限公司

AI Technical Summary

Problems solved by technology

[0003] To sum up, the main problem in the prior art is that, within one request, an asynchronous task cannot share the data of the main-thread task, resulting in repeated network calls and lower system processing capacity.
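A minimal Java sketch of this problem follows. It is illustrative only and not code from the patent: the class RepeatedMiddlewareCalls and the helper fetchFromMiddleware are hypothetical stand-ins for a real cache-middleware client call (for example a Redis GET).

```java
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class RepeatedMiddlewareCalls {

    // Hypothetical stand-in for a cache-middleware lookup; in a real system
    // every invocation is a network round trip.
    static Object fetchFromMiddleware(String key) {
        System.out.println(Thread.currentThread().getName() + " -> network call for " + key);
        return "value-of-" + key;
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(4);

        // The main thread fetches the data once for this request ...
        Object mainThreadCopy = fetchFromMiddleware("user:42");
        System.out.println("main thread got " + mainThreadCopy);

        // ... but the asynchronous sub-tasks cannot share it, so each task
        // repeats the same network call, which lowers processing capacity.
        List<Callable<Object>> subTasks = List.of(
                () -> fetchFromMiddleware("user:42"),
                () -> fetchFromMiddleware("user:42"),
                () -> fetchFromMiddleware("user:42"));
        for (Future<Object> result : pool.invokeAll(subTasks)) {
            result.get();
        }

        pool.shutdown();
    }
}
```

Running the sketch prints one "network call" line for the main thread and one more per sub-task, even though all four lookups are for the same key within the same request.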



Examples


Detailed Description of the Embodiments

[0036] The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure. Rather, they are merely examples of approaches consistent with aspects of the disclosure as recited in the appended claims.

[0037] The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to limit the present disclosure. As used in this disclosure and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly dictates otherwise. It should also be understood that the term "and/or" as used herein refers to and includes any and all possible combinations of one or more of the associated listed items.

[0038] Refer to Figure 1 and Figure 2. Figure 1 is an exemplary flow chart of the multi-level caching method in which tasks in the thread pool share the main thread's cache according to the present...



Abstract

The invention discloses a method, system, and device for multi-level caching in which tasks in a thread pool share the main thread's cache. The method comprises the following steps: setting a multi-level cache query method; defining a thread pool and setting a task execution method for the thread pool; starting a main thread and caching data to the cache middleware and the local cache according to the multi-level cache query method; and creating sub-tasks in the main thread, submitting the sub-tasks to the defined thread pool, executing the sub-tasks, and obtaining the data from the local cache. The system comprises: a definition setting module, which is used for setting the multi-level cache query method, defining the thread pool, and setting the task execution method for the thread pool; and a command execution module, which is used for starting the main thread, caching data to the cache middleware and the local cache according to the multi-level cache query method, creating sub-tasks in the main thread, submitting the sub-tasks to the defined thread pool, executing the sub-tasks, and acquiring the data from the local cache.
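To make the workflow in the abstract concrete, here is a minimal sketch in Java. It is an illustration under assumptions, not the patented implementation: the names MultiLevelCacheDemo, queryWithCache, LOCAL_CACHE, and MIDDLEWARE_CACHE are invented for the example, and a plain ConcurrentHashMap stands in for real cache middleware such as Redis.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.function.Function;

public class MultiLevelCacheDemo {

    // Level 1: local in-memory cache, visible to the main thread and to pool tasks.
    private static final Map<String, Object> LOCAL_CACHE = new ConcurrentHashMap<>();
    // Level 2: stand-in for cache middleware (e.g. Redis); a map keeps the sketch self-contained.
    private static final Map<String, Object> MIDDLEWARE_CACHE = new ConcurrentHashMap<>();

    // Multi-level cache query: local cache first, then middleware, then the original data source.
    static Object queryWithCache(String key, Function<String, Object> sourceLoader) {
        Object value = LOCAL_CACHE.get(key);
        if (value != null) {
            return value;                      // local hit: no network call at all
        }
        value = MIDDLEWARE_CACHE.get(key);     // in a real system this is a network call
        if (value == null) {
            value = sourceLoader.apply(key);   // load from the original data source
            MIDDLEWARE_CACHE.put(key, value);  // cache to the middleware
        }
        LOCAL_CACHE.put(key, value);           // cache locally so sub-tasks can share it
        return value;
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(4);   // the defined thread pool

        // Main thread: query once; this populates the middleware and the local cache.
        queryWithCache("user:42", key -> "data-for-" + key);

        // Sub-tasks created in the main thread and submitted to the pool
        // read the already-populated local cache instead of re-querying.
        Future<Object> subTask = pool.submit(() -> LOCAL_CACHE.get("user:42"));
        System.out.println(subTask.get());     // prints data-for-user:42

        pool.shutdown();
    }
}
```

The key point mirrored from the abstract is the last step: sub-tasks submitted to the thread pool read from the local cache that the main thread has already populated, so they avoid a second round trip to the cache middleware within the same request.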

Description

Technical Field

[0001] The invention relates to the technical field of computer development and application, and in particular to a multi-level caching method, system, and device in which tasks in a thread pool share the main thread's cache.

Background

[0002] In the development of application systems, many techniques are used to respond to requests more quickly and improve the QPS of the system, such as asynchronous execution and caching. When part of a request is handled asynchronously, caching often has to be implemented with the help of caching middleware; however, querying the caching middleware is itself a network call. If a local cache is used instead, consistency between the local cache and the caching middleware must be maintained, and in a multi-machine, high-QPS deployment the maintenance cost is high, the benefit is reduced, and the coding difficulty is considerable.

[0003] To sum up, the main problem in the prior art is that, within one request, asynchronous tasks...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F16/2455
CPC: G06F16/24552
Inventors: 张坚欣, 侯剑华, 邹方勇
Owner: 广州嘉为科技有限公司