
Parallel task processing method based on task decomposition

A task processing and task decomposition technology, applied in the fields of parallel task processing and computer applications. It addresses problems such as difficult business updating and maintenance and a mismatch with established programming ideas, achieving the effect of easy encapsulation.

Active Publication Date: 2015-07-22
ZHEJIANG UNIV
Cites: 5 | Cited by: 11

AI Technical Summary

Problems solved by technology

[0004] (1) When concurrency is high, the thread pool must allocate a thread for each trigger event, and thread context switching, memory synchronization, and similar operations introduce additional overhead.
[0005] (2) Developers are required to understand the details of task processing and to design a specialized class structure, which does not conform to object-oriented programming ideas and design patterns, and hinders business updating and maintenance.

Method used


Embodiment Construction

[0021] The present invention will be further described in detail below with reference to the accompanying drawings and specific embodiments.

[0022] A parallel task processing method based on task decomposition according to the present invention comprises the following steps:

[0023] (1) Create task decomposition queues, task allocation queues, response filtering queues, and various task processing queues; the task decomposition queues, task allocation queues, response filtering queues, and various task processing queues are all implemented based on blocking queues;
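A minimal sketch of step (1) in Java, assuming placeholder `TaskObject` and `Subtask` classes whose fields the patent does not define; the patent only states that all queues are built on blocking queues, so `LinkedBlockingQueue` is one reasonable, illustrative choice.

```java
import java.util.Map;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.LinkedBlockingQueue;

// Placeholder types; their fields are assumptions, not taken from the patent.
class TaskObject { int taskCode; int state; }
class Subtask { int type; }

public class Queues {
    // Step (1): every queue is implemented on top of a blocking queue.
    static final BlockingQueue<TaskObject> taskDecompositionQueue = new LinkedBlockingQueue<>();
    static final BlockingQueue<TaskObject> taskAllocationQueue = new LinkedBlockingQueue<>();
    static final BlockingQueue<TaskObject> responseFilterQueue = new LinkedBlockingQueue<>();

    // One task processing queue per subtask type; two types are assumed here for illustration.
    static final Map<Integer, BlockingQueue<Subtask>> processingQueues = new ConcurrentHashMap<>();
    static {
        processingQueues.put(0, new LinkedBlockingQueue<>());
        processingQueues.put(1, new LinkedBlockingQueue<>());
    }
}
```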

[0024] (2) Create task decomposition threads, task allocation threads, response filtering thread groups and various task processing thread groups;
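A sketch of step (2), assuming a generic worker loop; the `QueueWorker` class, group sizes, and thread names are illustrative and not specified by the patent.

```java
import java.util.concurrent.BlockingQueue;
import java.util.function.Consumer;

// A generic worker: each thread blocks on one queue and handles whatever it takes out.
class QueueWorker<T> implements Runnable {
    private final BlockingQueue<T> queue;
    private final Consumer<T> handler;

    QueueWorker(BlockingQueue<T> queue, Consumer<T> handler) {
        this.queue = queue;
        this.handler = handler;
    }

    @Override
    public void run() {
        try {
            while (!Thread.currentThread().isInterrupted()) {
                handler.accept(queue.take()); // blocks until an item arrives
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt(); // exit cleanly on shutdown
        }
    }
}

class ThreadGroups {
    // Step (2): start a fixed-size group of threads for one queue,
    // instead of allocating a new thread for every trigger event.
    static void startGroup(String name, int size, Runnable worker) {
        for (int i = 0; i < size; i++) {
            new Thread(worker, name + "-" + i).start();
        }
    }
}
```

For example, `ThreadGroups.startGroup("response-filter", 4, new QueueWorker<>(Queues.responseFilterQueue, obj -> { /* filter responses */ }))` would start a response filtering thread group of four threads.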

[0025] (3) The task decomposition thread obtains a task object from the task decomposition queue, decomposes it into subtasks according to the task code of the task object, and stores the subtasks in the task hash table. After the decomposition is completed, the task obj...
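A sketch of the decomposition loop in step (3), reusing the placeholder types above. `decomposeByCode` is a hypothetical helper, since the patent only states that decomposition is driven by the task code; the truncated sentence leaves the post-decomposition handling of the task object unspecified, so it is not modeled here.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ConcurrentHashMap;

// Step (3): the task decomposition thread takes task objects from the decomposition
// queue, splits them into subtasks by task code, and records the subtasks in a hash table.
class DecompositionWorker implements Runnable {
    private final BlockingQueue<TaskObject> decompositionQueue;
    private final Map<TaskObject, List<Subtask>> taskHashTable = new ConcurrentHashMap<>();

    DecompositionWorker(BlockingQueue<TaskObject> decompositionQueue) {
        this.decompositionQueue = decompositionQueue;
    }

    @Override
    public void run() {
        try {
            while (!Thread.currentThread().isInterrupted()) {
                TaskObject task = decompositionQueue.take();    // blocking read
                List<Subtask> subtasks = decomposeByCode(task); // split by task code
                taskHashTable.put(task, subtasks);              // keep subtasks for later filtering
                // What happens to the task object after decomposition is truncated in the source text.
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    // Hypothetical rule for illustration: each bit of the task code selects one subtask type.
    private List<Subtask> decomposeByCode(TaskObject task) {
        List<Subtask> subtasks = new ArrayList<>();
        for (int type = 0; type < 2; type++) {
            if ((task.taskCode & (1 << type)) != 0) {
                Subtask s = new Subtask();
                s.type = type;
                subtasks.add(s);
            }
        }
        return subtasks;
    }
}
```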



Abstract

The invention discloses a parallel task processing method based on task decomposition. In this method, event trigger objects do not need to be passed through a fixed sequence of queues; the queue into which an object is inserted is decided by the object's state parameters. A thread does not need to know which task queue the object it operates on came from, or which task queue the object will be inserted into next; each thread only needs to complete its own task processing. Event trigger objects, task queues, and threads therefore do not affect one another, which facilitates class encapsulation. When the multiple subtasks of a service have no order dependencies, the subtasks can be inserted into multiple queues at the same time and processed in parallel. A fixed number of processing threads is assigned to the tasks, so a dedicated thread does not need to be created for each object.
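The routing idea in the abstract can be sketched as follows, assuming a numeric `state` field on the placeholder `TaskObject` and a map from state values to queues; both are illustrative choices, not defined by the patent.

```java
import java.util.Map;
import java.util.concurrent.BlockingQueue;

// State-driven routing: the object's state parameter, not the calling thread,
// decides which queue the object is inserted into next.
class StateRouter {
    private final Map<Integer, BlockingQueue<TaskObject>> queuesByState;

    StateRouter(Map<Integer, BlockingQueue<TaskObject>> queuesByState) {
        this.queuesByState = queuesByState;
    }

    void route(TaskObject obj) throws InterruptedException {
        BlockingQueue<TaskObject> next = queuesByState.get(obj.state);
        if (next != null) {
            next.put(obj); // blocks if the target queue is full
        }
        // A worker thread only calls route(); it never needs to know which queue the
        // object came from or which concrete queue it ends up in.
    }
}
```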

Description

Technical Field

[0001] The invention relates to the field of computer applications, and in particular to the field of parallel task processing.

Background Technique

[0002] Figure 1 shows a model of thread-based serial task processing. In order not to block the task-reading thread, an independent thread is requested from the thread pool to process each task; after the task is completed, the thread is returned to the pool for reuse.

[0003] Although this model achieves concurrent processing of tasks, when a task includes multiple subtasks of other types, the worker thread can only process those subtasks sequentially. The serial task processing model mainly has the following disadvantages:

[0004] (1) When concurrency is high, the thread pool must allocate a thread for each trigger event, and thread context switching and memory synchronization introduce additional overhead.

[0005] (2) Developers are required to understand the details of t...
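A small illustration of the serial model criticized above (not part of the patent): one pooled thread per trigger event, with that thread working through the event's subtasks one after another.

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// The serial model of Figure 1: every trigger event gets its own pooled thread,
// and that thread processes the event's subtasks strictly one after another.
class SerialModelDemo {
    private static final ExecutorService pool = Executors.newCachedThreadPool();

    static void onTriggerEvent(List<Runnable> subtasks) {
        pool.submit(() -> {
            for (Runnable subtask : subtasks) {
                subtask.run(); // subtasks of the same event never overlap in time
            }
        });
    }
}
```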

Claims


Application Information

IPC(8): G06F9/48
Inventors: 王友钊, 黄静
Owner: ZHEJIANG UNIV