
Task scheduling method and system, computing equipment and storage medium

A task scheduling technology applied in the field of data processing. It addresses problems such as the same data being processed by multiple nodes simultaneously, single-node failure, and a lack of high availability, improving the efficiency and robustness of data processing.

Active Publication Date: 2021-11-26
CHINA MOBILE GROUP ZHEJIANG +1

AI Technical Summary

Problems solved by technology

[0003] However, with the first method, when the amount of data to be processed is large, processing is slow and efficiency is low. With the second method, before processing each piece of data it is necessary to verify whether the data has already been processed, and multiple nodes may query the same unprocessed piece of data at the same time, which can cause one piece of data to be processed by multiple nodes simultaneously.
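The duplicate-processing problem described above comes from a non-atomic check-then-mark step. A minimal sketch (assuming a hypothetical in-memory task table; the patent does not specify the storage) shows how making the check and the mark a single critical section guarantees each record is claimed exactly once, even with many concurrent workers:

```python
import threading

# Hypothetical in-memory "task table": record id -> processed flag.
records = {i: False for i in range(100)}
claimed = []             # records actually handed out to workers
lock = threading.Lock()  # makes the check-then-mark step atomic

def claim_next():
    """Atomically find one unprocessed record and mark it, or return None."""
    with lock:
        for rid, done in records.items():
            if not done:
                records[rid] = True  # mark inside the same critical section
                return rid
    return None

def worker():
    while (rid := claim_next()) is not None:
        claimed.append(rid)  # list.append is thread-safe in CPython

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Every record is claimed exactly once: no duplicates, none missed.
assert sorted(claimed) == list(range(100))
```

Without the lock, two workers could both observe the same record as unprocessed before either marks it, which is exactly the race the patent's scheduling node is designed to avoid.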




Embodiment Construction

[0021] Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. Although exemplary embodiments of the present invention are shown in the drawings, it should be understood that the invention may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided for a more thorough understanding of the present invention and to fully convey its scope to those skilled in the art.

[0022] Figure 1 shows a flow chart of an embodiment of a task scheduling method of the present invention. As shown in Figure 1, the method includes the following steps:

[0023] S101: Multiple task processing nodes send registration requests to the scheduling node.

[0024] In an optional manner, step S101 further includes: waking up multiple task processing nodes according to a preset execution cycle, and randomly generating correspondin...
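Step S101 can be sketched as a registration handshake between processing nodes and the scheduling node. This is a minimal illustration, not the patent's implementation: the excerpt is truncated mid-sentence, so the randomly generated value is assumed here to be a node identifier, with a UUID standing in for whatever random value the patent intends:

```python
import uuid

class SchedulingNode:
    """Hypothetical scheduling node that accepts registration requests."""

    def __init__(self):
        self.registry = {}  # node_id -> registered node

    def register(self, node):
        """Record the node's registration and acknowledge with its id."""
        self.registry[node.node_id] = node
        return node.node_id

class TaskProcessingNode:
    """Hypothetical task processing node woken by a preset execution cycle."""

    def __init__(self):
        # Assumed: a randomly generated identifier per wake-up (the source
        # text is cut off at this point, so this is an illustrative guess).
        self.node_id = uuid.uuid4().hex

scheduler = SchedulingNode()
nodes = [TaskProcessingNode() for _ in range(3)]
for n in nodes:
    scheduler.register(n)

# All three processing nodes are now known to the scheduling node.
assert len(scheduler.registry) == 3
```

The registry built here corresponds to the "node registration information" that the later splitting step consumes.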



Abstract

The invention discloses a task scheduling method and system, computing equipment and a storage medium. The method comprises the steps that a plurality of task processing nodes send registration requests to a scheduling node; the scheduling node queries whether the task scheduling table contains a to-be-executed task, and if the to-be-executed task is queried, the plurality of task processing nodes are registered to generate node registration information; and the scheduling node obtains to-be-processed data corresponding to the to-be-executed task, splits the to-be-processed data according to the number of the to-be-processed data and the node registration information, and allocates the split to-be-processed data to the plurality of task processing nodes for processing. According to the invention, the to-be-executed task is dynamically scheduled to the plurality of task processing nodes through the scheduling node, and the data is concurrently processed through the plurality of task processing nodes, so that the situation that the plurality of task processing nodes compete to process the task or repeatedly process the task is avoided, the data processing efficiency is greatly improved, and the efficiency and robustness of large-batch data processing are improved.
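The abstract's splitting step (divide the to-be-processed data according to its count and the node registration information) can be sketched as follows. The patent excerpt does not specify the split strategy, so an even round-robin partition is assumed here purely for illustration:

```python
def split_data(pending, node_ids):
    """Divide pending records as evenly as possible among registered nodes.

    Round-robin by index: record i goes to node i mod len(node_ids).
    This is one plausible reading of "splits the to-be-processed data
    according to the number of the to-be-processed data and the node
    registration information"; the patent may use a different strategy.
    """
    shards = {nid: [] for nid in node_ids}
    for i, record in enumerate(pending):
        shards[node_ids[i % len(node_ids)]].append(record)
    return shards

# 10 records split across 3 registered nodes.
shards = split_data(list(range(10)), ["node-a", "node-b", "node-c"])
# node-a gets 4 records; node-b and node-c get 3 each.
```

Because every record is assigned to exactly one node before processing begins, no node needs to re-check whether a record was already handled, which is how the scheme avoids the competing-access problem of the prior art.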

Description

technical field

[0001] The present invention relates to the technical field of data processing, in particular to a task scheduling method, system, computing device and storage medium.

Background technique

[0002] During the construction of a cloud operation and maintenance platform, it is necessary to collect and monitor various cloud resources, processes, services and other component instances in real time, and the volume of monitored data has grown exponentially. In the prior art, when large-scale data is collected, a scheduled task is used for data synchronization and comparison. Two methods are commonly used: in the first, the scheduled task is deployed on a single node and the data is processed piece by piece; in the second, the scheduled task is deployed on multiple nodes, and each piece of data is marked after it is processed. Before processing each piece of data with the second method, it is necessary to determine whether the data has bee...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F9/48
CPC: G06F9/4881; Y02D10/00
Inventor: 李志勇, 陈挺, 顾黎斌, 丁强, 赵华锋
Owner: CHINA MOBILE GROUP ZHEJIANG