Task allocation method and system in node cluster, and node

A task allocation and clustering technology applied in the field of deep learning, addressing problems such as the high cost of multi-core CPUs, low computing density, and the inability to mix CPU and GPU resources for task allocation.

Inactive Publication Date: 2017-09-05
NETPOSA TECH
Cites: 5 · Cited by: 46

AI Technical Summary

Problems solved by technology

[0003] While deep learning brings higher recognition rates and richer video information, it also introduces a very large task-computation workload. The traditional approach is to perform this computation on CPUs; however, when the workload is too large, the computing speed of a single-core or even a multi-core CPU still cannot meet the usage requirements, and even where multi-core computing does meet them, it comes with the drawbacks of expensive multi-core CPUs and low computing density. In the prior art, dedicated GPU computing cards are used to complete deep learning tasks. Although this improves computing speed, CPU node clusters or GPU node clusters are set up and used separately for task execution: the CPU node cluster and the GPU node cluster are managed and used independently of each other, so mixed computing across the GPU node cluster and the CPU node cluster cannot be performed, and the CPU resources and GPU resources in the cluster cannot be mixed for task allocation.

Embodiment Construction

[0065] To make the purpose, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described below clearly and completely in conjunction with the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. The components of the embodiments, as generally described and illustrated in the figures herein, may be arranged and designed in a variety of different configurations. Accordingly, the following detailed description of the embodiments provided in the accompanying drawings is not intended to limit the scope of the claimed invention, but merely represents selected embodiments of the invention. Based on the embodiments of the present invention, all other embodiments obtained by those skilled in the art without making...

Abstract

The invention provides a task allocation method and system in a node cluster, and a node. The node cluster comprises a plurality of computing nodes, including GPU nodes and CPU nodes; a plurality of GPU nodes constitute a GPU node cluster, and a plurality of CPU nodes constitute a CPU node cluster. The method comprises the following steps: a center node receives a user request, wherein the user request carries a to-be-executed task and a user-defined parameter; the center node controls any computing node to determine the server resource type and the service magnitude necessary for executing the task; and the center node allocates the task to the corresponding computing node according to the server resource type and service magnitude necessary for executing the task, together with the state information of the current computing nodes. With the method provided by the embodiments of the invention, unified management and mixed use of the CPU node cluster and the GPU node cluster can be realized.
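
For illustration only, below is a minimal Python sketch of the allocation flow described in the abstract, under the assumption that each computing node reports its resource type (CPU or GPU) and a free-capacity figure to the center node. All class, field, and parameter names (CenterNode, NodeState, free_capacity, use_gpu, magnitude, and so on) are hypothetical and not taken from the patent, and the least-loaded selection rule is just one possible policy, not necessarily the one claimed.

```python
# Hypothetical sketch of the allocation flow; names are illustrative, not from the patent.
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class ResourceType(Enum):
    CPU = "cpu"
    GPU = "gpu"


@dataclass
class NodeState:
    """State information reported by one computing node in the mixed cluster."""
    node_id: str
    resource_type: ResourceType   # whether this node provides CPU or GPU capacity
    free_capacity: int            # remaining service magnitude the node can absorb


@dataclass
class Task:
    """A to-be-executed task carried in the user request."""
    name: str
    user_params: dict             # user-defined parameters from the request


class CenterNode:
    def __init__(self, nodes: list):
        self.nodes = nodes        # current state information of all computing nodes

    def estimate_requirements(self, task: Task):
        """Stand-in for asking any computing node to evaluate the task: returns the
        server resource type and the service magnitude the task needs."""
        rtype = ResourceType.GPU if task.user_params.get("use_gpu") else ResourceType.CPU
        magnitude = int(task.user_params.get("magnitude", 1))
        return rtype, magnitude

    def allocate(self, task: Task) -> Optional[str]:
        """Allocate the task to a node of the required type with enough free capacity."""
        rtype, magnitude = self.estimate_requirements(task)
        candidates = [n for n in self.nodes
                      if n.resource_type == rtype and n.free_capacity >= magnitude]
        if not candidates:
            return None           # no suitable CPU or GPU node is currently available
        chosen = max(candidates, key=lambda n: n.free_capacity)  # least-loaded node
        chosen.free_capacity -= magnitude
        return chosen.node_id


# Usage: a mixed cluster of one CPU node and one GPU node managed by a single center node.
center = CenterNode([
    NodeState("cpu-0", ResourceType.CPU, free_capacity=8),
    NodeState("gpu-0", ResourceType.GPU, free_capacity=4),
])
print(center.allocate(Task("face-recognition", {"use_gpu": True, "magnitude": 2})))   # gpu-0
print(center.allocate(Task("video-indexing", {"use_gpu": False, "magnitude": 3})))    # cpu-0
```

In this sketch the center node needs only the per-node state list to place a task on either side of the mixed cluster, which reflects the unified management and mixed use of the CPU and GPU node clusters that the abstract describes.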

Description

Technical field
[0001] The present invention relates to the technical field of deep learning, and in particular to a task allocation method, node and system in a node cluster.
Background technique
[0002] In recent years, deep learning has achieved remarkable results in applications such as speech recognition, image recognition, and natural language processing. In the security industry, deep learning has begun to be deployed on a large scale, and many security companies have begun to invest resources in developing deep-learning-based tasks and products; it is clear that deep learning is shaping security companies and intelligent video analysis technology, especially in face recognition and vehicle feature recognition applications.
[0003] While deep learning brings higher recognition rates and more video information, it also introduces the problem of a very large amount of task computation. The traditional method is to use the CPU to complet...

Application Information

IPC(8): H04L29/08, G06F9/50
CPC: H04L67/1012, H04L67/1029, G06F9/505, G06F9/5083, H04L67/1001
Inventors: 周光明, 李岩
Owner: NETPOSA TECH