
Computing resource allocation method and device based on hybrid distribution architecture and storage medium

A hybrid-computing and computing-resource technology applied in the field of big data processing. It addresses problems such as resource preemption, inconsistent configuration processes and user interfaces across frameworks, and the inability of CPU-only processing to meet the real-time requirements of computing tasks, thereby achieving efficient and flexible scheduling and supporting diverse computing tasks.

Active Publication Date: 2020-01-17
NAT COMP NETWORK & INFORMATION SECURITY MANAGEMENT CENT

AI Technical Summary

Problems solved by technology

In recent years, many deep learning frameworks built on Spark have been proposed. Although they are all built on Spark, their configuration processes and user interfaces differ, and resource preemption occurs when multiple frameworks are used at the same time.
On the other hand, the core resource of traditional computing has generally been the central processing unit (CPU). With the development of artificial intelligence and related technologies, however, computing requirements have diversified, and the CPU alone can no longer meet the real-time requirements of computing tasks.



Embodiment Construction

[0027] To further explain the technical means and effects adopted by the present invention to achieve its intended purpose, the present invention is described in detail below with reference to the accompanying drawings and preferred embodiments.

[0028] As shown in Figure 1, an embodiment of the present invention provides a distributed computing system based on hybrid computing resources, including a computing engine layer 11, a hybrid computing encapsulation layer 13, and a resource scheduling layer 12, wherein:

[0029] The computing engine layer 11 is composed of a plurality of deep learning frameworks built on the same Spark computing engine;

[0030] The hybrid computing encapsulation layer 13 uniformly encapsulates the access interfaces of the deep learning frameworks for the computing engine layer 11;

[0031] The resource scheduling layer 12 includes multiple heterogeneous computing resources, the heterogeneous computing resources including at least one of a CPU, a GPU, and an FPGA.
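
To make the layered structure of paragraphs [0028] to [0031] concrete, here is a minimal Python sketch of how a layer like the hybrid computing encapsulation layer 13 could present one uniform access interface over several Spark-based deep learning frameworks. The class and method names (HybridComputingEncapsulationLayer, register, submit) are illustrative assumptions, not the patent's actual implementation; TensorFlowOnSpark and BigDL are named only as examples of frameworks built on Spark.

```python
# Minimal sketch of the three-layer structure described above.
# All class and method names are hypothetical illustrations.

from abc import ABC, abstractmethod


class DeepLearningFramework(ABC):
    """A deep learning framework built on the shared Spark engine (layer 11)."""

    @abstractmethod
    def submit(self, task: dict) -> str:
        ...


class TensorFlowOnSparkFramework(DeepLearningFramework):
    def submit(self, task: dict) -> str:
        return f"TensorFlowOnSpark running {task['name']}"


class BigDLFramework(DeepLearningFramework):
    def submit(self, task: dict) -> str:
        return f"BigDL running {task['name']}"


class HybridComputingEncapsulationLayer:
    """Layer 13: exposes one uniform access interface over every framework,
    so callers need not learn each framework's own configuration process."""

    def __init__(self) -> None:
        self._frameworks: dict[str, DeepLearningFramework] = {}

    def register(self, name: str, framework: DeepLearningFramework) -> None:
        self._frameworks[name] = framework

    def submit(self, framework_name: str, task: dict) -> str:
        # Single entry point, regardless of which framework runs the task.
        return self._frameworks[framework_name].submit(task)


layer13 = HybridComputingEncapsulationLayer()
layer13.register("tfos", TensorFlowOnSparkFramework())
layer13.register("bigdl", BigDLFramework())
print(layer13.submit("tfos", {"name": "image-classification"}))
```

The point of the uniform interface is that adding another Spark-based framework only requires one more register call; callers keep using the same submit signature.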



Abstract

The invention provides a distributed computing system based on hybrid computing resources for the reasonable allocation of resources and for meeting the diverse requirements of computing tasks. The system comprises a computing engine layer and a resource scheduling layer. The computing engine layer is composed of a plurality of deep learning frameworks constructed on the same Spark computing engine, and the access interfaces of all the deep learning frameworks are encapsulated in a unified manner for the computing engine layer. The resource scheduling layer comprises a plurality of heterogeneous computing resources, the heterogeneous computing resources comprising at least one of a CPU, a GPU, and an FPGA. In the resource scheduling layer, the tasks to be processed are divided into different task queues according to their task types, different logical clusters are formed according to the types of computing resources carried by the physical machines, and the tasks in each queue are dispatched to the corresponding logical cluster for execution according to their task type.
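
The dispatch rule in the abstract (task queues split by task type, logical clusters split by resource type, tasks routed to the matching cluster) can be illustrated with a short sketch. The queue names, the type-to-resource mapping, and the round-robin placement below are assumptions for illustration only, not the patent's concrete scheduling algorithm.

```python
# Hedged sketch of the queue-to-cluster dispatch described in the abstract.

from collections import defaultdict, deque

# Task queues, keyed by task type (one queue per type of pending task).
task_queues: dict[str, deque] = defaultdict(deque)

# Logical clusters: physical machines grouped by the computing resource
# they carry (CPU, GPU, or FPGA). Hostnames are illustrative.
logical_clusters = {
    "cpu": ["host-01", "host-02"],
    "gpu": ["host-03"],
    "fpga": ["host-04"],
}

# An assumed mapping from task type to the resource type suited to it.
TYPE_TO_RESOURCE = {
    "etl": "cpu",
    "dl-training": "gpu",
    "stream-inference": "fpga",
}


def enqueue(task_name: str, task_type: str) -> None:
    """Place a pending task into the queue for its task type."""
    task_queues[task_type].append(task_name)


def dispatch() -> list[tuple[str, str]]:
    """Drain every queue, sending each task to its matching logical cluster."""
    assignments = []
    for task_type, queue in task_queues.items():
        cluster = logical_clusters[TYPE_TO_RESOURCE[task_type]]
        for i in range(len(queue)):
            task = queue.popleft()
            # Round-robin over the machines inside the chosen cluster.
            assignments.append((task, cluster[i % len(cluster)]))
    return assignments


enqueue("resnet-train", "dl-training")
enqueue("log-clean", "etl")
print(dispatch())  # [('resnet-train', 'host-03'), ('log-clean', 'host-01')]
```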

Description

Technical field

[0001] The present invention relates to the technical field of big data processing, and in particular to a computing resource allocation method, device, and storage medium based on a hybrid distributed architecture.

Background technique

[0002] The rise of big data technology has reinvigorated artificial intelligence. The Go matches of 2016 rekindled public enthusiasm for artificial intelligence, and it was the support of mature, stable big data technology that allowed the massive computing tasks behind AlphaGo to be completed. Apache Spark is a fast, general-purpose computing engine designed for large-scale data processing. It retains the advantages of Hadoop MapReduce, but unlike MapReduce it can keep the intermediate output of jobs in memory, so jobs need not read and write HDFS (a distributed file system); Spark is therefore better suited to iterative data mining and machine learning algorithms. ...
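
The point in paragraph [0002] about iterative workloads can be seen in a small PySpark example: caching an RDD keeps intermediate data in executor memory, so repeated passes avoid re-reading from HDFS. This is a generic illustration of Spark's caching behavior, not the patent's method; the file path and the toy iterative computation are assumptions.

```python
# Hedged PySpark sketch: cache() keeps parsed data in memory so each
# iteration reuses it instead of re-reading and re-parsing the file.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("iterative-demo").getOrCreate()

# Load once; points is an RDD of floats parsed from a text file.
points = spark.sparkContext.textFile("hdfs:///data/points.txt") \
    .map(float) \
    .cache()  # keep the parsed data in executor memory across iterations

# A toy iterative computation (repeatedly raising a threshold): every
# pass reuses the cached RDD rather than touching HDFS again.
threshold = 0.0
for _ in range(10):
    above = points.filter(lambda x: x > threshold)
    if above.isEmpty():
        break
    threshold = above.mean()

print("converged threshold:", threshold)
spark.stop()
```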


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F9/50; G06F9/455; G06F9/48; H04L29/08
CPC: G06F9/5027; G06F9/5038; G06F9/45558; G06F9/4881; H04L67/10; G06F2009/45595; G06F2209/484; G06F2209/5021; Y02D10/00
Inventors: 钮艳, 杜翠兰, 赵淳璐, 李扬曦, 项菲, 李鹏霄, 佟玲玲, 张丽, 王祥, 井雅琪
Owner: NAT COMP NETWORK & INFORMATION SECURITY MANAGEMENT CENT