
Task and data scheduling method and device based on hybrid memory

A data scheduling and hybrid memory technology, applied in the field of data processing, which addresses the problem of high energy consumption when reading data

Active Publication Date: 2016-07-13
HUNAN UNIV

AI Technical Summary

Problems solved by technology

[0006] The present invention mainly addresses the technical problem that task and data scheduling methods in the prior art consume too much energy when reading data. It proposes a task and data scheduling method based on hybrid memory that fully considers the influence of data on task scheduling and improves the ability to identify data, and that, compared with other task scheduling techniques, greatly reduces energy consumption within the same time period.


Examples


Embodiment 1

[0046] Figure 1 is a flow chart of an implementation of the hybrid memory-based task and data scheduling method provided by an embodiment of the present invention. As shown in Figure 1, the task and data scheduling method based on hybrid memory (DRAM and SSD) provided by the embodiment of the present invention includes the following steps:

[0047] Step 1. Obtain the input data and output data of each task according to the DAG graph of tasks and data.

[0048] Specifically, each task has data that it needs to read, some tasks also generate data, and some tasks need to use the data generated by another task, so tasks and data have dependencies. A DAG (Directed Acyclic Graph) can represent the dependencies between tasks. For example, if there is an edge v1->v2, v1 produces data d1, and the input data of v2 is d1, then this dependency determines the direction of the edge and c...
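The patent text contains no code; the following is a minimal illustrative sketch (in Python, using hypothetical task names v1, v2 and data names d0, d1, d2 based on the example above) of how a DAG of tasks and data could be represented and how each task's input and output data could be read from it, as in Step 1.

```python
# Hypothetical task/data table: which data each task reads and produces.
tasks = {
    "v1": {"reads": ["d0"], "produces": ["d1"]},
    "v2": {"reads": ["d1"], "produces": ["d2"]},
}

def dag_edges(tasks):
    """An edge v -> w exists when some data produced by v is read by w."""
    edges = []
    for v, v_io in tasks.items():
        for w, w_io in tasks.items():
            if v != w and set(v_io["produces"]) & set(w_io["reads"]):
                edges.append((v, w))
    return edges

def task_io(tasks, name):
    """Input and output data of a single task (Step 1)."""
    return tasks[name]["reads"], tasks[name]["produces"]

print(dag_edges(tasks))      # [('v1', 'v2')] -- v1 produces d1, which v2 reads
print(task_io(tasks, "v2"))  # (['d1'], ['d2'])
```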

Embodiment 2

[0068] Figure 4 is a schematic structural diagram of a hybrid memory-based task and data scheduling device provided by an embodiment of the present invention. As shown in Figure 4, the task and data scheduling device based on hybrid memory (DRAM and SSD) provided by the embodiment of the present invention includes:

[0069] The data acquisition module is used to obtain the input data and output data of each task according to the DAG graph of tasks and data;

[0070] The data classification module is used to classify the input data and output data into shared data and independent data, where shared data is input data of multiple tasks and may be the target of multiple memory access operations, while independent data is used by only one task and is the target of a single memory access operation (a minimal sketch of this classification follows the module list);

[0071] The initialization scheduling module is used to obtain the processor that executes each task fastest, according to the access time of the task to the processor and the memory...


Abstract

The invention relates to the field of data processing and provides a task and data scheduling method and device based on a hybrid memory. The method comprises the following steps: 1, input data and output data of tasks are obtained according to a DAG graph of the tasks and the data; 2, data classification is performed on the input data and the output data to obtain shared data and independent data; 3, the processor which executes each task most quickly is obtained according to the access time of the tasks to the processors and the memory, the tasks are allocated to that processor, and initialized scheduling is completed; 4, the input data is scheduled according to the data classification and the initialized scheduling; 5, the positions of the processor where each task is located and of the memory where the data is located are adjusted according to the energy consumption generated when the processor processes the tasks and the memory accesses the data. The method and device fully consider the influence of the data on task scheduling, improve the data identification capability, and reduce energy consumption.
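Step 5 adjusts where data resides in the hybrid memory according to energy consumption. The sketch below is only an assumption about what such a placement decision might look like, using made-up per-access and static energy figures for DRAM and SSD; the patent text here gives no concrete numbers or formula.

```python
# Hypothetical per-access and static (idle) energy costs, in arbitrary units.
ENERGY = {
    "DRAM": {"per_access": 1.0, "static": 0.5},
    "SSD":  {"per_access": 4.0, "static": 0.1},
}

def cheaper_memory(accesses, residence_time):
    """Return the memory whose estimated energy is lower for this data item."""
    def cost(mem):
        e = ENERGY[mem]
        return accesses * e["per_access"] + residence_time * e["static"]
    return min(("DRAM", "SSD"), key=cost)

# Frequently accessed shared data tends toward DRAM; rarely used data toward SSD.
print(cheaper_memory(accesses=20, residence_time=10))  # DRAM
print(cheaper_memory(accesses=1,  residence_time=10))  # SSD
```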

Description

Technical field

[0001] The invention relates to the technical field of data processing, and in particular to a hybrid memory-based task and data scheduling method and device.

Background technique

[0002] With the rapid development of information technology in recent years, the explosive growth of information has come into conflict with the ability to process information in real time. As enterprises hope to obtain the required information quickly from massive data, their demand for information systems capable of real-time processing of massive data has never been so strong and urgent. However, the performance problems that inevitably arise in such information systems have become a major challenge that enterprises must urgently solve.

[0003] Over the past few years there has been almost no major breakthrough in the speed of computer processors; instead, CPUs have moved from single-core to multi-core, from 2 cores and 4 cores to 8-10...


Application Information

IPC(8): G06F9/48
CPC: G06F9/485; G06F9/4893; Y02D10/00
Inventors: 李肯立, 陈俊杰, 唐卓, 李巧巧, 陈建国, 鲁彬
Owner: HUNAN UNIV