
Cache data migration method and device

A technique involving cache data and a second-level cache, applied in the computer field. It addresses problems such as repeated execution of computing tasks and degraded performance of the Spark execution engine, and achieves the effects of avoiding eviction of needed cache data, avoiding repeated calculations, and protecting engine performance.

Active Publication Date: 2017-12-15
SHENZHEN UNIV

AI Technical Summary

Problems solved by technology

[0005] The main purpose of the present invention is to provide a cache data migration method and device, aiming to solve the technical problem in the prior art that, during unified memory management, when cached intermediate data is evicted and is needed again, the corresponding computing tasks must be re-executed, which causes some computing tasks to be repeated and damages the performance of the Spark execution engine.




Embodiment Construction

[0024] In order to make the purpose, features and advantages of the present invention more obvious and understandable, the technical solutions in the embodiments of the present invention are described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, and not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0025] Referring to figure 2, a schematic flowchart of the implementation of the cache data migration method provided in the first embodiment of the present invention; the method can be applied to computer equipment. As shown in figure 2, the cache data migration method mainly includes the following steps:

[0026] S101...



Abstract

The embodiment of the invention provides a cache data migration method and device, and relates to the field of computers. The method includes: when the free space of the calculation area is insufficient to run a subtask dispatched by the Spark execution engine, calculating the size of the preemption space, where the size of the preemption space is the size of the running space of the subtask minus the size of the free space of the calculation area; calculating the sizes of the free space and the effective space of the storage area, and judging whether the size of the preemption space is larger than the size of the free space of the storage area and whether it is larger than the size of the effective space of the storage area; and when the size of the preemption space is larger than the size of the free space of the storage area but not larger than the size of the effective space of the storage area, migrating cache data from the storage area to a second-level cache, where the size of the cache data migrated to the second-level cache is the size of the preemption space minus the size of the free space of the storage area. The method and device avoid repeated execution of some computing tasks and protect the performance of the Spark execution engine.
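The size comparisons described in the abstract can be sketched as a small pure function. The following Scala sketch is only an illustration of that decision logic under assumed names (taskSpace, freeCalc, freeStorage, effectiveStorage, and the Decision cases); none of these identifiers come from the patent or from Spark itself, and the sketch does not model the actual Spark memory manager.

```scala
// Minimal sketch of the migration decision from the abstract (assumed names).
object CacheMigrationSketch {

  sealed trait Decision
  case object RunDirectly extends Decision                 // calculation area alone suffices
  case object BorrowFreeStorage extends Decision           // storage area's free space covers the preemption
  case class MigrateToSecondLevel(bytes: Long) extends Decision // move this many bytes of cache to L2
  case object CannotSatisfy extends Decision               // preemption space exceeds effective space

  /**
   * @param taskSpace        running space required by the dispatched subtask
   * @param freeCalc         free space of the calculation area
   * @param freeStorage      free space of the storage area
   * @param effectiveStorage effective (reclaimable) space of the storage area
   */
  def decide(taskSpace: Long, freeCalc: Long,
             freeStorage: Long, effectiveStorage: Long): Decision =
    if (taskSpace <= freeCalc) RunDirectly
    else {
      // preemption space = running space of the subtask - free space of the calculation area
      val preemption = taskSpace - freeCalc
      if (preemption <= freeStorage) BorrowFreeStorage
      else if (preemption <= effectiveStorage)
        // migrate just enough cached data to the second-level cache
        MigrateToSecondLevel(preemption - freeStorage)
      else CannotSatisfy
    }
}
```

For example, with taskSpace = 800, freeCalc = 300, freeStorage = 200 and effectiveStorage = 600, the preemption space is 500; it exceeds the storage area's free space but not its effective space, so 500 - 200 = 300 units of cache data would be migrated to the second-level cache.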

Description

Technical field

[0001] The invention relates to the field of computers, and in particular to a cache data migration method and device.

Background technique

[0002] Spark is a distributed computing system that, as a general-purpose, fast, large-scale data processing engine, is widely used in the field of big data processing. Spark caches intermediate calculation results and reads the cached data back in iterative computations, which effectively avoids a large number of repeated calculations and improves the efficiency of big data processing.

[0003] With the continuous expansion of data sets, big data applications depend heavily on memory. More seriously, as computing tasks become ever more complex, memory overflow faults occur frequently in the calculation area. To solve this problem, the Spark execution engine provides a unified memory management model. As shown in figure 1, in the existing unified memory management model, the "...
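Paragraph [0002] above refers to Spark's standard caching of intermediate results. As a hedged illustration only, the following sketch uses the public Spark RDD API (textFile, persist, reduceByKey); the input path and the word-count transformation are made-up placeholders, not part of the patent.

```scala
// Caching an intermediate RDD so iterative jobs reuse it instead of
// recomputing its lineage; if the cached blocks are evicted, the whole
// chain of transformations above the cache point must be re-executed.
import org.apache.spark.sql.SparkSession
import org.apache.spark.storage.StorageLevel

object CachedIterationExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("cache-example").getOrCreate()
    val sc = spark.sparkContext

    // Intermediate result kept in the storage area of unified memory.
    val intermediate = sc.textFile("hdfs:///data/input")   // placeholder path
      .flatMap(_.split("\\s+"))
      .map(w => (w, 1L))
      .reduceByKey(_ + _)
      .persist(StorageLevel.MEMORY_ONLY)

    // Each iteration reads the cached data rather than re-running the lineage.
    (1 to 3).foreach { i =>
      val top = intermediate.top(10)(Ordering.by(_._2))
      println(s"iteration $i, most frequent word = ${top.headOption}")
    }

    spark.stop()
  }
}
```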


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F12/0811
CPC: G06F12/0811
Inventor: 毛睿, 陆敏华, 陆克中, 朱金彬, 隋秀峰
Owner: SHENZHEN UNIV