
Performance pre-evaluation based client cache distributing method and system

A client-side cache allocation technology, applied in transmission systems, electrical components, etc. It can solve the problems of heavy cache resource occupation, low client cache efficiency, slow write operation speed, etc., achieving the effects of low cache resource usage, improved execution efficiency, and fast write operations.

Inactive Publication Date: 2014-03-26
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

[0004] Aiming at the above defects or improvement needs of the prior art, the present invention provides a client cache allocation method and system based on performance pre-evaluation. Its purpose is to solve the technical problems of the existing methods: slow write operations, heavy occupation of client cache resources, and the resulting low efficiency of client-side caching.



Examples


Example

[0057] In order to verify the feasibility and effectiveness of the system of the present invention, the system was deployed in a real environment and experiments were carried out using an authoritative benchmark from the field of supercomputing.

[0058] The basic hardware and software configuration of the cluster used by the present invention is shown in Table 1 below:

[0059]

[0060] Table 1

[0061] The invention first analyzes the file write requests issued by the scientific application to the parallel file system, and collects and counts the program's running information on the cluster. Based on these parameters, it customizes the most reasonable client cache allocation strategy, so that the limited client cache maximizes the performance of the program's file write requests. The system quickly, automatically and effectively provides client cache configuration strategies for parallel file systems, reduces the priority of complex client-side cache ope...
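The strategy-selection step described above can be illustrated with a minimal sketch. This is an illustration only, not the patented implementation: the cost model (cached writes limited by network speed, direct writes queuing behind the most loaded data node's disk) and all names such as ClusterMetrics, CacheStrategy and estimate_write_time are assumptions made for this sketch.

# Illustrative sketch: score candidate client-cache allocation strategies
# from the collected metrics and keep the best one. All names and the cost
# model below are assumptions for illustration, not the patent's model.

from dataclasses import dataclass
from typing import Dict, List


@dataclass
class ClusterMetrics:
    node_load: Dict[str, float]   # outstanding bytes queued on each data node
    net_speed: float              # network bandwidth, bytes/s
    disk_speed: float             # disk write bandwidth, bytes/s


@dataclass
class CacheStrategy:
    name: str
    cached_fraction: float        # share of write traffic absorbed by client cache


def estimate_write_time(strategy: CacheStrategy, metrics: ClusterMetrics,
                        total_bytes: float) -> float:
    """Rough cost model: cached writes complete at network speed, while the
    remaining direct writes queue behind the most loaded data node's disk."""
    cached = total_bytes * strategy.cached_fraction
    direct = total_bytes - cached
    worst_backlog = max(metrics.node_load.values(), default=0.0)
    return cached / metrics.net_speed + (direct + worst_backlog) / metrics.disk_speed


def select_strategy(strategies: List[CacheStrategy], metrics: ClusterMetrics,
                    total_bytes: float) -> CacheStrategy:
    # Pick the strategy whose estimated write time is lowest.
    return min(strategies, key=lambda s: estimate_write_time(s, metrics, total_bytes))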


PUM

No PUM

Abstract

The invention discloses a performance pre-evaluation based client cache distributing method. The method comprises the following steps: first, counting the loads of the different data nodes in a parallel file system while collecting information such as the network speed and the disk read-write speed of the parallel file system; performing performance pre-evaluation of different client cache distribution strategies using the counted and collected information; selecting, based on the performance pre-evaluation result, the client cache distribution strategy that brings the greatest performance; assigning different priorities to different write requests according to the selected strategy; distributing the client cache to the write requests with relatively high priorities; and writing the write requests with relatively low priorities directly to disk. The method can solve the problems of high priority and low efficiency existing in the client cache distribution strategies of existing parallel file systems, and maximizes the performance improvement that the limited client cache can bring.
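The priority-based write path in the abstract can likewise be sketched in a few lines. This is a minimal illustration under assumptions, not the patented implementation; WriteRequest, ClientCache, write_through, dispatch and the threshold parameter are hypothetical names, and the client cache is modeled as a simple byte budget.

# Minimal sketch of priority-based write dispatch: high-priority requests are
# buffered in the client cache, low-priority requests go straight to disk.
# All names here are hypothetical illustrations.

from dataclasses import dataclass, field
from typing import List


@dataclass
class WriteRequest:
    data: bytes
    priority: float          # higher = more benefit expected from client caching


@dataclass
class ClientCache:
    capacity: int            # bytes of client cache granted by the chosen strategy
    used: int = 0
    buffered: List[bytes] = field(default_factory=list)

    def try_buffer(self, req: WriteRequest) -> bool:
        """Buffer the request in the client cache if space remains."""
        if self.used + len(req.data) <= self.capacity:
            self.buffered.append(req.data)
            self.used += len(req.data)
            return True
        return False


def write_through(req: WriteRequest) -> None:
    """Placeholder for a direct write to the data nodes' disks."""
    pass


def dispatch(requests: List[WriteRequest], cache: ClientCache, threshold: float) -> None:
    # High-priority requests are absorbed by the client cache; low-priority
    # requests (or any request the cache cannot hold) bypass it and are
    # written directly to disk, as the abstract describes.
    for req in sorted(requests, key=lambda r: r.priority, reverse=True):
        if req.priority >= threshold and cache.try_buffer(req):
            continue
        write_through(req)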

Description

Technical Field

[0001] The invention belongs to the field of distributed computing models, and more particularly relates to a performance pre-evaluation based client cache allocation method and system.

Background Technique

[0002] In the era of big data, data-intensive computing faces new opportunities and challenges. The traditional stand-alone file system cannot cope with the demands of big data, whereas the parallel file system, thanks to its high throughput, can meet the high-speed read and write requirements that big data brings. However, under highly concurrent data requests, the load of the data nodes becomes unbalanced, causing a sharp decline in the performance of the parallel file system. Improving the performance of parallel file systems has therefore become an important research topic.

[0003] In a parallel file system, the file write strategy is the main factor in improving performance. Traditional distributed file system write strategies are mainly divided ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04L29/08
Inventors: 金海, 石宣化, 黄亚宁, 吴松, 陆路
Owner: HUAZHONG UNIV OF SCI & TECH