High-speed concurrent access method for power system large data files across platform

A power-system data-access technology, applied in electrical digital data processing, memory systems, special data processing applications, etc. It addresses the problem that writing speed and query efficiency are difficult to bring up to the needs of applications, and achieves convenient and stable running.

Active Publication Date: 2009-09-02
STATE GRID ELECTRIC POWER RES INST

AI Technical Summary

Problems solved by technology

With such a massive amount of power information, conventional relational databases will struggle to meet the needs of applications in terms of writing speed and query efficiency.




Detailed Description of the Embodiments

[0015] The present invention will be further described below in conjunction with the accompanying drawings. Figure 1 is a flow chart of file processing by the cache management system in the high-speed concurrent access method for cross-platform power system big data files of the present invention.

[0016] The cache management system of the present invention has a high-speed buffer. The purpose of the buffer is that data which has already been loaded does not need to be loaded again when it is accessed again, thereby improving the efficiency and speed of access. The buffer has a size limit; when the buffer content exceeds this limit, the cache management system automatically removes the least frequently used data from the high-speed buffer.
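The eviction policy described above (drop the least frequently used data once the buffer exceeds its limit) can be sketched as a minimal frequency-counting cache. This is an illustrative sketch, not the patented implementation; all class, method, and parameter names are hypothetical:

```python
from collections import defaultdict

class LFUBufferSketch:
    """Illustrative high-speed buffer: loaded blocks are kept in memory;
    when capacity is exceeded, the least frequently used block is evicted."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.blocks = {}                    # block_id -> cached data
        self.use_count = defaultdict(int)   # block_id -> access frequency

    def get(self, block_id, loader):
        if block_id not in self.blocks:
            # Evict the least frequently used block before loading a new one.
            if len(self.blocks) >= self.capacity:
                victim = min(self.blocks, key=self.use_count.__getitem__)
                del self.blocks[victim]
                del self.use_count[victim]
            self.blocks[block_id] = loader(block_id)
        self.use_count[block_id] += 1
        return self.blocks[block_id]
```

A repeated `get` on the same block increments its frequency and never calls `loader` again, which is the "loaded data is not reloaded" behavior the paragraph describes.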

[0017] The processing flow of the present invention is as follows:

[0018] When the application needs to process data, it obtains the index information for the specific data storage location through an index mechanism. After...
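The first steps of this flow (index lookup, then a buffer check before touching the disk file) can be sketched as follows. The index layout and all names are assumptions for illustration only:

```python
def access_data(record_key, index, cache, load_block):
    """Illustrative access flow: resolve storage location via the index,
    then fetch the containing block through the cache so an already-loaded
    block is not mapped from disk again."""
    # 1. Index lookup: which file and block hold the record, and at what offset?
    file_id, block_no, offset = index[record_key]
    key = (file_id, block_no)
    # 2. Buffer check: reuse the block if it is already loaded.
    if key not in cache:
        cache[key] = load_block(file_id, block_no)   # 3. Load/map only on a miss.
    return cache[key], offset
```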



Abstract

The invention discloses a high-speed concurrent access method for cross-platform power system large data files. A cache management system is set up that carries out unified management of all data files while the system runs and is responsible for interaction with the actual disk files. The data files are processed as memory-mapped files: each data file is divided into fixed-size blocks, and the cache management system concurrently maps each portion of the data files, block by block, into a small address space within the data-processing service process. The cache management system also sets up a high-speed buffer so that data which has already been loaded is not reloaded when accessed again, thereby improving the efficiency and speed of access. Through the interaction of the common cache management system with the actual disk files, the method provides application programs with convenient, fast, and reliable concurrent access to large data files.

Description

Technical Field

[0001] The invention relates to a method for realizing high-speed concurrent access to cross-platform big data files in a power system, and belongs to the field of power system data processing.

Background

[0002] With the continuous expansion of power grid construction and the deepening research and application of digital power grids and digital substations, the scale of data storage faced by power systems has increased by dozens of times, and will shift from the current GB level to the TB level. In addition, with the popularization of PMU acquisition devices and the development of the wide-area dynamic monitoring system (WAMS), the problem of storing massive power information data has become more prominent. Compared with RTU data acquisition, a prominent feature of PMU acquisition is its very high acquisition frequency, reaching 25, 50, or even 100 frames per second, and all data must be completely preserve...
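To make the scale concrete, a rough back-of-the-envelope calculation follows. The frame rate of 100 frames per second is from the text above; the frame size of 1 KB per PMU device is an assumed figure for illustration only:

```python
frames_per_second = 100        # upper PMU rate cited in the background
bytes_per_frame = 1024         # ASSUMED frame size, illustrative only
seconds_per_day = 86400

bytes_per_day = frames_per_second * seconds_per_day * bytes_per_frame
print(bytes_per_day)           # 8847360000 bytes, i.e. roughly 8 GiB
```

Even under this modest assumption, a single device approaches 10 GB per day, so a fleet of PMUs quickly pushes storage from the GB level toward the TB level, as the background states.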


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F17/30, G06F12/08, G06F12/0893
Inventors: 张珂珩, 戴则梅, 葛云鹏, 季学纯
Owner: STATE GRID ELECTRIC POWER RES INST