
Method for storing multiple cache queues in parallel

A storage method and cache-queue technique in the field of parallel storage of multiple cache queues, addressing the problems of high safety-processing requirements, congestion, and large storage-capacity demands.

Active Publication Date: 2018-01-26
Henan Delang Intelligent Technology Co., Ltd. (河南德朗智能科技有限公司)
Cites: 5 · Cited by: 0

AI Technical Summary

Problems solved by technology

[0003] Traditional desktop databases cannot support parallel operation. Network databases, by contrast, share resources over network channels, can be operated in parallel, can distribute different jobs and data processing across different computers, and offer large capacity; but if data is saved in the conventional way, congestion occurs. That is, the conventional approach can neither handle contention for thread resources nor meet the real-time storage requirement imposed by each channel generating collection data every 20 ms.




Embodiment Construction

[0025] The technical solutions of the present invention are described in further detail below through specific embodiments.

[0026] As shown in Figure 1, the method for storing multiple cache queues in parallel includes the following steps:

[0027] S1: Determine the type of the record data and allocate, in memory, a cache space and a data-saving thread for each type of record data. Each cache space contains 1000 cache queues, numbered 0 to 999, and each cache queue is organized as a linked list;
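Step S1 can be sketched in code. The patent's own snippets are C#; the following is a minimal Java illustration, and the class, method, and element-type names (other than the queue count of 1000) are assumptions for the sketch, not the patent's implementation:

```java
import java.util.HashMap;
import java.util.LinkedList;
import java.util.List;
import java.util.Map;

// Sketch of S1: for each record type, allocate a cache space of 1000
// linked-list queues, numbered 0..999. The data-saving threads are
// allocated alongside the spaces but started later (in S3).
public class CacheSpaces {
    public static final int QUEUE_COUNT = 1000;

    // One cache space: an array of 1000 linked lists of raw records.
    public static List<byte[]>[] newCacheSpace() {
        @SuppressWarnings("unchecked")
        List<byte[]>[] queues = new List[QUEUE_COUNT];
        for (int i = 0; i < QUEUE_COUNT; i++) {
            queues[i] = new LinkedList<>();   // each queue is a linked list
        }
        return queues;
    }

    public static void main(String[] args) {
        // The three record types named in the patent.
        String[] types = {"DeepThick", "Wave_New", "AlarmDat"};
        Map<String, List<byte[]>[]> spaces = new HashMap<>();
        for (String t : types) {
            spaces.put(t, newCacheSpace());
        }
        System.out.println(spaces.get("Wave_New").length); // prints 1000
    }
}
```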

[0028] Preferably, the record data is divided into three types: thickness record data (DeepThick), waveform record data (Wave_New), and alarm record data (AlarmDat);

[0029] Specifically, the cache corresponding to the thickness record data is declared as ClassArrDeepBuff[] Arr_Buff_Deep = new ClassArrDeepBuff[1000]; the element type is ClassArrDeepBuff, whose data member is the list List lst_Buff_Deep;
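The declaration above is C#. A Java analog of the same structure, with the names mirrored from the patent (the element type of lst_Buff_Deep is not stated in this excerpt and is an assumption here), might look like:

```java
import java.util.ArrayList;
import java.util.List;

// Java analog of the patent's C# declaration:
//   ClassArrDeepBuff[] Arr_Buff_Deep = new ClassArrDeepBuff[1000];
// The record payload type (double[]) is assumed for illustration.
public class ClassArrDeepBuff {
    // Per-queue buffer of thickness records, mirroring lst_Buff_Deep.
    public final List<double[]> lst_Buff_Deep = new ArrayList<>();

    // Allocate the full cache space of 1000 buffer elements.
    public static ClassArrDeepBuff[] newBuffer() {
        ClassArrDeepBuff[] arrBuffDeep = new ClassArrDeepBuff[1000];
        for (int i = 0; i < arrBuffDeep.length; i++) {
            arrBuffDeep[i] = new ClassArrDeepBuff();
        }
        return arrBuffDeep;
    }
}
```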

[0030] The data structure of ...



Abstract

The invention provides a method for storing multiple cache queues in parallel. The method comprises the following steps: S1, determining the categories of recorded data and allocating, in memory, a cache space and a data-storage thread for each category, wherein each cache space comprises N cache queues connected in linked-list form; S2, receiving recorded data sent by external equipment, determining the corresponding cache space according to the category of the recorded data, and storing the recorded data into the corresponding cache queue in that cache space; and S3, synchronously starting the plurality of data-storage threads, sorting the recorded data from each cache space into a database, and deleting the recorded data already stored in the database from the respective cache spaces.
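The S1–S3 flow above is a producer/consumer pipeline: records are routed by category into per-type caches, and one saver thread per cache drains records into the database and removes them from the cache. A hedged Java sketch of that flow follows; it compresses each cache space to a single blocking queue (the patent uses N linked-list queues per space) and simulates the database with in-memory lists, so all names and the STOP sentinel are illustrative:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Sketch of S1-S3: one cache (queue) and one saver thread per record type;
// saver threads move records into a simulated database, which also removes
// them from the cache (BlockingQueue.take() dequeues the record).
public class ParallelSave {
    static final String[] TYPES = {"DeepThick", "Wave_New", "AlarmDat"};

    public static List<List<String>> runDemo() {
        try {
            List<BlockingQueue<String>> spaces = new ArrayList<>();
            List<List<String>> db = new ArrayList<>();   // one "table" per type
            List<Thread> savers = new ArrayList<>();

            for (int i = 0; i < TYPES.length; i++) {     // S1: allocate caches + threads
                BlockingQueue<String> q = new LinkedBlockingQueue<>();
                List<String> table = new ArrayList<>();
                spaces.add(q);
                db.add(table);
                Thread t = new Thread(() -> {            // S3: saver thread body
                    try {
                        while (true) {
                            String rec = q.take();       // dequeue = delete from cache
                            if (rec.equals("STOP")) return;
                            synchronized (table) { table.add(rec); }
                        }
                    } catch (InterruptedException ignored) { }
                });
                t.start();
                savers.add(t);
            }

            // S2: route incoming records to the cache space of their type.
            spaces.get(0).put("DeepThick#1");
            spaces.get(1).put("Wave_New#1");

            for (BlockingQueue<String> q : spaces) q.put("STOP");
            for (Thread t : savers) t.join();
            return db;
        } catch (InterruptedException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(runDemo().get(0)); // prints [DeepThick#1]
    }
}
```

Because each record type has its own cache and its own thread, the saver threads do not contend with each other for queue access, which is the point of splitting the caches by category.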

Description

Technical field

[0001] The invention relates to a data storage method, and in particular to a method for storing multiple cache queues in parallel.

Background

[0002] Underground gas storage wells are inspected with 64-channel multi-probe equipment that generates measurement data every 20 ms; after processing, this yields thickness records (DeepThick_X_X), waveform records (Wave_New_X_X), and alarm records (AlarmDat_X_X). Waveform records occupy a particularly large amount of data: a single record can reach 6.4 MB, or about 6 KB after compression.

[0003] Traditional desktop databases cannot support parallel operation. Network databases, by contrast, share resources over network channels, can be operated in parallel, can distribute different jobs and data processing across different computers, and offer large capacity; but if data is saved in the conventional way, congestion occurs: the conventional approach can neither handle contention for thread resources nor meet the real-time requirement of storing data generated every 20 ms per channel.
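The background figures imply a substantial sustained write load. A quick back-of-the-envelope check, using only the numbers given in [0002] (64 channels, one measurement per 20 ms, ~6 KB per compressed waveform record) and assuming, as an upper bound, that every measurement yields a waveform record:

```java
// Back-of-the-envelope throughput from the figures in [0002].
public class Throughput {
    // 64 channels, each producing one measurement every 20 ms.
    public static long recordsPerSecond() {
        int channels = 64;
        int periodMs = 20;
        return channels * (1000L / periodMs);   // 64 * 50 = 3200 records/s
    }

    // Upper bound: every record is a compressed waveform (~6 KB each).
    public static double compressedWaveMBPerSecond() {
        double kbPerRecord = 6.0;
        return recordsPerSecond() * kbPerRecord / 1024.0;
    }

    public static void main(String[] args) {
        System.out.println(recordsPerSecond());                        // prints 3200
        System.out.printf("%.1f MB/s%n", compressedWaveMBPerSecond()); // ~18.8 MB/s
    }
}
```

At roughly 3200 records per second, a single synchronous database writer stalls the acquisition path, which motivates the per-category caches and parallel saving threads described above.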

Claims


Application Information

IPC(8): G06F17/30
Inventor: 孙景照, 陈大伟, 韩有华, 周波, 娄旭耀, 夏锋社, 余哲, 贾立军
Owner: Henan Delang Intelligent Technology Co., Ltd. (河南德朗智能科技有限公司)