
Buffering capacity allocation method for massive logs

A buffer allocation method, applied in the field of buffer allocation for massive logs, which solves problems such as unreasonable buffer allocation and achieves the effect of reducing I/O operations and memory usage.

Active Publication Date: 2015-07-01
SURFILTER NETWORK TECH

AI Technical Summary

Problems solved by technology

[0004] The technical problem to be solved by the present invention is to provide a buffer allocation method for massive logs, in view of the defect of unreasonable buffer allocation in the prior art.


Embodiment Construction

[0031] In order to provide a clearer understanding of the technical features, purposes, and effects of the present invention, specific embodiments of the present invention will now be described in detail with reference to the accompanying drawings.

[0032] Figure 1 is a flow chart of a buffer allocation method for massive logs provided by a preferred embodiment of the present invention. The method is used to allocate the buffer amount of a sub-table when reading in massive logs. The overall structure of the WEB server is shown in Figure 3: it includes a general table; the general table is composed of multiple sub-tables, each sub-table is composed of multiple layers, each layer is composed of multiple segments, and a segment is the basic unit of processing (see the structural sketch after the steps below). The method specifically includes:

[0033] S11. Read the logs into the sub-table in real time, and store the logs in a segment of the sub-table; the fields of the lo...
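As a rough illustration of the table hierarchy described in [0032], here is a minimal sketch; the class and field names (GeneralTable, SubTable, Layer, Segment, reference_count, buffer_capacity) are hypothetical and not taken from the patent:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Segment:
        """Basic processing unit; holds log records and per-segment statistics."""
        records: List[dict] = field(default_factory=list)
        reference_count: int = 0   # times field offsets in this segment are referenced
        buffer_capacity: int = 0   # buffer amount allocated to this segment

    @dataclass
    class Layer:
        """A layer groups several segments."""
        segments: List[Segment] = field(default_factory=list)

    @dataclass
    class SubTable:
        """A sub-table consists of several layers; the buffer is allocated per sub-table."""
        layers: List[Layer] = field(default_factory=list)
        total_buffer_capacity: int = 0  # preset total buffer for this sub-table

    @dataclass
    class GeneralTable:
        """The general table is composed of multiple sub-tables."""
        sub_tables: List[SubTable] = field(default_factory=list)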


Abstract

The invention discloses a buffering capacity allocation method for massive logs. The method comprises: an eleventh step of reading logs into a sub-table in real time; a twelfth step of counting the number of references to the offset at which a field first appears; a thirteenth step of establishing the reference volume of each segment and computing the total reference volume of the sub-table; a fourteenth step of performing linear fitting on the reference volume of each segment; and a fifteenth step of allocating a preset total buffering capacity of the sub-table to each segment. The method has the beneficial effects of allocating the buffering capacity reasonably, occupying fewer memory resources, and reducing input/output (I/O) operations by referencing field offsets.
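For illustration only, the sketch below shows one way the five steps in the abstract could be combined into an allocation routine. The function name, the use of numpy.polyfit for the linear fitting, and the proportional split of the preset total buffering capacity are assumptions, not details stated in the patent:

    import numpy as np

    def allocate_buffer(segment_reference_counts, total_buffer_capacity):
        """Split a preset total buffer capacity across the segments of one sub-table.

        segment_reference_counts: per-segment counts of references to field offsets.
        total_buffer_capacity:    preset total buffering capacity of the sub-table.
        """
        counts = np.asarray(segment_reference_counts, dtype=float)

        # Linear fitting of the per-segment reference volume (assumed: a degree-1
        # polyfit over the segment index, used to smooth the raw counts).
        x = np.arange(len(counts))
        slope, intercept = np.polyfit(x, counts, deg=1)
        fitted = np.clip(slope * x + intercept, 0.0, None)

        # Total reference volume of the sub-table, then a proportional split.
        total = fitted.sum()
        if total == 0:
            return [total_buffer_capacity // len(counts)] * len(counts)
        return [int(total_buffer_capacity * v / total) for v in fitted]

    # Hypothetical usage: four segments, 1024 units of buffer to distribute.
    # allocate_buffer([120, 90, 60, 30], total_buffer_capacity=1024)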

Description

Technical Field

[0001] The invention relates to the field of log management, and more specifically, to a method for allocating the buffer amount of massive logs.

Background Technique

[0002] IDC (Internet Data Center) and DNS (Domain Name System) services generate massive logs, which require fast real-time import (1-100,000 per second) and near-real-time search. To achieve these import and search goals, partitioning technology is generally used: a large table is divided into multiple small tables according to certain rules and stored in different areas, so that a table that is logically one table is physically stored in different locations as if it were multiple tables. This simplifies database management activities and can also improve application performance. Partitioning is done according to the size of the data (if the query time is generally within two hours, the set threshold can correspond to it, so as to make a search...
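As a rough illustration of the partitioning idea mentioned in the background (not part of the patent text), the sketch below routes log records into per-interval sub-tables; the two-hour window and the helper names are hypothetical:

    from collections import defaultdict
    from datetime import datetime, timedelta

    # Hypothetical threshold: one sub-table per two-hour window, echoing the
    # "query time is generally within two hours" example above.
    WINDOW = timedelta(hours=2)
    EPOCH = datetime(1970, 1, 1)

    def sub_table_key(timestamp: datetime) -> int:
        """Map a log timestamp to the index of its two-hour partition."""
        return int((timestamp - EPOCH) // WINDOW)

    def partition_logs(logs):
        """Split an iterable of (timestamp, record) pairs into per-window sub-tables."""
        sub_tables = defaultdict(list)
        for timestamp, record in logs:
            sub_tables[sub_table_key(timestamp)].append(record)
        return sub_tables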


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F17/30, G06F9/50
Inventor: 吕成云, 唐新民, 沈智杰, 景晓军
Owner: SURFILTER NETWORK TECH