Index caching method and system based on off-heap memory

A cache-index and off-heap-cache technology, applied in special-purpose data processing applications, instruments, and electrical digital data processing, achieving the effect of eliminating the disk IO bottleneck and reducing performance bottlenecks

Active Publication Date: 2017-08-18
HUNAN ANTVISION SOFTWARE


Problems solved by technology

[0003] The purpose of the present invention is to provide an off-heap-memory-based index caching method and system, aiming to overcome the deficiencies of the prior art described in the background above.




Detailed Description of the Embodiments

[0020] In order to make the object, technical solution and advantages of the present invention clearer, the present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit the present invention.

[0021] Referring to figure 1, the first embodiment of the present invention discloses an off-heap-memory-based index caching method. When the off-heap memory capacity is large enough, both reads and writes take place in memory, so neither interacts with the disk, which greatly reduces the performance bottleneck caused by disk IO. The method includes:

[0022] S1. When Lucene is in the startup state, allocate off-heap memory of a specified size for the index data, place it into a memory pool, and then preheat the off-heap cache index;
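Step S1 can be sketched with plain JDK facilities: `ByteBuffer.allocateDirect` reserves memory outside the JVM heap, and allocating a fixed set of blocks up front ("preheating" the pool) means later index writes never pay allocation cost or GC pressure. The class and method names below are hypothetical illustrations, not the patent's actual implementation.

```java
import java.nio.ByteBuffer;
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical sketch of S1: a pool of direct (off-heap) buffers,
// allocated once at startup so the indexing path never allocates.
public class OffHeapIndexPool {
    private final Deque<ByteBuffer> pool = new ArrayDeque<>();
    private final int blockSize;

    public OffHeapIndexPool(int blockCount, int blockSize) {
        this.blockSize = blockSize;
        for (int i = 0; i < blockCount; i++) {
            // allocateDirect reserves memory outside the JVM heap,
            // so large index data does not inflate GC pause times.
            pool.push(ByteBuffer.allocateDirect(blockSize));
        }
    }

    // Hand out a preheated block; fall back to a fresh allocation
    // only if the pool is exhausted.
    public ByteBuffer acquire() {
        ByteBuffer b = pool.poll();
        return b != null ? b : ByteBuffer.allocateDirect(blockSize);
    }

    // Return a block to the pool for reuse instead of freeing it.
    public void release(ByteBuffer b) {
        b.clear();
        pool.push(b);
    }

    public int available() { return pool.size(); }
}
```

Note that direct memory is capped by the JVM's max-direct-memory setting rather than the heap size, which is why the method checks capacity before writing (step S2).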

...



Abstract

The invention discloses an index caching method and system based on off-heap memory. The method comprises: when Lucene is in the startup state, allocating off-heap memory of a designated size for the index data, placing it into a memory pool, and preheating the off-heap cache index; when Lucene is in the indexing state, checking the index capacity of the off-heap memory, and if the capacity reaches the required value, opening an output stream on the off-heap memory index to write the index data; and when Lucene is in the searching state, checking whether the index data to be read currently exists in the off-heap memory index, and if so, opening an input stream on the off-heap memory index to read the index data. The system corresponds to the method. The method and system greatly reduce the performance bottleneck brought by disk IO while still allowing data to be persisted, and avoid the real-time write delays that arise when the disk IO bottleneck slows down index merging.

Description

technical field

[0001] The invention belongs to the technical field of computer information storage and indexing, and in particular relates to a method and system for implementing cache indexing for Apache Lucene based on off-heap memory.

Background technique

[0002] Lucene's indexes fall into three categories: heap-memory indexes, file system indexes, and HDFS indexes. A heap-memory index is implemented on the JVM heap and never interacts with the disk, so it has no disk IO performance bottleneck; however, the data cannot be persisted, there is a risk of data loss, and when the data volume grows too large, garbage collection becomes time-consuming and causes serious performance problems. File system indexes and HDFS indexes can persist data, but in environments with data volumes in the hundreds of millions they suffer from a disk IO performance bottleneck.

Contents of the invention

[0003] The purpose of the present invention is to provide a method and system for cache indexing based on off-heap memo...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F17/30
CPC: G06F16/134; G06F16/172; G06F16/182
Inventors: 何小成, 黄三伟
Owner: HUNAN ANTVISION SOFTWARE