System for implementing network search caching and search method

A network query caching technology, applied in special data processing applications, instruments, electrical digital data processing, etc. It addresses problems such as the difficulty of maintaining high cache timeliness, the difficulty of keeping materialized views up to date in a network environment, and the inability to support caching of large query result sets, in order to achieve high timeliness and improve overall throughput and performance.

Status: Inactive · Publication Date: 2008-12-24
海南南海云信息技术有限公司

AI Technical Summary

Problems solved by technology

The disadvantages of this approach are: 1) data objects are cached in memory and are therefore constrained by memory size, so it cannot support caching of large numbers of query result sets; 2) memcache organizes data as in-memory objects rather than in a relational data model, so application development must go through a dedicated access interface and standard SQL statements are not accepted.
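As an illustration of these two points, the following is a minimal, hypothetical sketch (not taken from the patent) of a memcache-style object cache: the entire result set is stored as one serialized object keyed by the literal query text, and it is reached through a dedicated get/set interface rather than SQL. The ObjectCache class, cached_query helper, and db_execute parameter are illustrative assumptions.

    # Illustrative sketch (not from the patent): caching a whole query result set
    # as an in-memory object keyed by the literal query text, memcache-style.
    import hashlib
    import pickle

    class ObjectCache:
        """In-memory object cache with a dedicated get/set interface (no SQL)."""
        def __init__(self):
            self._store = {}                         # bounded in practice by RAM size

        def get(self, key):
            return self._store.get(key)

        def set(self, key, value):
            self._store[key] = pickle.dumps(value)   # whole result set as one object

    def cached_query(cache, db_execute, sql):
        key = hashlib.md5(sql.encode()).hexdigest()  # key derived from query text
        hit = cache.get(key)
        if hit is not None:
            return pickle.loads(hit)                 # returned as an object, not via SQL
        rows = db_execute(sql)                       # fall through to the data source
        cache.set(key, rows)
        return rows

Because the cache only maps an opaque key to an opaque object, a different SQL statement over the same underlying data cannot be answered from the cached entry.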
The disadvantages of this approach are: 1) it is likewise constrained by memory size; 2) the replacement strategy for cached data operates on whole data tables rather than at the finer granularity of individual query requests; 3) queries are matched only if their text is identical verbatim, so queries with the same conditions but different field sets are treated as different requests; 4) the cached data exists only on the server hosting the database, and caching across multiple server nodes in a distributed network environment is currently not supported.
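A minimal sketch of points 2) and 3), assuming a hypothetical server-side query cache: entries are keyed by the verbatim SQL text, so two queries with identical conditions but different field sets miss each other, and any write to a table drops every cached entry that touches that table. All class and method names below are assumptions for illustration.

    # Illustrative sketch (hypothetical): a server-side query cache that matches
    # queries verbatim and invalidates at whole-table granularity.
    class ServerQueryCache:
        def __init__(self):
            self._entries = {}        # verbatim SQL text -> result rows
            self._by_table = {}       # table name -> set of cached SQL texts

        def lookup(self, sql):
            return self._entries.get(sql)          # only an identical string can hit

        def store(self, sql, tables, rows):
            self._entries[sql] = rows
            for t in tables:
                self._by_table.setdefault(t, set()).add(sql)

        def on_write(self, table):
            # Any write invalidates every query on the table, not just affected rows.
            for sql in self._by_table.pop(table, set()):
                self._entries.pop(sql, None)

    cache = ServerQueryCache()
    cache.store("SELECT a, b FROM t WHERE x = 1", ["t"], [("a1", "b1")])
    assert cache.lookup("SELECT a FROM t WHERE x = 1") is None   # same condition, different fields
    cache.on_write("t")                                          # drops all entries on table t
    assert cache.lookup("SELECT a, b FROM t WHERE x = 1") is None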
The disadvantages of this approach are: 1) the replacement strategy for materialized views operates on whole data tables rather than at the finer granularity of individual query requests; 2) it is difficult to ensure relatively high timeliness of materialized views in a network environment.
This method can improve the efficiency of data query processing, but it does not conveniently address the following requirements: 1) sharing of result data sets among the query requests of multiple users; 2) ensuring relatively high cache timeliness in a distributed network environment, while the periodic refreshes it relies on can easily impose periodic high load on the network.




Embodiment Construction

[0051] Specific embodiments of the present invention are described in further detail below in conjunction with the accompanying drawings.

[0052] As shown in Figure 3, according to an embodiment of the present invention, a system for implementing incremental caching of distributed queries in a network environment is provided, comprising five main functional modules: a query binder, a query parser, a query descriptor manager, a query scheduler, and a query cacher. The functions and implementation of each component are as follows:
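The following skeleton shows one possible way to organize the five modules as Python classes. It is a sketch of the roles described above under assumed names and signatures; it is not the patent's actual interface.

    # Hypothetical skeleton of the five functional modules described above.
    # Names and signatures are illustrative assumptions, not the patent's API.
    import time

    class QueryBinder:
        """Binds virtual-view meta-information into memory and stamps write times."""
        def bind(self, view_meta):
            view_meta["last_write"] = time.time()
            return view_meta

    class QueryParser:
        """Parses a virtual view into a query tree, given request-instance validity."""
        def parse(self, view_meta, is_valid):
            return {"node": view_meta["name"], "use_cache": is_valid, "children": []}

    class QueryDescriptorManager:
        """Keeps the queue of request instances per view and judges their validity."""
        def __init__(self):
            self.queue = []
        def admit(self, descriptor):
            self.queue.append(descriptor)
        def is_valid(self, descriptor, view_meta):
            return descriptor["refreshed_at"] >= view_meta["last_write"]

    class QueryScheduler:
        """Dispatches the nodes of the query tree for execution."""
        def schedule(self, tree, executor):
            return executor(tree)

    class QueryCacher:
        """Manages temporary tables holding cached result sets."""
        def __init__(self):
            self.temp_tables = {}
        def materialize(self, name, rows, descriptor):
            self.temp_tables[name] = rows
            descriptor["refreshed_at"] = time.time()
            return rows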

[0053] Query binder: for distributed, multi-data-source query requests, virtualization methods and view technology are first used to represent the data required by the application as a virtual view. Because virtual views can be composed into new virtual views through relational operations such as join and union, it is possible to construct...
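As a hypothetical illustration of how virtual views might be composed by join and union into new virtual views (the VirtualView class and its operators below are assumptions, not the patent's implementation):

    # Illustrative sketch: virtual views composed from base relations by join/union.
    class VirtualView:
        def __init__(self, name, sources, definition_sql):
            self.name = name                  # view name visible to the application
            self.sources = sources            # underlying data sources / views
            self.definition_sql = definition_sql

        def join(self, other, on):
            return VirtualView(
                f"{self.name}_join_{other.name}",
                self.sources + other.sources,
                f"SELECT * FROM ({self.definition_sql}) a "
                f"JOIN ({other.definition_sql}) b ON {on}",
            )

        def union(self, other):
            return VirtualView(
                f"{self.name}_union_{other.name}",
                self.sources + other.sources,
                f"({self.definition_sql}) UNION ({other.definition_sql})",
            )

    orders = VirtualView("orders_view", ["db1"], "SELECT id, cust_id FROM orders")
    custs  = VirtualView("cust_view",   ["db2"], "SELECT id, name FROM customers")
    combined = orders.join(custs, on="a.cust_id = b.id")   # a new virtual view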



Abstract

The invention provides a caching system for realizing network queries, which comprises a query binder, a query parser, a query descriptor manager, a query scheduler and a query cacher. The query binder is used for binding the virtual-view meta-information relating to query request instances into system memory and for setting, in order from the lower layer to the upper layer, the time marks of write operations on the virtual views; the query parser is used for parsing a virtual view into a query tree according to the validity of the query request instances; the query descriptor manager is used for storing the queue of query request instances applied to the virtual views, deciding which query request instances enter and leave the queue, and judging the validity of query request instances; the query scheduler is used for dispatching the query nodes of the query tree so as to execute the query; the query cacher is used for creating and deleting temporary tables, managing the data sets in the temporary tables, updating the time characteristic marks of query request instances in the query descriptors, and outputting query results. Caching in the system of the invention is carried out incrementally, thereby improving query performance and throughput, and providing transparent support for the execution of query requests.
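To make the increment-based caching idea concrete, here is a minimal, hypothetical sketch of how the time mark of a request instance could be compared against the write-time mark of a virtual view, so that only rows written since the last refresh are fetched and merged into the temporary table. The function, field names, and the fetch_rows_since helper are all assumptions for illustration.

    # Hypothetical sketch of increment-based refresh using time marks.
    def refresh_incrementally(descriptor, view, temp_table, fetch_rows_since):
        """descriptor: per-request-instance state; view: virtual-view meta-info."""
        if descriptor["refreshed_at"] >= view["last_write"]:
            return temp_table                            # still valid: serve from cache
        # Only rows written after the last refresh are pulled from the data source.
        new_rows = fetch_rows_since(view["name"], descriptor["refreshed_at"])
        temp_table.extend(new_rows)                      # merge the increment
        descriptor["refreshed_at"] = view["last_write"]  # advance the time mark
        return temp_table

    # Usage with toy data:
    view = {"name": "orders_view", "last_write": 200.0}
    descriptor = {"refreshed_at": 100.0}
    cached = [("row1",)]
    rows = refresh_incrementally(
        descriptor, view, cached,
        fetch_rows_since=lambda name, since: [("row2",)],  # stand-in data source
    )
    assert rows == [("row1",), ("row2",)]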

Description

Technical field
[0001] The invention relates to the technical field of computer applications, and in particular to a system and a query method for realizing a network query cache for database applications in a network environment.
Background art
[0002] As the processing and computation of business logic in the network application layer become increasingly complex, a single query request may involve immediate access to multiple distributed data sources on the network. Affected by factors such as network bandwidth, request load, and data volume, access performance is often the bottleneck of this type of application. Research on performance optimization techniques for this application mode has therefore long been a hot topic.
[0003] From the perspective of technical methods, there have so far been two main approaches to implementing this type of distributed query caching mechanism to improve access performance.
[0004] The first approach is to create a cache in memory, and rea...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F17/30
Inventors: 李晓林, 徐志伟, 谢毅
Owner: 海南南海云信息技术有限公司