
System for implementing network search caching and search method

A network query caching technology, applied in special data processing applications, instruments, and electrical digital data processing. It addresses problems such as poor cache timeliness and the difficulty of keeping materialized views up to date in a network environment, achieving high timeliness and improving overall throughput and query performance.

Publication Date: 2010-03-17 (status: Inactive)
海南南海云信息技术有限公司
Cites: 3, Cited by: 0

AI Technical Summary

Problems solved by technology

The disadvantages of this approach are: 1) data objects are cached in memory and are therefore constrained by memory size, so it cannot support caching a large number of query result sets; 2) memcache organizes data as in-memory objects rather than in a relational data model, so it provides a dedicated access interface for application development and does not accept standard SQL statements.
The disadvantages of this approach are: 1) it is also constrained by memory size; 2) the replacement strategy for cached data operates on whole data tables rather than at the finer granularity of individual query requests; 3) two queries are treated as identical only when their text matches verbatim, so queries with the same conditions but different field sets are considered different requests (a limitation illustrated in the sketch below); 4) the cached data exists only on the server hosting the database, and caching across multiple node servers in a distributed network environment is not currently supported.
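As a minimal illustration of point 3) above (the cache shape and the queries below are hypothetical, not taken from the patent), a cache keyed on the verbatim query text treats requests that differ only in their field set as unrelated entries:

```python
# Hypothetical illustration: a cache keyed on verbatim query text cannot
# recognise that a narrower field set over the same condition is already
# answerable from a cached result.
cache = {}

def cached_query(sql, run_query):
    key = sql                      # the verbatim text is the cache key
    if key not in cache:
        cache[key] = run_query(sql)
    return cache[key]

# Same table, same WHERE condition, different field sets ->
# two distinct cache entries, even though q2's result is a subset of q1's.
q1 = "SELECT id, name, price FROM product WHERE category = 'book'"
q2 = "SELECT id, name FROM product WHERE category = 'book'"
```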
The disadvantages of this approach are: 1) the replacement strategy of the materialized view operates on whole data tables rather than at the finer granularity of individual query requests; 2) it is difficult to keep materialized views sufficiently up to date in a network environment.
This method can improve the efficiency of data query processing, but it does not conveniently address the following requirements: 1) sharing result data sets among the query requests of multiple users; 2) ensuring relatively high cache timeliness in a distributed network environment, where it also tends to cause periodic spikes in network load.



Examples


Embodiment Construction

[0051] The specific embodiments of the present invention will be described in further detail below with reference to the accompanying drawings.

[0052] As shown in Figure 3, according to an embodiment of the present invention, a system for implementing incremental caching of distributed queries in a network environment is provided. It comprises five main functional modules: a query binder, a query parser, a query descriptor manager, a query scheduler, and a query cacher. The function and implementation of each component are as follows:

[0053] Query binder: for distributed query requests spanning multiple data sources, virtualization methods and view technology are first used to represent the data required by the application as a virtual view. Since virtual views can be composed into new virtual views through relational operations such as join and union, the virtual views required by complex applications can be constructed...
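As a hedged sketch of how such composable virtual views might be represented (the class and method names below are illustrative assumptions; the patent does not disclose this code), each virtual view can be modelled as a named relational expression over one or more data sources, closed under join and union:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class VirtualView:
    """A virtual view: a named relational expression over remote data sources."""
    name: str
    sql: str            # relational expression rendered as SQL
    sources: List[str]  # data sources the view ultimately depends on

    def join(self, other, on):
        # Joining two virtual views yields a new virtual view.
        return VirtualView(
            name=f"{self.name}_join_{other.name}",
            sql=f"SELECT * FROM ({self.sql}) a JOIN ({other.sql}) b ON {on}",
            sources=self.sources + other.sources,
        )

    def union(self, other):
        # The union of two virtual views is again a virtual view.
        return VirtualView(
            name=f"{self.name}_union_{other.name}",
            sql=f"({self.sql}) UNION ({other.sql})",
            sources=self.sources + other.sources,
        )

# Usage: build a complex application view from simpler per-source views.
orders_cn = VirtualView("orders_cn", "SELECT * FROM db_cn.orders", ["db_cn"])
orders_us = VirtualView("orders_us", "SELECT * FROM db_us.orders", ["db_us"])
all_orders = orders_cn.union(orders_us)
customers = VirtualView("customers", "SELECT * FROM crm.customers", ["crm"])
order_view = all_orders.join(customers, on="a.customer_id = b.id")
```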


Abstract

The invention provides a caching system for realizing network query, which comprises a query binder, a query parser, a query descriptor manager, a query scheduler and a query cacher. The query binder binds the virtual view meta-information related to a query request instance into system memory and sets, in order from the lower layer to the upper layer, time marks for write operations on the virtual views; the query parser resolves the virtual views into a query tree according to the validity of the query request instance; the query descriptor manager stores a queue of the query request instances imposed on the virtual views, determines which query request instances enter and exit the queue, and judges the validity of the query request instances; the query scheduler dispatches the query nodes of the query tree to execute the query; the query cacher creates and deletes temporary tables, manages the data sets in the temporary tables, updates the time-characteristic marks of the query request instances in the query descriptors, and outputs the query results. The caching of the system is carried out on an incremental basis, thereby improving query performance and throughput, and the caching is transparent to the execution of query requests.
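As a hedged, highly simplified sketch of the flow described in the abstract (the module interfaces, field names and use of wall-clock timestamps are assumptions for illustration, not the patent's implementation), a query request instance is served from its temporary table while it remains valid, and the query tree is re-executed only when an underlying virtual view has been written more recently than the cached result:

```python
import time

class QueryDescriptorManager:
    """Tracks query request instances applied to virtual views and their validity."""
    def __init__(self):
        self.descriptors = {}    # request key -> time the cached result was refreshed
        self.view_write_ts = {}  # virtual view name -> last write timestamp

    def mark_view_write(self, view):
        self.view_write_ts[view] = time.time()

    def is_valid(self, key, views):
        # A cached request instance is valid if no underlying virtual view
        # was written after the cached result was materialized.
        cached_at = self.descriptors.get(key)
        if cached_at is None:
            return False
        return all(self.view_write_ts.get(v, 0) <= cached_at for v in views)

    def mark_refreshed(self, key):
        self.descriptors[key] = time.time()


class QueryCacher:
    """Materializes query results into temporary tables (modelled here as a dict)."""
    def __init__(self):
        self.temp_tables = {}

    def store(self, key, rows):
        self.temp_tables[key] = rows

    def fetch(self, key):
        return self.temp_tables[key]


def answer_request(key, views, execute_query_tree, manager, cacher):
    """Serve a query request instance, re-executing only when the cache is stale."""
    if not manager.is_valid(key, views):
        rows = execute_query_tree()   # the scheduler dispatches the query tree nodes
        cacher.store(key, rows)       # refresh the temporary table
        manager.mark_refreshed(key)
    return cacher.fetch(key)
```

Per the abstract, the time marks are set in sequence from lower-layer views to upper-layer views, so a composed view inherits staleness from any of its constituent views.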

Description

Technical field

[0001] The invention relates to the technical field of computer applications, and in particular to a system and a query method for implementing network query caching for database applications in a network environment.

Background technique

[0002] As the processing and computation of network application-layer business logic become more and more complex, a single query request may involve instant access to multiple distributed data sources on the network. Such large numbers of joint queries over distributed data sources are affected by factors such as network bandwidth, request load, and data volume, so access performance is often the bottleneck of this type of application. Research on performance optimization technology for this application mode has therefore always been a hot spot.

[0003] At the technical level, there are so far mainly two ways of implementing this type of distributed query caching mechanism to improve access pe...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F17/30
Inventors: 李晓林, 徐志伟, 谢毅
Owner: 海南南海云信息技术有限公司