
A load balancing method and system based on double-layer cache

A load-balancing and caching technology, applied in transmission systems, digital transmission systems, data exchange networks, etc. It addresses problems such as wasted resources, increased costs, and unaffordable hardware, and achieves the effects of reducing request pressure, lowering investment costs, and reducing the number of background servers required.

Active Publication Date: 2018-07-31
INSPUR BEIJING ELECTRONICS INFORMATION IND

AI Technical Summary

Problems solved by technology

[0002] As the performance and reliability requirements of web servers continue to grow, more and more enterprises use load balancers to enhance the concurrency of their web services and improve system reliability by building parallel clusters, thereby increasing the computing resources available to the enterprise's web services. However, blindly increasing the number of background application servers not only raises costs for the enterprise but also wastes resources.
[0003] Most cache media are memory caches. The price of a 16 GB memory stick is already high, memory prices grow steeply with capacity, and the web server's motherboard limits the maximum memory capacity. It is therefore difficult to enhance a web server's concurrency solely by increasing its memory capacity, and the high price is often unaffordable.

Method used




Embodiment Construction

[0016] Hereinafter, the present invention will be described in detail with reference to the drawings and examples. It should be noted that, in the case of no conflict, the embodiments in the present application and the features in the embodiments can be combined with each other.

[0017] Figure 1 shows the flow chart of the load balancing method based on double-layer cache in Embodiment 1 of the present invention, which includes the following steps:

[0018] Step 101: Pre-setting a memory management module and a disk management module in the web server and setting parameters for the memory management module and the disk management module;

[0019] Set the following parameters for the memory management module:

[0020] 1. Set the connection timeout, send timeout, read timeout, cache invalidation time;

[0021] 2. Set the access request mode, so that the module directly accepts only internal access and does not directly receive external requests.

[0022] E.g:

[0023] memc_connect_timeout...
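The parameter settings in steps 1 and 2 above can be sketched as a small memory-cache layer honouring a configurable invalidation time. This is an illustrative assumption, not the patent's actual implementation: all class and field names here (MemoryCacheConfig, MemoryCache, invalidation_s, etc.) are hypothetical.

```python
import time
from dataclasses import dataclass, field


@dataclass
class MemoryCacheConfig:
    """Hypothetical parameters mirroring steps 1 and 2 of Embodiment 1."""
    connect_timeout_s: float = 1.0   # connection timeout
    send_timeout_s: float = 1.0      # send timeout
    read_timeout_s: float = 1.0      # read timeout
    invalidation_s: float = 60.0     # cache invalidation (expiry) time
    internal_only: bool = True       # accept only internal access


class MemoryCache:
    """Simplified memory management module that expires stale entries."""

    def __init__(self, config: MemoryCacheConfig):
        self.config = config
        self._store: dict = {}  # key -> (value, stored_at)

    def get(self, key, now=None):
        """Return the cached value, or None if absent or expired."""
        now = time.monotonic() if now is None else now
        entry = self._store.get(key)
        if entry is None:
            return None
        value, stored_at = entry
        if now - stored_at > self.config.invalidation_s:
            del self._store[key]  # entry expired: invalidate it
            return None
        return value

    def put(self, key, value, now=None):
        """Store a value along with its insertion timestamp."""
        now = time.monotonic() if now is None else now
        self._store[key] = (value, now)
```

A caller would construct the config once at startup and share the cache across request handlers; the `now` parameter exists only to make expiry deterministic in tests.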



Abstract

The present invention provides a load balancing method and system based on a double-layer cache. The method includes the following steps: pre-installing a memory management module and a disk management module in the web server and setting parameters for both modules; after the web server receives a terminal request, querying the memory management module to obtain the requested data and feeding it back to the terminal; if that query fails, querying the disk management module to obtain the requested data and feeding it back to the terminal. The invention increases the probability of responding to end-user requests at the network layer, reduces the scheduling of requests to background application server clusters, reduces the request pressure on background servers, and at the same time reduces the number of background application servers required, thereby reducing investment costs.
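The two-layer lookup described in the abstract (memory first, disk on a memory miss, backend cluster only when both layers miss) can be sketched as follows. This is a minimal illustration under assumed interfaces, not the patented implementation; the function and parameter names are hypothetical, and plain dicts stand in for the two cache modules.

```python
def lookup(key, memory_cache, disk_cache, backend):
    """Double-layer cache lookup.

    Tries the memory layer, then the disk layer, and only forwards to the
    background application servers when both layers miss; a hit at a slower
    layer is promoted so later requests are answered from faster storage.
    """
    # Layer 1: memory cache (fastest).
    value = memory_cache.get(key)
    if value is not None:
        return value

    # Layer 2: disk cache.
    value = disk_cache.get(key)
    if value is not None:
        memory_cache[key] = value  # promote to the memory layer
        return value

    # Miss in both layers: schedule the request to the backend cluster,
    # then populate both cache layers on the way back.
    value = backend(key)
    disk_cache[key] = value
    memory_cache[key] = value
    return value
```

Because backend calls happen only on a double miss, the backend request rate drops as the caches warm up, which is the stated goal of reducing pressure on the background application servers.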

Description

technical field [0001] The invention belongs to the field of load balancing, and in particular relates to a load balancing method and system based on a double-layer cache. Background technique [0002] As the performance and reliability requirements of web servers continue to grow, more and more enterprises use load balancers to enhance the concurrency of their web services and improve system reliability by building parallel clusters, thereby increasing the computing resources available to the enterprise's web services. However, blindly increasing the number of background application servers not only raises costs for the enterprise but also wastes resources. [0003] Most cache media are memory caches. The price of a 16 GB memory stick is already high, memory prices grow steeply with capacity, and the web server's motherboard limits the maximum memory capaci...

Claims


Application Information

Patent Timeline
Patent Type & Authority Patents(China)
IPC IPC(8): H04L12/803
Inventor 李有超 (Li Youchao), 王渭巍 (Wang Weiwei)
Owner INSPUR BEIJING ELECTRONICS INFORMATION IND