
Load balancing method and device for cache server

A cache server load balancing technology, applied in the field of games. It addresses the prior-art problems of high ATS performance requirements, increased system cost, and increased response time, and achieves the effects of reducing response time, ensuring load balancing, and reducing cost.

Active Publication Date: 2021-12-14
NETEASE (HANGZHOU) NETWORK CO LTD

AI Technical Summary

Problems solved by technology

[0004] However, in the prior art, load balancing is achieved by adding an LBC layer, which places high performance requirements on the ATS and increases the cost of the system; moreover, routing requests through the LBC increases their response time.

Method used



Examples


Embodiment Construction

[0061] To make the purposes, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be described clearly and completely below in conjunction with the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present application. Based on the embodiments of this application, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the protection scope of this application.

[0062] The terms "first", "second", and the like in the specification and claims of the present disclosure are used to distinguish similar objects, not to describe a specific order or sequence.

[0063] The present disclosure provides a load balancing method for a cache server, which is applied to a data storage system. FIG. 1 is a structural diagram of a...



Abstract

The present disclosure provides a load balancing method and device for a cache server. The cache server receives a first data request sent by a client, the first data request including a source URL. When the cache server does not store the target data requested by the first data request, the cache server maps the source URL through a mapping plug-in to obtain the target source server where the target data is located, generates a target URL from the information of the target source server and the source URL, and sends a second data request, containing the target URL, to the target source server. By adding a mapping plug-in to the cache server that maps data requests to source servers, the method ensures load balancing across the source servers without adding an intermediate layer between the cache server and the source servers, thereby reducing both the response time of requests and the cost of the system.
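The abstract does not specify how the mapping plug-in chooses a target source server. A common approach consistent with the description is deterministic hashing of the source URL; the sketch below illustrates that idea only. The server names, hash choice, and function name are assumptions, not taken from the patent.

```python
import hashlib
from urllib.parse import urlparse, urlunparse

# Hypothetical pool of source (origin) servers -- names are placeholders.
SOURCE_SERVERS = [
    "source-0.example.com",
    "source-1.example.com",
    "source-2.example.com",
]

def map_to_target_url(source_url: str) -> str:
    """Rewrite a source URL into a target URL on one source server.

    Hashing the URL path makes the mapping deterministic: the same
    file always maps to the same server, while different files
    spread across the pool.
    """
    parsed = urlparse(source_url)
    digest = hashlib.md5(parsed.path.encode("utf-8")).hexdigest()
    host = SOURCE_SERVERS[int(digest, 16) % len(SOURCE_SERVERS)]
    return urlunparse(parsed._replace(netloc=host))

target = map_to_target_url("http://cdn.example.com/assets/map01.pak")
print(target)  # same path, host replaced by the chosen source server
```

On a cache miss, the cache server would then issue the second data request to `target` instead of the original URL; on a hit, no mapping is needed at all.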

Description

Technical Field

[0001] The present disclosure relates to the technical field of games, and in particular to a load balancing method and device for cache servers.

Background

[0002] The Apache Traffic Server (ATS) is a cache server with the advantages of high performance, scalability, and high modularity, and can be applied in a data storage system. The files in a data storage system are stored on multiple source servers, and a layer of ATS can be deployed in front of the source servers to cache repeatedly accessed files, so as to improve the response time for files and reduce the access pressure on the source servers.

[0003] A data storage system usually includes multiple source servers. To optimize system performance, the ATS needs to distribute the requests sent by clients evenly among the source servers. In the prior art, in order to realize load balancing of the source servers, a layer of virtual load balancing cluster (L...
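The background states that the ATS must spread client requests evenly among the source servers. The short simulation below checks that a hash-based assignment (my assumption; the patent's actual scheme is not shown in this excerpt) distributes a large number of distinct file requests roughly uniformly across a pool of servers.

```python
import hashlib
from collections import Counter

SERVERS = [f"source-{i}" for i in range(4)]  # hypothetical pool size

def pick_server(path: str) -> str:
    # Deterministic, roughly uniform choice via a cryptographic hash.
    h = int(hashlib.sha1(path.encode("utf-8")).hexdigest(), 16)
    return SERVERS[h % len(SERVERS)]

# Simulate 10,000 distinct file requests and count per-server load.
loads = Counter(pick_server(f"/files/{i}.dat") for i in range(10_000))
for server, count in sorted(loads.items()):
    print(server, count)
```

Because the choice depends only on the request path, no shared state or intermediate load balancing layer is required: every cache server computes the same mapping independently.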

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): H04L29/08; A63F13/35; A63F13/358
CPC: A63F13/35; A63F13/358; H04L67/1001; H04L67/568
Inventor: 刘光亮
Owner: NETEASE (HANGZHOU) NETWORK CO LTD