Transparent caching traffic optimization method, load balancer and storage medium

A load balancer and traffic optimization technology, applied in the network field. It addresses problems such as large response delay and slow download rate, achieving the effects of reduced response delay, increased download rate, and lower resource consumption.

Inactive Publication Date: 2019-04-02
ZTE CORP


Problems solved by technology

[0020] In view of the above-mentioned reasons, and in order to overcome the above-mentioned defects, the technical problem to be solved by the present invention is to provide a traffic optimization method for transparent caching, a load balancer and a storage medium, so as to solve the problems of long response delay and slow download rate for users' cacheable service requests.



Examples


Embodiment 1

[0037] As shown in Figure 2, an embodiment of the present invention provides a transparent caching traffic optimization method, used in a load balancer in a transparent caching system in a cloud / virtualized environment, including:

[0038] S101: When receiving a request from a user to access a website (that is, an HTTP request), determine the business nature of the access request;

[0039] S102: Adjust the transmission path of the business traffic according to the business nature of the access request; the business traffic includes the access request and file data of the website in response to the access request.

[0040] In the embodiment of the present invention, when the load balancer receives a user request, it adjusts the transmission path of the service traffic according to the service nature of the request. This avoids the existing mode in which all user requests are forwarded to the cache server, thereby solving the problem of the traditional mode that the res...
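The two steps above (S101: classify the request; S102: select the transmission path) can be illustrated with a minimal sketch. This is not the patent's implementation; the classification rule (matching cacheable file types by URL suffix) and the names `is_cacheable` and `route` are illustrative assumptions.

```python
# Hypothetical sketch of the S101/S102 logic: classify the "business nature"
# of an HTTP request and pick a transmission path for the service traffic.
# The suffix-based rule below is an assumption, not the patent's method.

CACHEABLE_SUFFIXES = {".mp4", ".flv", ".jpg", ".png", ".zip", ".apk"}


def is_cacheable(url_path: str) -> bool:
    """S101: decide whether the requested content is cacheable."""
    return any(url_path.lower().endswith(s) for s in CACHEABLE_SUFFIXES)


def route(url_path: str) -> str:
    """S102: choose the transmission path for the service traffic."""
    if is_cacheable(url_path):
        # Cacheable content is served via the transparent cache tier.
        return "cache-server"
    # Dynamic or uncacheable requests bypass the cache and go to origin.
    return "origin-direct"


print(route("/videos/movie.mp4"))  # cache-server
print(route("/api/login"))         # origin-direct
```

Only requests classified as cacheable traverse the cache server; the rest go straight to the origin site, which is the path adjustment the embodiment describes.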

Embodiment 2

[0101] As shown in Figure 4, an embodiment of the present invention provides a load balancer in a transparent caching system in a cloud / virtualization environment. The load balancer includes a memory 10 and a processor 12; the memory 10 stores a traffic optimization computer program, and the processor 12 executes the computer program to implement the following steps:

[0102] When receiving a request from a user to visit a website, determine the business nature of the request;

[0103] According to the business nature of the request, the transmission path of the business traffic is adjusted; the business traffic includes the request and the file data of the website in response to the request.

[0104] In the embodiment of the present invention, when the load balancer receives a user request, it adjusts the transmission path of the service traffic according to the service nature of the request, avoiding the existing mode in which all user requests are forwarded to the cache server,...

Embodiment 3

[0120] An embodiment of the present invention provides a computer-readable storage medium that stores a transparent-caching traffic optimization computer program for a cloud / virtualization environment. When the computer program is executed by at least one processor, it implements the steps of any one of the methods in Embodiment 1.

[0121] The computer-readable storage medium in the embodiment of the present invention may be RAM, flash memory, ROM, EPROM, EEPROM, a register, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium known in the art. The storage medium may be coupled to the processor so that the processor can read information from, and write information to, the storage medium; alternatively, the storage medium may be a component of the processor. The processor and the storage medium may be located in an application-specific integrated circuit.

[0122] It should be noted here that the second embodiment and t...


Abstract

The invention discloses a transparent caching traffic optimization method, a load balancer and a storage medium, so as to solve the problems of large response delay and slow download rate for a user's cacheable service requests during transparent caching in a cloud / virtualization environment. The method is used for the load balancer in the transparent caching system under the cloud / virtualization environment, and comprises the steps: when a request from a user visiting a website is received, the service nature of the request is determined; and according to the service nature of the request, the transmission path of the service traffic is adjusted, wherein the service traffic comprises the request and the file data with which the website responds to the request.

Description

Technical field

[0001] The present invention relates to the field of network technology, in particular to a method for optimizing transparent caching traffic in a cloud / virtualized environment, a load balancer and a storage medium.

Background technique

[0002] In the traditional mode, transparent caching is implemented based on hardware load balancers (SLB, Service Load Balance), cache servers (Cache) and routers (or switches). Traffic optimization on the load balancer is constrained by the hardware, and it is impossible to flexibly optimize at the business level according to business characteristics. In the cloud / virtualization scenario, all related network elements, such as load balancers, cache servers, networks and storage, adopt virtualization technology, and the network element functions are implemented in software.

[0003] As shown in Figure 1, the existing transparent caching business process is as follows:

[0004] Step 1. The user's HTTP request ent...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC(8): H04L12/803
CPC: H04L47/125; H04L67/1014; H04L67/63; H04L67/568; H04W28/14; H04L12/00
Inventor: 李奎, 史美康, 尹芹, 张宇, 陈伟
Owner ZTE CORP