
Cache processing method, device and system

A cache processing and caching technology, applied in the Internet field, which can solve problems such as repeated storage of the same content across network element nodes

Active Publication Date: 2019-03-26
HUAWEI TECH CO LTD

AI Technical Summary

Problems solved by technology

[0004] In the existing cache decision strategy, when a data packet passes through the network element nodes, each node caches the corresponding content in the data packet as long as the storage space of its cache area allows. This strategy causes a large amount of repeated storage of the same content across the network element nodes, which in turn wastes cache space resources.



Examples


Embodiment 3

[0176] It should be noted that in the third embodiment, the cache processing method compares the stay time of a given data content on the local node, the stay time of that data content on the next-hop node, and the stay time on the last-hop node of the data content prefix corresponding to the data content, in order to judge whether the data content can be cached in the cache area of this node. The third embodiment is described below.

[0177] Based on Figure 2, and further with respect to step 103 of Figure 2, the Interest packet also includes: the stay time of the previous-hop node.

[0178] Based on the cache processing method shown in Figure 2, step 106 specifically includes:

[0179] Query whether there is an entry corresponding to the data content prefix information in the PIT; if so, add the sending interface identifier corresponding to the data content prefix information and the stay time of the previous-hop node corresponding to the data content pr...
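The PIT update described in [0179] can be sketched as follows. This is a minimal illustration only, not the patent's implementation; the `PitEntry` structure and function names are hypothetical, and the patent's actual entry layout (e.g. how stay times are keyed per interface) is not specified in the excerpt.

```python
from dataclasses import dataclass, field

@dataclass
class PitEntry:
    # Interfaces the Interest arrived on, and the stay time reported
    # by the previous-hop node for this data content prefix.
    interfaces: list = field(default_factory=list)
    prev_hop_stay_times: list = field(default_factory=list)

def record_interest(pit: dict, prefix: str, iface: int, prev_hop_stay: float) -> PitEntry:
    """Add the sending interface identifier and the previous-hop stay time
    to the PIT entry for `prefix`, creating the entry if it is absent."""
    entry = pit.setdefault(prefix, PitEntry())
    entry.interfaces.append(iface)
    entry.prev_hop_stay_times.append(prev_hop_stay)
    return entry
```

A second Interest for the same prefix is aggregated onto the existing entry rather than creating a duplicate, which is the standard PIT behavior the paragraph assumes.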

Embodiment 4

[0214] It should be noted that in the fourth embodiment, the cache processing method judges whether the data content can be cached in the cache area of this node. The fourth embodiment is described below.

[0215] Based on the cache processing method shown in Figure 2, step 106 specifically includes:

[0216] The node queries whether there is an entry corresponding to the data content prefix information in the PIT; if so, it adds the sending interface identifier corresponding to the data content prefix information to the entry.

[0217] If not, a new entry is created, and the sending interface identifier corresponding to the data content prefix information is added to the new entry.

[0218] Query the forwarding information base (FIB) and forward the Interest packet to the next-hop node.

[0219] Specifically, as shown in Table 2 below, the node queries the PIT. If there is an entry for data content prefix information j in the PIT, it adds the sendin...

Embodiment 5

[0251] It should be noted that in the fifth embodiment, the cache processing method judges whether the data content can be cached in the cache area of this node by comparing the stay time of a given data content on this node with the stay time of the previous-hop node corresponding to the data content prefix of that data content. The fifth embodiment is described below.

[0252] Based on Figure 2, and further with respect to step 103 of Figure 2, the Interest packet also includes: the stay time of the previous-hop node.

[0253] Based on the cache processing method shown in Figure 2, step 106 specifically includes:

[0254] Query whether there is an entry corresponding to the data content prefix information in the PIT; if so, add the sending interface identifier corresponding to the data content prefix information and the stay time of the previous-hop node corresponding to the data content prefix information to the entry;

[0255] If not, create a new...


PUM

No PUM

Abstract

The present invention provides a cache processing method, device and system. The cache processing method includes: receiving a first data packet containing data content, sent by the next-hop node toward the requester; comparing the stay time of the data content on this node with its stay time on the other nodes corresponding to the data content; if the stay time on this node is the longest, caching the data content in the cache area; and sending the first data packet to the requester. The data content is thus cached according to its length of stay on different nodes, which avoids the waste of cache space resources caused by each node repeatedly storing the same data content, improves the utilization of cache space resources, enriches the diversity of the cached data content, improves the hit rate for requesters obtaining data content through this node, and reduces the total delay.
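The caching rule in the abstract reduces to a one-line predicate: cache only where the content's stay time is the longest. A minimal sketch, assuming a strict comparison (the abstract does not say how ties between nodes are broken):

```python
def should_cache(local_stay: float, other_stays: list) -> bool:
    """Cache the data content on this node only if its stay time here
    exceeds its stay time on every other node compared (sketch of the
    abstract's decision rule; tie-breaking is an assumption)."""
    return all(local_stay > t for t in other_stays)
```

Because at most one node along the path can have the strictly longest stay time, at most one node caches each data content, which is exactly how the method avoids the duplicated storage described in the background.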

Description

technical field

[0001] The present invention relates to Internet technology, and in particular to a cache processing method, device and system.

Background technique

[0002] With the rapid development of Internet technology and the rapid growth in the number of Internet users, the existing Internet, based on the Transmission Control Protocol (TCP) / Internet Protocol (IP), has gradually exposed many problems. Among them, a key development direction is to deploy caches in Internet network elements.

[0003] In the prior art, the cache decision strategy adopts ALWAYS, a strategy that caches all content passing through the buffer of a given network element node. Specifically, after each network element node receives an Interest packet containing content prefix information, the network element node queries whether the content corresponding to the content prefix information in the ...
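The ALWAYS strategy criticized in [0003] is easy to see in miniature: when a Data packet travels back along the path, every node with free space stores a copy. A toy sketch (the capacity constant and function name are illustrative, not from the patent):

```python
def forward_data_always(path_caches: list, content_name: str, content: bytes) -> int:
    """ALWAYS strategy from the background section: every node on the
    return path stores the content if its cache area has room (sketch).
    Returns how many nodes ended up holding a copy."""
    CACHE_CAPACITY = 4  # illustrative per-node limit, not from the patent
    copies = 0
    for cache in path_caches:           # one dict per network element node
        if len(cache) < CACHE_CAPACITY:
            cache[content_name] = content
            copies += 1
    return copies
```

With a three-node path, a single piece of content ends up stored three times, which is precisely the repeated storage that the longest-stay-time rule of the invention is designed to avoid.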

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): H04L12/751; H04L12/861; H04L45/02
Inventor: 王国卿, 黄韬, 刘江, 倪慧
Owner HUAWEI TECH CO LTD