
A virtual cache sharing method and system

A virtual cache sharing method and system in the field of virtual caching and cache technology. It solves the problem that cached content cannot be shared across virtual networks, and is simple and convenient to realize.

Active Publication Date: 2020-06-16
北京中科海网科技有限公司


Problems solved by technology

[0006] The purpose of the present invention is to solve the problem that cached content on a single network device cannot be shared among multiple virtual networks.




Embodiment

[0116] As shown in Figure 7, two virtual networks are formed on this network: one virtual network supports NDN (Named Data Networking), and the other supports MF (MobilityFirst).
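The core effect of this scenario — two virtual networks on one device resolving content against a single shared cache — can be sketched as follows. This is an illustrative sketch only; `shared_cache` and `handle_request` are hypothetical names, and real NDN and MF requests use protocol-specific naming (NDN hierarchical names vs. MF GUIDs) that is glossed over here.

```python
# Hypothetical sketch: two virtual networks (NDN and MF) formed on one
# network device share a single cache, so content cached via one
# protocol can be served to a request arriving via the other.

shared_cache = {}   # one cache on the network device, keyed by content name

def handle_request(virtual_network: str, content_name: str):
    # Both virtual networks consult the same shared cache;
    # which virtual network issued the request does not matter.
    return shared_cache.get(content_name)

# The NDN virtual network caches a content object ...
shared_cache["/publisherA/content/photo1"] = b"<photo bytes>"

# ... and the MF virtual network's request for the same content is a hit.
assert handle_request("MF", "/publisherA/content/photo1") == b"<photo bytes>"
```

Without the shared cache, each virtual network would keep its own copy on the same physical device, which is exactly the duplication the invention aims to remove.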

[0117] In this scenario, with reference to Figure 6, the cache interfaces of device 1, device 2, and device 3 are registered, and the flow entries shown in Figure 5 are installed. The present invention is described below in conjunction with this scenario.

[0118] Assume that an NDN content request packet arrives at device 1 with the content name "/publisherA/content/photo1". According to the flow entry of Figure 5, device 1 sends the content request packet to the storage device through the cache interface.

[0119] The storage device processes the packet according to Figure 2. Assume that "/publisherA/content/photo1" is hashed to hash-name="12345" (a short value shown for convenience of exposition). After searching locally, it finds the file named "12345" and reads the content o...
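The hashed-name lookup described in paragraph [0119] can be sketched as below. This is an illustrative sketch, not the patent's implementation: it assumes SHA-256 as the hash function (the patent text only shows a placeholder value "12345") and a flat directory as the local store; `hash_name`, `store_content`, and `lookup_content` are hypothetical names.

```python
import hashlib
import os
from typing import Optional

def hash_name(content_name: str) -> str:
    # Map a variable-length content name to a fixed-length file name.
    # SHA-256 is an assumption; the patent only shows a placeholder hash.
    return hashlib.sha256(content_name.encode("utf-8")).hexdigest()

def store_content(store_dir: str, content_name: str, data: bytes) -> None:
    # Store content under its hashed name in the local store.
    os.makedirs(store_dir, exist_ok=True)
    with open(os.path.join(store_dir, hash_name(content_name)), "wb") as f:
        f.write(data)

def lookup_content(store_dir: str, content_name: str) -> Optional[bytes]:
    # Search the local store for the hashed name; None means a cache miss.
    path = os.path.join(store_dir, hash_name(content_name))
    if not os.path.exists(path):
        return None
    with open(path, "rb") as f:
        return f.read()
```

Because the file name depends only on the content name, a request for the same content arriving from any virtual network resolves to the same stored file, which is what makes the cache shareable.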



Abstract

A virtual cache sharing method involving a controller, a network device, and a storage device connected to the network device, comprising: step 1) the network device, through an extension of the OpenFlow protocol, registers with the controller the interface connected to the specified virtual cache; step 2) the controller configures the network device so that it sends packets accessing the virtual cache to the virtual cache interface; step 3) after the storage device receives a packet accessing the virtual cache from the virtual cache interface, it performs content-request or data-storage processing on the packet. The present invention also proposes a virtual cache sharing system, comprising a cache interface registration module, a configuration module, and a content request or data processing module; the cache interface registration module is set on the network device, the configuration module is set on the controller, and the content request and data processing modules are set on the storage device. The invention realizes cache content sharing on a single network device among multiple virtual networks.
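The three steps of the abstract (cache-interface registration, controller configuration, and storage-device processing) can be sketched as a minimal in-memory simulation. All class names, method names, and the flow-entry dictionary format are assumptions for illustration; they are not taken from the patent or from the OpenFlow specification.

```python
# Illustrative sketch of the abstract's three steps; not the patent's
# implementation. Flow entries are modeled as plain dicts.

class Controller:
    def __init__(self):
        self.cache_ports = {}  # device id -> registered cache port

    def register_cache_port(self, device_id, port):
        # Step 1: a network device registers its virtual-cache interface.
        self.cache_ports[device_id] = port

    def configure(self, device):
        # Step 2: install a flow entry so that packets accessing the
        # virtual cache are sent out the registered cache port.
        device.flow_table.append(
            {"match": "cache-access",
             "out_port": self.cache_ports[device.device_id]}
        )

class StorageDevice:
    def __init__(self):
        self.store = {}  # shared cache, keyed by content name

    def process(self, packet):
        # Step 3: content-request or data-storage processing.
        if packet["type"] == "request":
            return self.store.get(packet["name"])    # None on cache miss
        self.store[packet["name"]] = packet["data"]  # data-storage path
        return None

class NetworkDevice:
    def __init__(self, device_id, controller, storage):
        self.device_id, self.storage = device_id, storage
        self.flow_table = []
        controller.register_cache_port(device_id, port=1)  # Step 1

    def handle(self, packet):
        # Forward cache-access packets according to the installed flow entry.
        for entry in self.flow_table:
            if entry["match"] == "cache-access":
                return self.storage.process(packet)
        return None  # no flow entry installed yet
```

Usage follows the abstract's order: construct the devices (registration happens in `NetworkDevice.__init__`), have the controller configure the device, then push and request content through the device.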

Description

Technical Field

[0001] The invention relates to the field of multi-network cache sharing in a virtual network environment, and in particular to a virtual cache sharing method and system.

Background Technique

[0002] Caching is a traditional technique for speeding up data access. During the development of computers, in order to resolve the mismatch between high-speed CPU operation and low-speed I/O devices, researchers designed first-level and second-level caches, which solved this problem through prefetch and write-back operations. In a distributed system, such as a distributed file system, the data storage location and the access location do not coincide; in essence this is still a speed-mismatch problem, and caching technology is used almost universally. The Internet, as the culmination of distributed systems, has gone through multiple stages from its birth through academic use to commercial operation, and the applic...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): H04L29/08
CPC: H04L67/568
Inventors: 王玲芳, 王劲林, 曾理
Owner: 北京中科海网科技有限公司