A method and system for video caching based on cooperation among multi-cache servers

A cache server technology, applied in transmission systems, digital transmission systems, electrical components, etc., which addresses problems such as excessive storage space occupancy across multiple systems and video storage redundancy, and achieves the effect of reducing the required cache space and improving resource utilization.

Publication Date: 2018-12-18 (Status: Inactive)
Applicant: BEIJING JIAOTONG UNIV

AI Technical Summary

Problems solved by technology

Each video has many definition (resolution) versions. In existing video caching schemes, each video version is treated as an independent item, so a video may be stored ...

Examples

Example Embodiment

[0049] Example one

[0050] The processing flow of the video caching method based on cooperation among multiple caching servers provided in this embodiment is shown in Figure 3 and includes the following processing steps:

[0051] Step S310: Construct a cache network comprising multiple base stations, where each base station is connected to a corresponding cache server with a video transcoding function, and each base station is also connected to a video source server through a wireless communication network.

[0052] Construct a cache network that includes multiple base stations. Each base station is connected to a corresponding cache server with a video transcoding function and uses that cache server to provide video resources to the mobile user terminals within its coverage area; the base stations cooperatively transmit video resources to one another. Each base station is also connected to a video source server that stores all video files. The data transmission bandwidth between each base station...
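As an illustration only (not the patented implementation), the cache network of step S310 can be modeled roughly as follows; the class names (`BaseStation`, `CacheServer`, `build_network`) and the capacity and delay values are assumed for the sketch:

```python
# Rough model of the cache network in step S310 (hypothetical names and values):
# each base station has a transcoding-capable cache server, a unit delay to the
# video source server, and a unit delay to every peer base station.
from dataclasses import dataclass, field

@dataclass
class CacheServer:
    capacity: float                                     # cache space size C_k
    cached: dict = field(default_factory=dict)          # (video i, resolution j) -> file size

@dataclass
class BaseStation:
    station_id: int
    cache: CacheServer
    delay_to_source: float                              # unit delay d_k from the video source server
    delay_to_peers: dict = field(default_factory=dict)  # peer station id -> unit delay d_kk'

def build_network(num_stations: int, capacity: float, d_source: float, d_peer: float):
    """Construct a fully connected cache network of cooperating base stations."""
    stations = [BaseStation(k, CacheServer(capacity), d_source)
                for k in range(1, num_stations + 1)]
    for s in stations:
        s.delay_to_peers = {p.station_id: d_peer for p in stations if p is not s}
    return stations

# Example: three cooperating base stations, each with 10 GB of cache space.
network = build_network(num_stations=3, capacity=10.0, d_source=5.0, d_peer=1.0)
```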

Example Embodiment

[0057] Example two

[0058] Figure 4 shows the processing flowchart of the video-transcoding-based caching method with cooperation between caching servers provided in this embodiment, which includes the following processing steps:

[0059] Step 1. Initialization. Initialize the existing caching strategy X to be empty. X is the set of per-video caching decisions, X = {x_ki = j | k ∈ N, i ∈ V, j ∈ Q ∪ {0}}, where j = 0 means that video i is not cached in cache server k, and j ≠ 0 means that cache server k caches the version of video i with resolution j. N = {1, 2, ..., K} is the set of cache servers, V = {1, 2, ..., I} is the video set, and Q = {1, 2, ..., J} is the resolution set. The cache space size of each cache server is C_k, where k ∈ N; the unit data transmission delay between any two base stations is d_kk'; the unit data transmission delay from the video source server to each base station is d_k; the size of version (i, j) of video i with resolution j is S_ij. When the user obtains the video file...
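A minimal sketch of this initialization is given below; the toy values for K, I, J, the delays, and the video sizes are placeholders, and the variable names simply mirror the notation above rather than the patented implementation:

```python
# Step 1 initialization sketch (assumed data layout and placeholder values):
# X[k][i] = j records that cache server k holds the version of video i with
# resolution j; j = 0 means video i is not cached on server k.
K, I, J = 3, 5, 4                         # number of cache servers, videos, resolutions (assumed)
N = range(1, K + 1)                       # cache server set
V = range(1, I + 1)                       # video set
Q = range(1, J + 1)                       # resolution set

X = {k: {i: 0 for i in V} for k in N}     # existing caching strategy, initially empty

C = {k: 10.0 for k in N}                  # cache space size C_k of each server (placeholder)
d_peer = {(k, kp): 1.0 for k in N for kp in N if k != kp}   # unit delay d_kk' between base stations
d_src = {k: 5.0 for k in N}               # unit delay d_k from the video source server
S = {(i, j): 0.5 * j for i in V for j in Q}                 # size S_ij of video version (i, j), placeholder
```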

Abstract

The invention provides a method and a system for video caching based on cooperation among multiple cache servers. The method comprises the following steps: according to the data transmission delay from a base station to its corresponding cache server, the data transmission delay between different base stations, and the data transmission delay from the video source server to each base station, the overall delay required to satisfy the video requests of all user terminals is calculated under the existing video caching strategy of the cache network; a video file is moved to a new candidate cache location to form a candidate video caching strategy; the overall delay corresponding to each candidate video caching strategy is compared with the overall delay corresponding to the existing video caching strategy, and the optimal video caching strategy of the cache network is determined according to the comparison results. The method comprehensively exploits the characteristics of video transcoding to reduce the required cache space, and further extends this to a mechanism in which multiple cache servers cooperate with each other, thereby effectively improving the utilization of video resources.
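The search described in the abstract can be sketched as a simple local-improvement loop; the cost model, the helper names (`overall_delay`, `improve_once`), and the omission of the cache capacity constraint are simplifying assumptions rather than the patented algorithm:

```python
# Hedged sketch of the strategy search: compute the overall delay of the current
# caching strategy, generate candidate strategies by moving one cached video
# version to another cache server, and keep whichever candidate lowers the delay.
import copy

def overall_delay(strategy, requests, d_peer, d_src, size):
    """Total delay to satisfy all (station k, video i, resolution j) requests.

    strategy[k][i] = j means server k caches video i at resolution j (0 = not cached);
    transcoding lets a server serve any resolution <= the cached one.
    """
    total = 0.0
    for k, i, j in requests:
        if strategy[k].get(i, 0) >= j:                    # served from the local cache
            continue
        peers = [kp for kp in strategy if kp != k and strategy[kp].get(i, 0) >= j]
        if peers:                                         # served by a cooperating cache server
            total += min(d_peer[(k, kp)] for kp in peers) * size[(i, j)]
        else:                                             # fetched from the video source server
            total += d_src[k] * size[(i, j)]
    return total

def improve_once(strategy, requests, d_peer, d_src, size):
    """Try every single-move candidate strategy; return the best strategy found."""
    best, best_delay = strategy, overall_delay(strategy, requests, d_peer, d_src, size)
    for k in strategy:
        for i, j in list(strategy[k].items()):
            if j == 0:
                continue
            for k_new in strategy:
                if k_new == k:
                    continue
                cand = copy.deepcopy(strategy)
                cand[k][i] = 0                                   # remove from the old location
                cand[k_new][i] = max(cand[k_new].get(i, 0), j)   # place at the candidate location
                d = overall_delay(cand, requests, d_peer, d_src, size)
                if d < best_delay:
                    best, best_delay = cand, d
    return best, best_delay
```

Repeating `improve_once` until no candidate move lowers the overall delay would yield a locally optimal caching strategy under this simplified model.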

Description

technical field

[0001] The invention relates to the technical field of resource allocation in communication networks, and in particular to a video caching method and system based on cooperation between multiple cache servers.

Background technique

[0002] In recent years, users' demand for online video viewing has increased dramatically, which has put enormous pressure on network bandwidth. A common way to alleviate this pressure is to deploy cache servers near end users to form a video caching system. In such a system, the video files requested by users can be stored in their local cache server in advance, so that users can obtain the video service through the base station connected to that cache server without paying the large communication delay cost of fetching the video from the video source server, which alleviates the bandwidth pressure on the video source server to a certain extent.

[0003] In the existing video caching solution in the network, t...

Application Information

IPC(8): H04N21/222, H04N21/231, H04N21/647, H04L12/26
Inventors: 李纯喜, 赵永祥, 赵红娜
Owner: BEIJING JIAOTONG UNIV