
VR video edge prefetching method and system based on reinforcement learning in C-RAN architecture

A reinforcement-learning and video technology, applied in the field of computer networks, addressing the problems of strong dizziness and a severely degraded VR video viewing experience.

Active Publication Date: 2021-01-08
UNIV OF SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

For example, transmitting 8K panoramic VR video requires a bandwidth of more than 260 Mbps, and a more immersive experience requires more than 10 Gbps, which is a huge challenge for current networks, especially the backbone network serving the video source server.
On the other hand, users are very sensitive to VR video delay: generally, a delay above 20 ms causes a strong sense of dizziness, which is disastrous for the VR viewing experience.




Embodiment Construction

[0042] The following will clearly and completely describe the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without making creative efforts belong to the protection scope of the present invention.

[0043] C-RAN is a new type of radio access network architecture, as shown in figure 1. The overall goal of C-RAN is to address the challenges (energy consumption, construction and operation/maintenance costs, spectrum resources, etc.) that the rapid development of the mobile Internet brings to operators, and to pursue sustainable business and profit growth. In the C-RAN architecture, it is assumed that the VR video source server has all the c...
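The embodiment treats prefetching as a sequential decision problem over the C-RAN's multi-level cache: observe the network and cache state, choose what to prefetch to the edge, and learn from the resulting experience-quality gain. The following is an illustrative sketch only; the state/action encoding, the reward, and all names (`PrefetchAgent`, etc.) are assumptions for exposition, not the patent's exact formulation.

```python
# Illustrative tabular Q-learning agent for edge prefetching (sketch only).
# State and action encodings are hypothetical placeholders.
import random
from collections import defaultdict

class PrefetchAgent:
    """Q-learning over simplified states such as (throughput_level, viewport_tile),
    choosing which tile quality level to prefetch into the edge cache."""

    def __init__(self, n_actions, alpha=0.1, gamma=0.9, eps=0.1):
        self.q = defaultdict(lambda: [0.0] * n_actions)  # Q-table, lazily initialized
        self.alpha, self.gamma, self.eps = alpha, gamma, eps
        self.n_actions = n_actions

    def act(self, state):
        """Epsilon-greedy action selection."""
        if random.random() < self.eps:
            return random.randrange(self.n_actions)
        values = self.q[state]
        return values.index(max(values))

    def learn(self, state, action, reward, next_state):
        """One-step Q-learning update toward reward + discounted best next value."""
        td_target = reward + self.gamma * max(self.q[next_state])
        self.q[state][action] += self.alpha * (td_target - self.q[state][action])
```

In this sketch the reward would be the multi-user experience-quality gain described in the abstract, observed after each prefetch decision.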



Abstract

The invention discloses a VR video edge prefetching method and system based on reinforcement learning in a C-RAN architecture. The method comprises the steps of: collecting network throughput, user request information and cache state information in real time; determining the experience quality of a single user based on video quality, temporal jitter, spatial jitter and delay, and predicting the single user's experience-quality gain; determining the experience-quality gains of multiple users based on the single-user gain; optimizing the multi-user experience-quality gain with a reinforcement learning algorithm; and performing edge prefetching of the VR video based on the network throughput, the user request information, the cache state information and the optimized multi-user experience-quality gain. By dynamically prefetching into the multi-level cache of the C-RAN, the method reduces delay and repeated data transmission, thereby providing users with a more comfortable VR video viewing experience.
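The abstract's per-user experience-quality model combines video quality with penalties for temporal jitter, spatial jitter and delay, and the multi-user gain aggregates per-user improvements. A minimal sketch of such a model follows; the weights, field names, and linear form are hypothetical illustrations, not the patent's actual formulas.

```python
# Hypothetical sketch of the QoE model outlined in the abstract.
# All weights (w_*) and field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Playback:
    quality: float          # perceptual quality of delivered tiles
    temporal_jitter: float  # quality variation across consecutive segments
    spatial_jitter: float   # quality variation across tiles in the viewport
    delay_ms: float         # end-to-end delivery delay

def single_user_qoe(p, w_quality=1.0, w_tjit=0.5, w_sjit=0.5, w_delay=0.01):
    """Weighted QoE: reward quality, penalize jitter and delay."""
    return (w_quality * p.quality
            - w_tjit * p.temporal_jitter
            - w_sjit * p.spatial_jitter
            - w_delay * p.delay_ms)

def multi_user_qoe_gain(before, after):
    """Multi-user gain: sum of per-user QoE improvements."""
    return sum(single_user_qoe(a) - single_user_qoe(b)
               for b, a in zip(before, after))
```

A prefetch that removes jitter and halves delay for one user would then yield a positive gain, which the reinforcement learner uses as its optimization target.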

Description

technical field [0001] The present invention relates to the technical field of computer networks, in particular to a VR (Virtual Reality) video edge prefetching method and system based on reinforcement learning in a C-RAN (Cloud Radio Access Network) architecture. Background technique [0002] With the development of VR technology, VR has gradually entered thousands of households, providing users with an immersive video viewing experience and playing an indispensable role in fields such as educational interaction, remote industrial guidance, and telemedicine. According to statistics, as of 2019 the number of VR users in China exceeded 10 million, and the industry revenue of virtual reality software and hardware exceeded 1 billion yuan. It is foreseeable that virtual reality technology will flourish and expand into more application fields, providing users with a more realistic, high-definit...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): H04L29/06; H04N13/122; H04W24/02; H04W24/08
CPC: H04L65/80; H04W24/02; H04W24/08; H04N13/122
Inventors: 谭小彬, 王顺义, 徐磊, 李思敏, 杨坚, 郑烇
Owner: UNIV OF SCI & TECH OF CHINA