
Reinforcement learning-based edge prefetching method and system for VR video in C-RAN architecture

A reinforcement learning and video technology applied in the field of computer networks. It addresses problems such as strong dizziness and a severely degraded VR video viewing experience, with the effects of a more comfortable VR viewing experience, reduced repeated data transmission, and lower delay.

Active Publication Date: 2021-10-01
UNIV OF SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

For example, transmitting 8K panoramic VR video requires a bandwidth of more than 260 Mbps, and a more extreme experience requires more than 10 Gbps. This is a huge challenge for current networks, especially for the backbone network serving the video source server.
On the other hand, users are very sensitive to delay in VR video. Generally, a delay exceeding 20 ms induces a strong sense of dizziness, which is disastrous for the VR viewing experience.




Embodiment Construction

[0042] The following will clearly and completely describe the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without making creative efforts belong to the protection scope of the present invention.

[0043] C-RAN is a new type of radio access network architecture, as shown in Figure 1. The overall goal of C-RAN is to address the challenges (energy consumption; construction, operation, and maintenance costs; spectrum resources; etc.) that the rapid development of the mobile Internet brings to operators, and to pursue sustainable business and profit growth in the future. In the C-RAN architecture, it is assumed that the VR video source server has all the c...
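The abstract describes dynamically prefetching into a multi-level cache in C-RAN to cut delay and avoid repeated transmission. Below is a minimal sketch of such a lookup across a two-tier edge hierarchy (an RRH-side cache, then a BBU-pool cache, then the video source server). The function names, the two-tier split, and the promotion policy are illustrative assumptions, not the patent's disclosed design.

```python
# Sketch of a multi-level edge cache lookup in a C-RAN-style hierarchy.
# Tiers (nearest first): RRH cache -> BBU-pool cache -> video source server.
# All names and policies here are assumptions for illustration.

def fetch_tile(tile_id, rrh_cache, bbu_cache):
    """Return (tile_data, tier_served_from).

    Serving from a nearer tier reduces delay; caching at both tiers on a
    miss avoids re-transmitting the same data over the backhaul later.
    """
    if tile_id in rrh_cache:
        return rrh_cache[tile_id], "rrh"
    if tile_id in bbu_cache:
        # Promote to the RRH cache so the next request is served locally.
        rrh_cache[tile_id] = bbu_cache[tile_id]
        return bbu_cache[tile_id], "bbu"
    # Miss at both edge tiers: fetch from the source server (stand-in data),
    # then populate both cache levels.
    tile = f"data:{tile_id}"
    bbu_cache[tile_id] = tile
    rrh_cache[tile_id] = tile
    return tile, "origin"


# Usage: the first request travels to the origin; the repeat is served
# from the nearest cache.
rrh, bbu = {}, {}
_, first = fetch_tile("video1_tile3", rrh, bbu)
_, second = fetch_tile("video1_tile3", rrh, bbu)
print(first, second)
```

A real prefetcher would populate these caches proactively (driven by the reinforcement learning agent's decisions) rather than only on demand, but the tiered lookup is the same.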



Abstract

The invention discloses a reinforcement learning-based VR video edge prefetching method and system under a C-RAN architecture. The method includes: collecting network throughput, user request information, and cache status information in real time; determining the quality of experience (QoE) of a single user based on video quality, video temporal jitter, video spatial jitter, and delay, and predicting that user's QoE gain; determining the multi-user QoE gain from the single-user QoE gains; optimizing the multi-user QoE gain; and performing edge prefetching of VR video based on the network throughput, user request information, cache status information, and the optimized multi-user QoE gain. By dynamically prefetching into a multi-level cache in C-RAN, the invention reduces delay and repeated data transmission, thereby providing users with a more comfortable VR video viewing experience.
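The abstract names four factors in the single-user QoE model (video quality, temporal jitter, spatial jitter, delay) and an aggregation into a multi-user gain. A linear weighted sketch of that structure is below; the linear form, the weight values, and the sum aggregation are assumptions for illustration, since the page does not disclose the exact formula.

```python
# Illustrative QoE model following the structure described in the abstract.
# The linear weighted form and the weights w_* are assumptions, not the
# patent's disclosed formula.

def user_qoe(quality, temporal_jitter, spatial_jitter, delay_ms,
             w_q=1.0, w_t=0.5, w_s=0.5, w_d=0.02):
    """Single-user QoE: higher video quality raises the score; temporal
    jitter, spatial jitter, and delay each reduce it."""
    return (w_q * quality
            - w_t * temporal_jitter
            - w_s * spatial_jitter
            - w_d * delay_ms)


def multi_user_qoe_gain(per_user_gains):
    """Aggregate single-user QoE gains into the multi-user objective
    (a plain sum here; a fairness-weighted aggregate is also plausible)."""
    return sum(per_user_gains)


# Example: predicted QoE gain for one user if a candidate prefetch action
# raises quality and cuts jitter/delay, aggregated over three such users.
gain_per_user = (user_qoe(8.0, 1.0, 0.5, 15.0)
                 - user_qoe(6.0, 2.0, 1.0, 30.0))
print(multi_user_qoe_gain([gain_per_user] * 3))
```

In the described method, this gain would serve as the reward signal the reinforcement learning agent optimizes when choosing which video segments to prefetch to the edge.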

Description

Technical field

[0001] The present invention relates to the technical field of computer networks, and in particular to a reinforcement learning-based VR (Virtual Reality) video edge prefetching method and system under a C-RAN (Cloud Radio Access Network) architecture.

Background technology

[0002] With the development of VR technology, VR has gradually entered households, providing users with an immersive video viewing experience and playing an indispensable role in fields such as educational interaction, remote industrial guidance, and telemedicine. According to statistics, as of 2019 the number of VR users in China exceeded 10 million, and industry revenue from virtual reality software and hardware exceeded 1 billion yuan. It is foreseeable that virtual reality technology will flourish and expand into more application fields, providing users with a more realistic, high-definit...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): H04L29/06; H04N13/122; H04W24/02; H04W24/08
CPC: H04L65/80; H04W24/02; H04W24/08; H04N13/122
Inventors: 谭小彬, 王顺义, 徐磊, 李思敏, 杨坚, 郑烇
Owner: UNIV OF SCI & TECH OF CHINA