Virtual reality video quality evaluation method based on double-flow convolutional neural network

A method applying convolutional neural networks and virtual reality technology to the field of virtual reality video quality evaluation. It addresses the lack of a normative standard and objective evaluation system for VR video, simplifies the feature-extraction process through a simple preprocessing method, and keeps time consumption low.

Active Publication Date: 2018-07-06
TIANJIN UNIV


Problems solved by technology

[0003] Since virtual reality technology has emerged only in recent years, there is not yet a normative standard or objective evaluation system for VR video [2].




Embodiment Construction

[0016] A virtual reality video quality evaluation method based on a two-stream convolutional neural network. Each distorted VR video pair consists of a left video V_l and a right video V_r. The evaluation method includes the following steps:

[0017] Step 1: Construct the difference video V_d according to the principle of stereo perception. First, convert each frame of the original VR video and of the distorted VR video to grayscale, then obtain the required difference video from the left video V_l and the right video V_r. The value of the difference video V_d at position (x, y, z) is computed as shown in formula (1):

V_d(x, y, z) = |V_l(x, y, z) - V_r(x, y, z)|    (1)
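As a concrete illustration of Step 1, here is a minimal NumPy sketch. The function names and the (frames, height, width) array layout are assumptions of mine; the excerpt itself only specifies grayscale conversion followed by the per-pixel absolute difference of formula (1).

```python
import numpy as np

def to_grayscale(frame_rgb: np.ndarray) -> np.ndarray:
    # ITU-R BT.601 luma weights -- one common grayscale convention;
    # the excerpt does not say which conversion the authors use.
    return (frame_rgb @ np.array([0.299, 0.587, 0.114])).astype(np.uint8)

def difference_video(v_left: np.ndarray, v_right: np.ndarray) -> np.ndarray:
    """Formula (1): V_d(x, y, z) = |V_l(x, y, z) - V_r(x, y, z)|.

    v_left, v_right: grayscale uint8 videos of shape (frames, height, width).
    """
    # Subtract in a signed dtype so uint8 arithmetic cannot wrap around.
    d = v_left.astype(np.int16) - v_right.astype(np.int16)
    return np.abs(d).astype(np.uint8)
```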

[0019] Step 2: According to the characteristics of virtual reality video projection and back-projection, spatially compress video frames at different positions, that is, down-sample them: down-sampling a video frame of resolution w×h by a factor of s yields a video frame of resolution (w/s)×(h/s)...
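The excerpt does not name the down-sampling filter, so the sketch below uses simple block-average pooling as one reasonable choice; `downsample_frame` and the grayscale-only assumption are mine.

```python
import numpy as np

def downsample_frame(frame: np.ndarray, s: int) -> np.ndarray:
    """Step 2: compress a grayscale h×w frame by a factor s,
    yielding an (h/s)×(w/s) frame via block-average pooling."""
    h, w = frame.shape
    frame = frame[: h - h % s, : w - w % s]        # crop so s divides both sides
    blocks = frame.reshape(h // s, s, w // s, s)   # split into s×s blocks
    return blocks.mean(axis=(1, 3)).astype(frame.dtype)
```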



Abstract

The invention relates to a virtual reality video quality evaluation method based on a double-flow (two-stream) convolutional neural network, which comprises the following steps: video preprocessing, in which a VR difference video is obtained from the left-view and right-view videos of a VR video, video frames at different positions are spatially compressed, frames are uniformly extracted from the compressed difference video, each frame is cut into non-overlapping blocks, and the video blocks at the same position of each frame form a VR video patch, so that enough data can be generated to train the convolutional neural network, while an optical flow is also extracted for each VR video; establishing two convolutional neural network models with the same configuration; taking the VR video patches and the optical flows as their respective inputs; and obtaining a final objective evaluation score by averaging the scores of the two channels, the video-patch channel and the optical-flow channel. The method improves the accuracy of objective quality evaluation.
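To make the preprocessing and fusion steps of the abstract concrete, here is a minimal sketch under stated assumptions: the array layouts, function names, and the equal-weight fusion are mine, since the abstract only says that each frame is cut into non-overlapping blocks, that same-position blocks form a patch, and that the two channels' scores are averaged.

```python
import numpy as np

def cut_into_patches(video: np.ndarray, block: int) -> np.ndarray:
    """Cut every frame into non-overlapping block×block pieces and group
    the blocks at the same spatial position across frames into one
    'VR video patch'.

    video: (frames, height, width), with height and width assumed to be
    multiples of `block` for brevity.
    Returns an array of shape (num_patches, frames, block, block).
    """
    f, h, w = video.shape
    v = video.reshape(f, h // block, block, w // block, block)
    v = v.transpose(1, 3, 0, 2, 4)      # (rows, cols, frames, block, block)
    return v.reshape(-1, f, block, block)

def fuse_scores(patch_scores, flow_scores) -> float:
    # Final objective score: average the outputs of the video-patch and
    # optical-flow channels. Equal weights are an assumption; the abstract
    # does not give the exact fusion weights.
    return 0.5 * float(np.mean(patch_scores)) + 0.5 * float(np.mean(flow_scores))
```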

Description

Technical Field

[0001] The invention belongs to the field of video processing and relates to a virtual reality video quality evaluation method.

Background Technique

[0002] As a new simulation and interaction technology, virtual reality (VR) is used in many fields such as architecture, games, and the military. It can create a virtual environment consistent with the rules of the real world, or a simulated environment completely detached from reality, bringing people a more realistic audio-visual experience and a sense of presence [1]. As an important carrier of virtual reality, VR video, also known as panoramic stereoscopic video, plays a major role. However, in the process of capturing, storing, and transmitting VR videos, some distortion is inevitably introduced by equipment and processing methods, which affects the quality of VR videos. Therefore, it is very important to study an evaluation method that can effectively evaluate the quality of...


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04N17/00; H04N13/106
CPC: H04N17/00; H04N2013/0074
Inventors: 杨嘉琛 (Yang Jiachen), 刘天麟 (Liu Tianlin)
Owner: TIANJIN UNIV