A no-reference video quality assessment method based on feature fusion and recurrent neural network

A method combining a recurrent neural network with feature fusion, applicable to television, electrical components, image communication, and related fields. It addresses the poor performance of existing no-reference quality evaluation and achieves a wide detection quality range and accurate quality evaluation indicators.

Active Publication Date: 2021-06-11
COMMUNICATION UNIVERSITY OF CHINA

AI Technical Summary

Problems solved by technology

[0011] Aiming at the poor performance of existing no-reference video quality evaluation methods, the present invention proposes a no-reference objective quality evaluation method. The invention divides a video into video segments and obtains an overall feature vector for each segment through a feature fusion network; the feature vectors of all segments of a video are then fed to a recurrent neural network to obtain the overall quality score of the video, completing the quality evaluation process.
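The pipeline described above can be sketched as follows. This is a minimal illustration only: the patent's feature fusion network and recurrent network are learned models, and the stand-in functions, dimensions, and names here are all assumptions for demonstration.

```python
import numpy as np

D = 16  # fused feature dimension (illustrative)
_rng = np.random.default_rng(0)
_W = _rng.standard_normal((8, D))  # fixed random projection standing in for the fusion CNN

def fuse_segment(segment):
    """Feature fusion network stand-in: one segment (T, H, W) -> one D-dim vector."""
    pooled = segment.mean(axis=(1, 2))   # (T,) per-frame spatial means
    return np.tanh(pooled @ _W)          # project T frame statistics to D dims

def video_score(segments):
    """Recurrent network stand-in: fold all segment features into one score."""
    h = np.zeros(D)
    for seg in segments:                 # simple Elman-style recurrence over segments
        h = np.tanh(0.5 * h + 0.5 * fuse_segment(seg))
    return float(h.mean())               # scalar quality score for the whole video

video = np.random.default_rng(1).random((24, 32, 32))   # toy grayscale video
segments = [video[i:i + 8] for i in (0, 8, 16)]         # three 8-frame segments
score = video_score(segments)
```

The key design point the patent emphasizes is this two-stage structure: per-segment fusion produces one compact vector each, and a recurrence over segments produces the single video-level score.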

Method used




Embodiment

[0047] The flow chart of the implementation is shown in Figure 1 and includes the following steps:

[0048] Step S10, extracting and cropping video segments;

[0049] Step S20, building and training a feature fusion network;

[0050] Step S30, obtaining the feature vector representation of the video;

[0051] Step S40, building and training a recurrent neural network;

[0052] Step S50, evaluating the quality of the video;

[0053] In this embodiment, step S10, extracting and cropping video segments, further includes the following steps:

[0054] Step S100, extracting video frames: video frames are selected at equal intervals, and the remaining frames are discarded as redundant;

[0055] Step S110, cropping video frames: each selected frame is cut into M image blocks using a sliding window;

[0056] Step S120, combining the cropped image blocks: in the video sequence, randomly select N starting points and consecutively take T fr...
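Steps S100 through S120 can be sketched as below. All parameter values (sampling interval, block size, N, T) are illustrative placeholders, not values from the patent, and the windowing here uses non-overlapping blocks as one plausible reading of "windowing".

```python
import numpy as np

def extract_segments(video, interval=2, block=32, n_segments=3, seg_len=4, seed=0):
    """Sketch of steps S100-S120; video has shape (frames, H, W)."""
    rng = np.random.default_rng(seed)
    # S100: keep every `interval`-th frame, discard the rest as redundant.
    frames = video[::interval]
    # S110: window each kept frame into M non-overlapping block x block patches.
    n_f, H, W = frames.shape
    blocks = frames[:, :H - H % block, :W - W % block]
    blocks = blocks.reshape(n_f, H // block, block, W // block, block)
    blocks = blocks.transpose(0, 1, 3, 2, 4).reshape(n_f, -1, block, block)
    # S120: pick N random starting points, take T consecutive cropped frames each.
    starts = rng.integers(0, n_f - seg_len + 1, size=n_segments)
    return [blocks[s:s + seg_len] for s in starts]

video = np.zeros((20, 64, 64))           # toy 20-frame grayscale video
segments = extract_segments(video)       # 3 segments, each (T, M, block, block)
```

Each returned segment has shape (T, M, block, block), matching the description of a segment as T consecutive frames, each cut into M blocks.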



Abstract

The invention discloses a no-reference video quality evaluation method based on feature fusion and a recurrent neural network. The method fuses spatio-temporal features through a feature fusion network that takes video segments as input, and uses a recurrent neural network to fuse the quality of the different video segments, completing the overall video quality assessment task. The neural network directly takes video segments as input and adopts a feature fusion network; this design better captures the relationships between video frames, so the overall quality index of the video is obtained more accurately. The feature fusion network can process multiple frames at once and produce a single low-dimensional feature, so the feature scale is greatly reduced relative to the amount of raw data, and the total running time for a whole video is greatly reduced.
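The data-reduction claim in the abstract can be made concrete with a quick calculation. All sizes here are assumed for illustration; the patent does not specify them.

```python
# Illustrative arithmetic behind the claim that the fused feature is much
# smaller than the raw segment data (all sizes are assumptions, not from
# the patent).
T, M, B = 8, 4, 32        # frames per segment, blocks per frame, block side
D = 128                   # fused feature dimension (assumed)
raw = T * M * B * B       # raw pixel values entering the fusion network
reduction = raw / D       # compression factor from raw data to feature
print(raw, reduction)     # 32768 raw values -> 128 features, 256x reduction
```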

Description

Technical field [0001] The invention relates to a no-reference video quality evaluation method based on feature fusion and a recurrent neural network, belonging to the technical field of digital video processing. Background technique [0002] As a complex source of visual information, video contains a great deal of valuable information. Video quality directly affects people's subjective experience and information acquisition, and can serve as a measure for other video tasks such as video compression. Research on Video Quality Assessment (VQA) has accordingly received extensive attention in recent years. [0003] Video quality evaluation can be divided into subjective and objective methods. In subjective evaluation, observers score the video quality themselves, but this is labor-intensive, time-consuming, and inconvenient; in objective evaluation, a computer calculates a quality index for the video according to a...

Claims


Application Information

Patent Timeline
Patent Timeline: not available
Patent Type & Authority: Patent (China)
IPC(8): H04N17/00
CPC: H04N17/00
Inventors: 史萍, 侯明, 潘达, 应泽峰, 韩明良
Owner: COMMUNICATION UNIVERSITY OF CHINA