
Full-reference video quality evaluation method based on motion estimation

A motion-estimation-based video quality technology, applied in television, computing, image data processing, etc. It addresses the problems that distortion degrades video quality, that subjective assessment is time-consuming and labor-intensive, and that existing objective metrics cannot evaluate video quality well, achieving enhanced prediction accuracy and strong consistency with subjective perception.

Active Publication Date: 2022-04-12
UNIV OF ELECTRONICS SCI & TECH OF CHINA

Problems solved by technology

[0002] With the rapid development of network technology, more and more videos are disseminated on the Internet. During shooting, encoding and transmission, various distortions are often introduced, degrading video quality and seriously harming users' viewing experience. Accurately measuring video quality therefore has important practical significance for encoding, video transmission and terminal-side video quality enhancement.
[0003] Subjective video quality assessment requires a large number of experimenters to score videos one by one; it is time-consuming and laborious, and in practical scenarios it is infeasible to carry out for every video. At the same time, the objective metrics commonly used to measure video quality, such as PSNR and SSIM, differ considerably from the subjective experience of human eyes and cannot evaluate video quality well.
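PSNR, one of the objective metrics mentioned above, illustrates why such pixel-wise measures can diverge from subjective experience: it depends only on mean squared error and is blind to structure and motion. A minimal sketch (the example frames are illustrative, not from the patent):

```python
import numpy as np

def psnr(reference: np.ndarray, distorted: np.ndarray, peak: float = 255.0) -> float:
    # Peak signal-to-noise ratio in dB; depends only on mean squared error.
    mse = np.mean((reference.astype(np.float64) - distorted.astype(np.float64)) ** 2)
    if mse == 0.0:
        return float("inf")  # identical frames
    return 10.0 * np.log10(peak ** 2 / mse)

rng = np.random.default_rng(0)
ref = np.full((64, 64), 128, dtype=np.uint8)  # flat gray frame
noisy = np.clip(ref + rng.normal(0, 5, ref.shape), 0, 255).astype(np.uint8)
```

Two frames with the same PSNR can look very different to a viewer, which motivates the structure- and motion-aware features the patent proposes.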


Embodiment Construction

[0044] The technical solutions in the embodiments of the present invention will be described clearly and completely below in conjunction with the accompanying drawings. Obviously, the embodiments described below are only some, not all, of the embodiments of the present invention; all other embodiments obtained by a person of ordinary skill in the art from these embodiments without creative effort fall within the protection scope of the present invention.

[0045] This embodiment provides a full-reference video quality assessment method based on motion estimation. As shown in Figure 1, the process is divided into three parts: spatial distortion feature extraction, temporal distortion feature extraction, and spatio-temporal distortion feature fusion. The specific steps are as follows:
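The three-part structure described above can be sketched as follows. The scoring functions here are simplified stand-ins (a plain temporal mean instead of the patent's two-step pooling, frame-difference energy instead of motion-vector statistics, and a multiplicative fusion), not the patent's exact formulas:

```python
import numpy as np

def gradient_magnitude(frame: np.ndarray) -> np.ndarray:
    # Finite-difference gradient magnitude; the patent's exact gradient
    # operator is not specified in this excerpt.
    gy, gx = np.gradient(frame.astype(np.float64))
    return np.hypot(gx, gy)

def spatial_score(ref: np.ndarray, dst: np.ndarray, c: float = 1e-3) -> float:
    # Per-frame gradient similarity in (0, 1], averaged over pixels.
    gr, gd = gradient_magnitude(ref), gradient_magnitude(dst)
    return float(((2 * gr * gd + c) / (gr ** 2 + gd ** 2 + c)).mean())

def assess_video(ref_frames, dst_frames) -> float:
    # 1) Spatial branch: per-frame similarity pooled over time (plain mean here;
    #    the patent uses a two-step temporal pooling).
    spatial = float(np.mean([spatial_score(r, d)
                             for r, d in zip(ref_frames, dst_frames)]))
    # 2) Temporal branch: frame-difference energy as a crude stand-in for the
    #    patent's motion-vector statistics.
    def motion_energy(frames):
        return float(np.mean([np.abs(b - a).mean()
                              for a, b in zip(frames, frames[1:])]))
    temporal = 1.0 / (1.0 + abs(motion_energy(ref_frames) - motion_energy(dst_frames)))
    # 3) Illustrative multiplicative fusion; 1.0 means "identical to reference".
    return spatial * temporal

rng = np.random.default_rng(1)
ref_video = [rng.random((32, 32)) * 255 for _ in range(5)]
noisy_video = [f + rng.normal(0, 10, f.shape) for f in ref_video]
```

A distorted video scores strictly below 1.0 in both branches, while an identical copy of the reference scores exactly 1.0.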

[0046] Step 1. Spatial distortion feature extraction;

[0047] Step 1.1 The spatial distortion feature adopts the gradient similarity ...
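Gradient similarity of this kind is typically computed from gradient-magnitude maps of the reference and distorted frames. Since the patent's exact formula is truncated here, the sketch below is an assumption: Prewitt gradients and a stabilising constant c taken from the GMSD literature, not from the patent:

```python
import numpy as np

def prewitt_gradients(img: np.ndarray):
    # 3x3 Prewitt responses computed with shifted slices of an edge-padded image.
    p = np.pad(img.astype(np.float64), 1, mode="edge")
    gx = (p[:-2, 2:] + p[1:-1, 2:] + p[2:, 2:]
          - p[:-2, :-2] - p[1:-1, :-2] - p[2:, :-2]) / 3.0
    gy = (p[2:, :-2] + p[2:, 1:-1] + p[2:, 2:]
          - p[:-2, :-2] - p[:-2, 1:-1] - p[:-2, 2:]) / 3.0
    return gx, gy

def gradient_similarity_map(ref: np.ndarray, dst: np.ndarray, c: float = 170.0) -> np.ndarray:
    # GMS-style similarity map in (0, 1]; c stabilises flat regions (170 is a
    # value used in the GMSD literature for 8-bit images, not from the patent).
    gr = np.hypot(*prewitt_gradients(ref))
    gd = np.hypot(*prewitt_gradients(dst))
    return (2 * gr * gd + c) / (gr ** 2 + gd ** 2 + c)

ref = np.random.default_rng(0).random((32, 32)) * 255
identical = gradient_similarity_map(ref, ref)  # perfect match -> all ones
```

The per-pixel map can then be pooled spatially (mean, standard deviation, or percentile pooling) before the temporal pooling step.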


Abstract

The invention belongs to the field of full-reference video quality evaluation, and particularly provides a full-reference video quality evaluation method based on motion estimation, which comprises the following steps: first, the change in the gradient information of the video content caused by distortion in the spatial domain is calculated, and a spatial distortion score for the video is obtained through a two-step temporal pooling scheme; then, the differences in the local mean and the coefficient of variation (standard deviation over mean) of the motion vector map, the differences in the coefficients of variation of the motion vectors at the same position in adjacent frames, and the spatial distortion features are fused to obtain a spatio-temporal distortion score for the video; finally, the spatial distortion score and the spatio-temporal distortion score are fused to obtain the final video quality prediction. By introducing motion information into video quality evaluation, the method markedly improves the accuracy of video quality prediction and yields an objective video quality evaluation method that is more consistent with the subjective perception of human eyes.
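The temporal features in the abstract rest on coefficient-of-variation statistics of the motion vector map. A minimal sketch, assuming block-wise statistics over motion-vector magnitudes with an illustrative block size of 16 (the patent's exact feature definitions are not given in this excerpt):

```python
import numpy as np

def coefficient_of_variation(x: np.ndarray, eps: float = 1e-8) -> float:
    # std / mean: a scale-free measure of motion-vector spread.
    return float(np.std(x) / (np.mean(x) + eps))

def motion_cv_features(ref_mag: np.ndarray, dst_mag: np.ndarray, block: int = 16):
    # Block-wise |mean difference| and |CV difference| between the reference and
    # distorted motion-magnitude maps, averaged over blocks. The block size and
    # the feature set are illustrative, not the patent's.
    h = ref_mag.shape[0] // block * block
    w = ref_mag.shape[1] // block * block
    feats = []
    for y in range(0, h, block):
        for x in range(0, w, block):
            r = ref_mag[y:y + block, x:x + block]
            d = dst_mag[y:y + block, x:x + block]
            feats.append((abs(float(r.mean()) - float(d.mean())),
                          abs(coefficient_of_variation(r) - coefficient_of_variation(d))))
    mean_diff, cv_diff = np.mean(feats, axis=0)
    return float(mean_diff), float(cv_diff)
```

Both features are zero when the distorted video's motion field matches the reference's, and grow as distortion disturbs the estimated motion.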

Description

Technical field

[0001] The invention belongs to the field of full-reference video quality assessment, and specifically provides a full-reference video quality assessment method using motion estimation.

Background technique

[0002] With the rapid development of network technology, more and more videos are disseminated on the Internet. During shooting, encoding and transmission, various distortions are often introduced, degrading video quality and seriously harming users' viewing experience. Accurately measuring video quality therefore has important practical significance for encoding, video transmission and terminal-side video quality enhancement. [0003] Subjective video quality assessment requires a large number of experimenters to score videos one by one; it is time-consuming and laborious, and in practical scenarios it is infeasible to carry out for every video. At the same ti...


Application Information

IPC(8): G06T7/00; G06T7/13; H04N17/00
Inventor: 朱树元, 胡术明, 曾兵
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA