
Video redirection quality evaluation method based on space-time saliency classification and fusion

A quality evaluation and redirection technology, applied in image analysis, character and pattern recognition, image data processing, etc. It can solve problems such as evaluation scores that are difficult to match subjective perception, reliance on a single evaluation index, and difficulty in using the evaluation results to improve redirection algorithm performance.

Active Publication Date: 2021-08-31
GUANGXI UNIV
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

Subjective evaluation has the following disadvantages: 1) Testers participating in subjective evaluation usually judge the quality of the redirected video directly from their own subjective impressions, so it is difficult to quantitatively analyze how the various distortions introduced during redirection affect the quality of the reconstructed video, and the evaluation process and results are therefore hard to apply directly to improving redirection algorithms; 2) Subjective evaluation requires a large number of testers to vote repeatedly on combinations of redirected videos and is influenced by factors such as the observation environment, so its results are difficult to embed directly into emerging real-time video applications and its portability is low.
Although this method can evaluate the overall quality of a redirected video, it does not handle spatial geometric distortion well and cannot evaluate the temporal distortion of videos whose salient targets are stationary.
To sum up, existing objective evaluation algorithms for video redirection quality rely on a single evaluation indicator, do not fully consider the characteristics of the HVS, are difficult to apply to different types of videos, and often fuse indicators with fixed weights, so their evaluation scores are difficult to reconcile with human subjective perception.



Embodiment Construction

[0086] In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit it. In addition, the technical features involved in the various embodiments of the present invention described below can be combined with each other as long as they do not conflict.

[0087] In order to fully account for the characteristics of the HVS, different evaluation methods are adopted for videos with different content, which ensures the effectiveness of the objective video redirection quality evaluation algorithm. As shown in Figure 1, the present invention proposes a video redirection quality evaluation method based on space-time saliency classification and fusion...
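To make the classification step concrete, the following is a minimal sketch of how an original video might be assigned to one of four space-time saliency classes from its motion and foreground information. The thresholds, class names and the classify_video helper are illustrative assumptions, not the patent's disclosed decision rule.

```python
# Hypothetical thresholds; the patent does not disclose the exact decision rule,
# so the values and labels below are illustrative assumptions only.
MOTION_THRESHOLD = 0.1       # mean optical-flow magnitude per pixel
FOREGROUND_THRESHOLD = 0.05  # fraction of pixels marked as salient foreground


def classify_video(motion_energy: float, foreground_ratio: float) -> str:
    """Assign one of four space-time saliency classes to an original video.

    motion_energy    -- average motion strength over all frames (e.g. from optical flow)
    foreground_ratio -- average share of frame area covered by salient foreground objects
    """
    has_motion = motion_energy > MOTION_THRESHOLD
    has_foreground = foreground_ratio > FOREGROUND_THRESHOLD

    if has_motion and has_foreground:
        return "moving salient object"
    if has_motion and not has_foreground:
        return "global motion, no dominant object"
    if not has_motion and has_foreground:
        return "static salient object"
    return "static scene, no dominant object"


# Example: a video with noticeable motion but little salient foreground
print(classify_video(motion_energy=0.25, foreground_ratio=0.02))
```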



Abstract

The invention discloses a video redirection quality evaluation method based on space-time saliency classification and fusion. The method comprises the following steps: constructing a video classification model according to the space-time saliency of videos and dividing the videos into four classes according to the motion information and foreground information of the original videos; extracting the salient information, edge features, foreground information, motion features, etc. of the original video, and evaluating the quality of the redirected video with four spatio-temporal indexes, namely perceptual geometric distortion, edge group similarity, temporal continuity similarity distortion and important target temporal distortion; and applying different adaptive weighting methods to different types of videos and fusing the quality scores of the four spatio-temporal indexes to obtain the overall objective quality of the redirected video. Because different space-time features are extracted and different adaptive index-weight fusion methods are applied for each video class, the characteristics of the videos are fully considered and the performance of the objective evaluation algorithm is well ensured.
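As a rough illustration of the final fusion step, the sketch below combines the four spatio-temporal index scores with per-class weights. The CLASS_WEIGHTS table, the class labels (reused from the classification sketch above) and the fuse_scores helper are hypothetical placeholders; the patent describes an adaptive weighting scheme whose concrete values and formulas are not reproduced here.

```python
# Illustrative fusion of the four spatio-temporal index scores into an overall
# quality score. The per-class weights are placeholders, not the patent's values.
CLASS_WEIGHTS = {
    # order: (perceptual geometric distortion, edge group similarity,
    #         temporal continuity similarity distortion, important target temporal distortion)
    "moving salient object":             (0.25, 0.25, 0.25, 0.25),
    "global motion, no dominant object": (0.20, 0.30, 0.40, 0.10),
    "static salient object":             (0.35, 0.35, 0.10, 0.20),
    "static scene, no dominant object":  (0.40, 0.40, 0.10, 0.10),
}


def fuse_scores(video_class: str, pgd: float, egs: float, tcsd: float, ittd: float) -> float:
    """Weighted fusion of the four index scores for one redirected video."""
    w = CLASS_WEIGHTS[video_class]
    return w[0] * pgd + w[1] * egs + w[2] * tcsd + w[3] * ittd


# Example: fuse the four index scores of a video classified as "static salient object"
print(fuse_scores("static salient object", pgd=0.8, egs=0.7, tcsd=0.9, ittd=0.75))
```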

Description

Technical Field

[0001] The invention belongs to the technical field of image evaluation, and more particularly relates to a video redirection quality evaluation method based on space-time saliency classification and fusion.

Background Art

[0002] With the rapid popularization of multimedia display devices such as LCD TVs, tablets, laptops and smart phones, and the explosive growth of video data, people can watch media videos anytime and anywhere. However, because application requirements differ, terminal display devices do not share a uniform resolution and aspect ratio, so the same original video is stretched or squeezed to different degrees when displayed on terminals with different resolutions and aspect ratios. This not only wastes display screen space but also seriously affects the user's viewing experience. To avoid this, it is necessary to adjust the video content, adaptively adjusting its resolution or aspect...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06T7/00G06T7/11G06T7/13G06T7/194G06K9/32G06K9/62G06N20/00
CPCG06T7/0002G06T7/11G06T7/13G06T7/194G06N20/00G06T2207/10016G06T2207/20104G06V10/25G06F18/25G06F18/241
Inventor 唐振华董伟鑫赵祖翌李喆覃团发
Owner GUANGXI UNIV