
Method for evaluating objective quality of full-reference image

An objective quality evaluation method in the field of digital video, with wide application prospects

Active Publication Date: 2010-01-06
ZHEJIANG UNIV
Cites: 0 | Cited by: 59
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0010] The image quality evaluation methods based on structural similarity proposed in literature [3] and [4] build on the theory of literature [5] and reflect perceived visual quality more faithfully. However, both methods ignore the effect of severe distortion on the choice of the visual attention focus. The present invention proposes that, while observing an image, the focus of visual attention shifts over time under the control of the observer's subjective will, and that the factors determining this shift can be salient visual features or severe visual distortion. That is, regions with prominent visual features usually attract attention first, and the focus then shifts from salient regions to heavily distorted regions.




Embodiment Construction

[0042] As shown in Figure 1, the method for evaluating the objective quality of a full-reference image based on structural similarity in the present invention includes:

[0043] (1) Use spatial-domain visual features such as luminance contrast, texture complexity, and spatial position to obtain the visual perception map of the original image, and locate the positions where visual perception features are salient;
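Step (1) can be sketched as follows. The patent names the three cues (luminance contrast, texture complexity, spatial position) but this excerpt gives no formulas, so the concrete definitions below — local standard deviation for contrast, mean gradient magnitude for texture, and a centre-weighted Gaussian prior for spatial position — are illustrative assumptions:

```python
import numpy as np

def visual_perception_map(img, win=8):
    """Block-wise spatial-domain perception map combining three cues.

    Assumed definitions (not given in this excerpt of the patent):
    - luminance contrast: standard deviation within each win x win block
    - texture complexity: mean absolute gradient magnitude per block
    - spatial position:   Gaussian weight favouring the image centre
    """
    h, w = img.shape
    gy, gx = np.gradient(img.astype(np.float64))
    grad = np.abs(gx) + np.abs(gy)

    pmap = np.zeros((h // win, w // win))
    for bi in range(h // win):
        for bj in range(w // win):
            blk = img[bi*win:(bi+1)*win, bj*win:(bj+1)*win].astype(np.float64)
            contrast = blk.std()
            texture = grad[bi*win:(bi+1)*win, bj*win:(bj+1)*win].mean()
            pmap[bi, bj] = contrast + texture

    # Centre-weighted spatial prior (assumed Gaussian form).
    yy, xx = np.mgrid[0:h // win, 0:w // win]
    cy, cx = (h // win - 1) / 2, (w // win - 1) / 2
    sigma = max(h, w) / (4 * win)
    pmap *= np.exp(-((yy - cy)**2 + (xx - cx)**2) / (2 * sigma**2))
    if pmap.max() > 0:
        pmap /= pmap.max()          # normalise to [0, 1]
    return pmap

def salient_position(pmap):
    """Block coordinates of the strongest perceptual response."""
    return np.unravel_index(np.argmax(pmap), pmap.shape)
```

For a uniform image with a textured patch near the centre, `salient_position` returns the block covering that patch.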

[0044] (2) Compute the structural similarity map SSIM(i, j) between the original image and the distorted image, where (i, j) are the pixel coordinates; calculate the relative quality of the distorted image, and locate the positions with severe distortion;
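A minimal sketch of step (2), using the standard SSIM form of Wang et al. with block-wise statistics (the patent's literature [3]/[4] use sliding Gaussian windows; non-overlapping blocks are a simplification here):

```python
import numpy as np

def ssim_map(ref, dist, win=8, L=255):
    """Block-wise SSIM map between a reference and a distorted image.

    Per block: SSIM = ((2*mu_x*mu_y + C1) * (2*cov_xy + C2)) /
                      ((mu_x^2 + mu_y^2 + C1) * (var_x + var_y + C2))
    with the usual constants C1 = (0.01*L)^2, C2 = (0.03*L)^2.
    """
    C1, C2 = (0.01 * L) ** 2, (0.03 * L) ** 2
    h, w = ref.shape
    out = np.zeros((h // win, w // win))
    for bi in range(h // win):
        for bj in range(w // win):
            x = ref[bi*win:(bi+1)*win, bj*win:(bj+1)*win].astype(np.float64).ravel()
            y = dist[bi*win:(bi+1)*win, bj*win:(bj+1)*win].astype(np.float64).ravel()
            mx, my = x.mean(), y.mean()
            vx, vy = x.var(), y.var()
            cov = ((x - mx) * (y - my)).mean()
            out[bi, bj] = ((2*mx*my + C1) * (2*cov + C2)) / \
                          ((mx**2 + my**2 + C1) * (vx + vy + C2))
    return out

def worst_distortion_position(smap):
    """Block coordinates where SSIM is lowest, i.e. distortion is most severe."""
    return np.unravel_index(np.argmin(smap), smap.shape)
```

Identical images yield an all-ones map; the block with the lowest SSIM value marks the most severely distorted location used in step (3).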

[0045] (3) Define the rule for shifting the focus of visual attention, determine the new focus of visual attention, and regenerate the visual perception map after the shift;
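One way to sketch the focus-transfer rule of step (3): attention jumps from the visually salient block to the most severely distorted block when the worst local SSIM falls below a threshold. Both the threshold `tau` and the Gaussian regeneration of the perception map around the new focus are illustrative assumptions, not the patent's stated parameters:

```python
import numpy as np

def shift_attention_focus(pmap, smap, tau=0.5):
    """Focus-transfer sketch: if the minimum of the SSIM map is below tau
    (assumed threshold), the attention focus moves from the salient block
    to the worst-distortion block; otherwise it stays at the salient block.
    The regenerated perception map is a Gaussian centred on the new focus
    (assumed form)."""
    salient = np.unravel_index(np.argmax(pmap), pmap.shape)
    worst = np.unravel_index(np.argmin(smap), smap.shape)
    focus = worst if smap[worst] < tau else salient

    # Regenerate the perception map around the chosen focus.
    yy, xx = np.mgrid[0:pmap.shape[0], 0:pmap.shape[1]]
    sigma = max(pmap.shape) / 4
    new_pmap = np.exp(-((yy - focus[0])**2 + (xx - focus[1])**2) / (2 * sigma**2))
    return focus, new_pmap
```

With mild distortion everywhere, the focus stays on the salient region; a single severely distorted block pulls the focus (and the regenerated map's peak) onto itself.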

[0046] (4) Weight the structural similarity with the visual perception maps generated in (1) and (3) to obtain a...
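Step (4) is a weighted pooling of the SSIM map by the perception maps from steps (1) and (3). The equal-weight mix of the two maps below is an assumption; this excerpt does not state the actual combination rule:

```python
import numpy as np

def weighted_quality(smap, pmap_before, pmap_after):
    """Pool the SSIM map with the perception maps from before and after
    the attention shift. The 50/50 mix of the two maps is an illustrative
    assumption, not the patent's stated weighting."""
    w = 0.5 * pmap_before + 0.5 * pmap_after
    return float((smap * w).sum() / w.sum())
```

The score collapses to the plain SSIM mean when both perception maps are uniform, and otherwise emphasises the blocks that attract attention before and after the shift.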



Abstract

The invention discloses a method for evaluating the objective quality of a full-reference image based on structural similarity, comprising the following steps: first, use spatial-domain visual features to obtain a visual perception map of the original image and locate the positions where visual perception is salient; second, use a structural-similarity-based evaluation method to compute the structural similarity map SSIM(i, j) between the original image and the distorted image, calculate the relative quality of the distorted image, and locate the positions of severe distortion; third, define the rule for shifting the focus of visual attention, determine the new focus of visual attention, and regenerate the visual perception map after the shift; fourth, weight the structural similarity with the generated visual perception maps to obtain the objective evaluation of image quality. The method is suitable for designing various image coding and processing algorithms and for comparing the effects of different algorithms; its evaluation results agree more closely with human subjective evaluation, and it has wide application prospects.

Description

Technical field

[0001] The invention relates to the technical field of digital video, in particular to a full-reference image objective quality evaluation method based on structural similarity.

Background technique

[0002] Digital images are widely used in multimedia products, and quality loss occurs during acquisition, compression, storage, and transmission. Since humans are the final recipients of digital images, human subjective quality evaluation (e.g., DMOS scores) is considered the most reliable. However, subjective quality assessment is time-consuming and laborious, and its results are not reproducible. Scientists have therefore studied objective quality evaluation methods for digital images for many years. According to the use of the original image as a reference, objective quality evaluation methods are divided into three types: full reference type...

Claims


Application Information

IPC(8): H04N17/00
Inventors: CHEN Yaowu (陈耀武), ZHANG Hua (张桦), TIAN Xiang (田翔)
Owner ZHEJIANG UNIV