
No-reference contrast distortion image quality evaluation method

A technology for image quality evaluation of distorted images, applied in image enhancement, image analysis, image data processing, etc. It addresses the poor performance of general-purpose evaluation methods on contrast distortion and achieves accurate and effective quality assessment.

Active Publication Date: 2019-12-13
SUN YAT SEN UNIV

AI Technical Summary

Problems solved by technology

[0005] The present invention provides a no-reference contrast distortion image quality evaluation method in order to overcome two technical defects of the existing no-reference image quality evaluation field: the lack of methods dedicated to contrast-type distortion, and the poor performance of general-purpose evaluation methods when applied to it.

Method used



Examples


Embodiment 1

[0053] As shown in Figure 1, a no-reference contrast distortion image quality evaluation method includes the following steps:

[0054] S1: Extract color moment and information entropy features in multiple color spaces from the contrast-distorted image, and construct a feature set describing the image distortion;

[0055] S2: Construct a training set from the image-distortion feature set and the prior (subjective) scores, and build a prediction model for image quality evaluation;

[0056] S3: Extract the contrast distortion feature set of the image to be evaluated, and use the image quality evaluation prediction model to predict its quality.
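The paragraphs above name a "prediction model" trained on prior scores but do not specify the regressor, so the sketch below assumes support vector regression (scikit-learn's SVR); the function names train_quality_model and predict_quality are illustrative, and the feature extraction of step S1 is sketched under Embodiment 2.

```python
# Hedged sketch of steps S2-S3 only; the choice of SVR is an assumption,
# not stated in the patent text above.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

def train_quality_model(feature_matrix, prior_scores):
    """S2: fit a regressor mapping distortion features to prior quality scores.

    feature_matrix: (n_images, n_features) array built by the S1 feature extraction.
    prior_scores:   (n_images,) array of subjective quality scores for the training set.
    """
    model = make_pipeline(StandardScaler(), SVR(kernel="rbf"))
    model.fit(np.asarray(feature_matrix), np.asarray(prior_scores))
    return model

def predict_quality(model, feature_vector):
    """S3: predict the quality of one image from its contrast-distortion feature set."""
    return float(model.predict(np.asarray(feature_vector).reshape(1, -1))[0])
```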

[0057] In a specific implementation, the no-reference contrast distortion image quality evaluation method provided by the present invention first converts the input image from the RGB to the CIELab color space, and uses color moments and information entropy as the representation...

Embodiment 2

[0059] More specifically, on the basis of Embodiment 1, for the image to be evaluated shown in Figure 2, the step S1 includes the following steps:

[0060] S11: Convert the image from the RGB color space to the XYZ color space, and then from the XYZ color space to the CIELab color space; the three channels of the RGB color space are denoted R, G, B, and the three channels of the CIELab color space are denoted L, a, b;
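A minimal sketch of step S11, assuming the standard sRGB-to-XYZ-to-CIELab path under the D65 white point (the patent does not state the exact transform constants); skimage.color.rgb2lab performs exactly this two-stage conversion, and the helper name split_channels is illustrative.

```python
# Sketch of step S11: obtain the six channels R, G, B, L, a, b used by S12 and S13.
import numpy as np
from skimage import color

def split_channels(image_rgb):
    """Return a dict with the six channels R, G, B, L, a, b as 2-D arrays."""
    rgb = np.asarray(image_rgb, dtype=np.float64) / 255.0  # assumes 8-bit RGB input
    lab = color.rgb2lab(rgb)                                # RGB -> XYZ -> CIELab
    return {
        "R": rgb[..., 0], "G": rgb[..., 1], "B": rgb[..., 2],
        "L": lab[..., 0], "a": lab[..., 1], "b": lab[..., 2],
    }
```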

[0061] S12: Extract the first- to third-order central color moment features from the six color channels obtained in step S11, indexed by i = {1, 2, 3}, the order of the color moment, and j = {R, G, B, L, a, b}, the color channel;
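A sketch of step S12, reading the "first- to third-order central color moments" in the usual color-moment sense (mean, standard deviation, skewness); the exact definition and normalisation used by the patent are assumptions, and the channels argument is the dict returned by the split_channels sketch above.

```python
# Sketch of step S12: 3 moments x 6 channels = 18 features, ordered R, G, B, L, a, b.
import numpy as np

def extract_color_moments(channels):
    feats = []
    for j in ("R", "G", "B", "L", "a", "b"):
        x = np.asarray(channels[j], dtype=np.float64).ravel()
        mu = x.mean()                              # 1st-order moment (mean)
        sigma = x.std()                            # 2nd-order central moment (std. dev.)
        skew = np.cbrt(np.mean((x - mu) ** 3))     # cube root of 3rd-order central moment
        feats.extend([mu, sigma, skew])
    return np.asarray(feats)
```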

[0062] S13: Extract the information entropy features from the six color channels obtained in step S11, denoted H_j, where j = {R, G, B, L, a, b} indexes the color channels.
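A sketch of step S13 as the Shannon entropy of each channel's intensity histogram; the 256-bin quantisation is an assumption not stated in the text above, and channels is again the dict from the split_channels sketch.

```python
# Sketch of step S13: the six entropy features H_j, j in {R, G, B, L, a, b}.
import numpy as np

def extract_entropies(channels, bins=256):
    feats = []
    for j in ("R", "G", "B", "L", "a", "b"):
        hist, _ = np.histogram(np.asarray(channels[j]).ravel(), bins=bins)
        p = hist / hist.sum()
        p = p[p > 0]                               # drop empty bins (0 * log 0 := 0)
        feats.append(-np.sum(p * np.log2(p)))
    return np.asarray(feats)
```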

[0063] More specifically...



Abstract

The invention provides a no-reference contrast distortion image quality evaluation method, which comprises the following steps: extracting color moment and information entropy features in multiple color spaces from a contrast-distorted image, and constructing a feature set describing the image distortion; constructing a training set from the image-distortion feature set and the prior scores, and building a prediction model for image quality evaluation; and extracting the contrast distortion feature set of the image to be evaluated and using the image quality evaluation prediction model to predict its quality. By fusing multiple color spaces and combining color moment and information entropy features, the method ensures detection accuracy and effectiveness and fills the gap in the field of no-reference contrast-distorted image quality evaluation.

Description

Technical Field

[0001] The invention relates to the technical field of digital image forensics, and more specifically, to a no-reference contrast distortion image quality evaluation method.

Background Technique

[0002] With the rapid development of electronic technology and the rapid popularization of digital imaging equipment, digital images have been widely used in people's daily office work, study and life. Digital images have become an important carrier of information and play an irreplaceable role in many fields, such as the military, networks, archaeology and justice. At the same time, with the rapid development of various kinds of editing software, ordinary users can easily use these tools to edit, modify and beautify images. If such edited and tampered digital images are treated as important information, they are likely to mislead people and have adverse effects on people's lives and even on society as a whole. Therefore, the research on related forensics digit...

Claims


Application Information

IPC(8): G06T7/00; G06T7/90; G06K9/62
CPC: G06T7/0002; G06T7/90; G06T2207/10024; G06T2207/30168; G06F18/2411; G06F18/2414; G06F18/253; Y02T10/40
Inventors: 卢伟 (Lu Wei), 吕文静 (Lyu Wenjing)
Owner: SUN YAT SEN UNIV