
No-reference image quality assessment method based on saliency strategy and feature fusion

A technology for no-reference image quality evaluation, applied in image analysis, image enhancement, image data processing, and related fields. It can solve the problem of insufficient consistency of evaluation results and achieves the effect of improved feature representation ability.

Active Publication Date: 2022-05-17
TIANJIN UNIV

AI Technical Summary

Problems solved by technology

[0004] Based on the characteristics of the Human Visual System (HVS), the present invention proposes a no-reference image quality evaluation method based on a saliency strategy and multi-scale feature fusion, which addresses the problem of insufficient consistency between objective evaluation results and subjective perception. The technical solution is as follows:

Method used



Examples


Embodiment 1

[0026] An embodiment of the present invention provides a no-reference image quality evaluation method with multi-scale feature fusion based on a saliency strategy. As shown in Figure 1, the method includes the following steps:

[0027] 101: Saliency Filter-Based Image Preprocessing

[0028] The image to be evaluated is preprocessed: it is converted into a grayscale image and divided into small non-overlapping image blocks, and the graph-based visual saliency (GBVS) algorithm is used to calculate the saliency score of each image block. All image blocks are sorted by saliency score, and the 25% of blocks with the highest scores are selected as input samples.
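The patch-screening step above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the GBVS algorithm itself is not implemented here, so `saliency` is assumed to be a precomputed per-pixel saliency map (values 0-255), and the patch size and 25% keep fraction come from the text.

```python
import numpy as np

def select_salient_patches(gray, saliency, patch=32, keep_frac=0.25):
    """Slice an image and its saliency map into non-overlapping patches
    and keep the top `keep_frac` fraction by mean patch saliency.

    `saliency` stands in for a GBVS saliency map; GBVS itself is not
    implemented in this sketch.
    """
    H, W = gray.shape
    H, W = H - H % patch, W - W % patch  # crop to a multiple of the patch size
    patches, scores = [], []
    for y in range(0, H, patch):
        for x in range(0, W, patch):
            patches.append(gray[y:y + patch, x:x + patch])
            # Patch score = mean pixel saliency over the block
            scores.append(saliency[y:y + patch, x:x + patch].mean())
    order = np.argsort(scores)[::-1]           # highest saliency first
    k = max(1, int(len(patches) * keep_frac))  # top 25% of blocks
    return [patches[i] for i in order[:k]], [scores[i] for i in order[:k]]
```

On a 128×128 input this yields 16 candidate patches, of which the 4 most salient are kept as network input samples.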

[0029] 102: Extract multi-scale features from processed image blocks

[0030] In the embodiment of the present invention, a convolutional neural network with a two-stream structure is constructed to perform multi-scale feature extraction on the in...
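A structural sketch of such a two-stream network is given below in PyTorch. The layer counts and channel widths are assumptions, since the excerpt does not specify them; only the overall shape follows the text: a primary-feature stream, an advanced-feature stream containing an upsampling stage, feature fusion, and a 1024-node fully connected layer producing a local quality score (the 1024-node layer is taken from the abstract).

```python
import torch
import torch.nn as nn

class TwoStreamIQA(nn.Module):
    """Hypothetical two-stream IQA network for 32x32 grayscale patches."""
    def __init__(self):
        super().__init__()
        # Primary stream: shallow convolutions capture low-level features.
        self.primary = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
        )
        # Advanced stream: downsample for a larger receptive field,
        # then upsample back, as the text describes an upsampling network.
        self.advanced = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
        )
        self.pool = nn.AdaptiveAvgPool2d(1)
        # Fusion -> 1024-node FC -> scalar local quality score.
        self.head = nn.Sequential(
            nn.Linear(64, 1024), nn.ReLU(),
            nn.Linear(1024, 1),
        )

    def forward(self, x):
        f = torch.cat([self.pool(self.primary(x)),
                       self.pool(self.advanced(x))], dim=1).flatten(1)
        return self.head(f)
```

A batch of salient patches of shape (N, 1, 32, 32) maps to N local quality scores of shape (N, 1).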

Embodiment 2

[0036] The scheme of Embodiment 1 is further described below in conjunction with specific calculation formulas and examples; see the following description for details:

[0037] 201: Image Preprocessing Based on Saliency Filters

[0038] For each image, its grayscale image is computed, followed by its saliency matrix, in which each pixel is assigned a value from 0 to 255; a higher value indicates a more salient pixel. The grayscale image and the corresponding saliency matrix are sliced into image patches of size 32×32, and each patch is given the same subjective score as the original image. Based on the saliency matrix, the saliency score of each image patch is expressed by the following formula:

[0039] SS_i = (1 / (M × N)) · Σ_{m=1}^{M} Σ_{n=1}^{N} S(m, n)    (Formula 1)

[0040] In Formula 1, S(m,n) is the saliency value of the pixel at position (m,n) in the i-th image block, and M and N denote the dimensions of the image block. The saliency score reflects the degree to which an image block attracts human attention. The hi...
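Formula 1 amounts to averaging the pixel saliency values over the block; a minimal numeric check:

```python
import numpy as np

def saliency_score(S_block):
    """Formula 1: SS_i = (1/(M*N)) * sum of S(m, n) over the block."""
    M, N = S_block.shape
    return S_block.sum() / (M * N)  # identical to S_block.mean()

# A 2x2 block with two fully salient and two non-salient pixels:
block = np.array([[0.0, 255.0],
                  [255.0, 0.0]])
print(saliency_score(block))  # 127.5
```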

Embodiment 3

[0054] The feasibility of the schemes in Embodiments 1 and 2 is verified below through concrete experiments; see the following description for details:



Abstract

The invention provides a no-reference image quality evaluation method based on a saliency strategy and feature fusion, which includes the following steps. The image to be evaluated is preprocessed, converted into a grayscale image, and divided into small non-overlapping image blocks; the blocks are screened, salient blocks are selected as input samples, and the saliency score of each block is calculated at the same time. A convolutional neural network with a two-stream structure extracts features from the input samples: one stream focuses on extracting primary features and is called the primary feature extraction network, while the other stream, equipped with an upsampling network, focuses on extracting advanced features and is called the advanced feature extraction network. The extracted primary and advanced features are merged and then mapped through a fully connected layer with 1024 nodes to obtain the local quality score of each salient image block. Finally, the weight of each salient block is calculated using the saliency weighting model, and the final prediction score is computed from these weights.
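The final aggregation step can be sketched as a saliency-weighted average of the per-patch quality scores. The exact form of the "saliency weighting model" is not given in this excerpt, so normalized saliency scores are assumed as the weights here.

```python
import numpy as np

def aggregate_quality(local_scores, saliency_scores):
    """Combine per-patch quality predictions into one image-level score.

    Assumption: each patch's weight is its saliency score normalized so
    the weights sum to 1 (the patent's weighting model may differ).
    """
    w = np.asarray(saliency_scores, dtype=float)
    w = w / w.sum()                        # weights sum to 1
    return float(np.dot(w, local_scores))  # saliency-weighted average

# A patch that attracts more attention contributes more to the score:
print(aggregate_quality([80.0, 40.0], [3.0, 1.0]))  # 0.75*80 + 0.25*40 = 70.0
```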

Description

Technical field

[0001] The invention relates to the field of image quality evaluation, and in particular to a no-reference evaluation method based on a saliency strategy.

Background technique

[0002] With the rapid development of digital media technology, digital images have become an important medium through which humans obtain information. However, during digital image acquisition and transmission, distortion inevitably occurs due to factors such as transmission conditions and the inherent properties of equipment, which greatly affects subsequent processing and applications. Therefore, developing a high-accuracy image quality evaluation method is of great significance to the digital image field and the communication transmission industry.

[0003] Image Quality Assessment (IQA) refers to the overall evaluation of image quality by subjects from various perspectives such as comfort and sensory experienc...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/00
CPC: G06T7/0002; G06T2207/20021; G06T2207/20081; G06T2207/20084; G06T2207/30168
Inventor: 沈丽丽, 张楚河, 侯春萍
Owner TIANJIN UNIV