
No-reference image quality monitoring method based on channel recombination and feature fusion

An image quality monitoring technology, applied in image enhancement, image analysis, and image data processing, which solves the problems of results inconsistent with human perception and insufficient data samples, and achieves high consistency with subjective judgments while ensuring spatial integrity.

Active Publication Date: 2021-04-30
TIANJIN UNIV

AI Technical Summary

Problems solved by technology

[0006] The present invention provides a no-reference image quality monitoring method based on channel reorganization and feature fusion, which effectively simulates the visual perception process and solves the problems of insufficient data samples and results inconsistent with human perception in the process of image quality monitoring. See the description below for details:


Examples


Embodiment 1

[0033] The embodiment of the present invention provides a no-reference quality monitoring algorithm based on channel reorganization and feature fusion. As shown in Figure 1, the method includes the following steps:

[0034] 101: Preprocessing the image;

[0035] Preprocess the image: convert it into a grayscale image and divide it into non-overlapping small image blocks as input samples, and use a graph-based saliency analysis algorithm (Graph-Based Visual Saliency, GBVS) to calculate the saliency score of each original image block.
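The preprocessing above (grayscale conversion and segmentation into non-overlapping 32×32 blocks, per step 102 below) can be sketched as follows. The BT.601 grayscale weights are an illustrative assumption, and the GBVS saliency computation itself is omitted:

```python
import numpy as np

def to_gray(rgb):
    """Convert an H x W x 3 RGB image to grayscale (ITU-R BT.601 weights, assumed)."""
    return rgb @ np.array([0.299, 0.587, 0.114])

def extract_patches(gray, size=32):
    """Split a grayscale image into non-overlapping size x size blocks.

    Edge regions smaller than `size` are discarded so every patch is whole.
    """
    h, w = gray.shape
    patches = [
        gray[r:r + size, c:c + size]
        for r in range(0, h - size + 1, size)
        for c in range(0, w - size + 1, size)
    ]
    return np.stack(patches)

img = np.random.rand(96, 128, 3)      # stand-in for a real image
patches = extract_patches(to_gray(img))
print(patches.shape)                  # (12, 32, 32): a 3 x 4 grid of blocks
```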

[0036] 102: Perform feature extraction on the processed image;

[0037] Feature extraction is performed on input samples using a convolutional neural network. The input of the network is the 32×32 image block obtained in step 101. Each input image block undergoes feature extraction to obtain a feature map with a size of 4×4×128.
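As a rough shape-level illustration of how a 32×32 patch can be reduced to a 4×4×128 feature map, the sketch below stacks three conv + 2×2 max-pool stages in plain NumPy. The kernel size, channel widths (32, 64, 128), and random weights are assumptions made for shape checking only; the patent text here does not disclose the exact network architecture:

```python
import numpy as np

def conv2d_same(x, w):
    """Naive 'same' 2-D convolution + ReLU: x is (H, W, Cin), w is (k, k, Cin, Cout)."""
    k = w.shape[0]
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (pad, pad), (0, 0)))
    h, wd, _ = x.shape
    out = np.zeros((h, wd, w.shape[-1]))
    for i in range(h):
        for j in range(wd):
            out[i, j] = np.tensordot(xp[i:i + k, j:j + k], w, axes=3)
    return np.maximum(out, 0.0)

def maxpool2(x):
    """2x2 max pooling with stride 2 over (H, W, C)."""
    h, w, c = x.shape
    return x.reshape(h // 2, 2, w // 2, 2, c).max(axis=(1, 3))

rng = np.random.default_rng(0)
x = rng.standard_normal((32, 32, 1))   # one grayscale patch
for c_in, c_out in [(1, 32), (32, 64), (64, 128)]:
    x = maxpool2(conv2d_same(x, rng.standard_normal((3, 3, c_in, c_out)) * 0.1))
print(x.shape)                         # (4, 4, 128) feature map
```

Each stage halves the spatial resolution (32 → 16 → 8 → 4), matching the 4×4×128 output stated in [0037].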

[0038] 103: Reorganize and segment the extracted features in the channel dimension;

[0039] For ...
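The full text of paragraph [0039] is truncated above. As a hedged sketch of one common way to reorganize and segment features in the channel dimension (a channel shuffle followed by a split into groups), the example below operates on a 4×4×128 feature map; the group count of 4 and the shuffle-then-split scheme are assumptions, and the patent's exact recombination may differ:

```python
import numpy as np

def channel_shuffle_split(feat, groups=4):
    """Shuffle channels across groups, then split into `groups` pieces.

    feat: (H, W, C) with C divisible by `groups`. After the shuffle, each
    output piece contains channels drawn from every original group.
    """
    h, w, c = feat.shape
    per = c // groups
    shuffled = (feat.reshape(h, w, groups, per)
                    .transpose(0, 1, 3, 2)
                    .reshape(h, w, c))
    return np.split(shuffled, groups, axis=-1)

feat = np.arange(4 * 4 * 128, dtype=float).reshape(4, 4, 128)
parts = channel_shuffle_split(feat)
print(len(parts), parts[0].shape)      # 4 (4, 4, 32)
```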

Embodiment 2

[0046] The scheme in Embodiment 1 is further described below in conjunction with specific calculation formulas and examples; see the following description for details:

[0047] 201: image preprocessing;

[0048] Compute the saliency matrix for each image. Each pixel is assigned a value ranging from 0 to 255, with a higher value indicating a more salient pixel. The grayscale image and the corresponding saliency matrix are segmented into image patches of size 32×32. Based on the saliency matrix, the saliency score of each image patch is calculated by the following formula:

[0049] q_i = (1 / (M × N)) · Σ_{m=1}^{M} Σ_{n=1}^{N} S(m, n)    (1)

[0050] In formula (1), S(m,n) is the saliency value of a pixel at position (m,n) in the i-th image block. M and N represent the size of the image block. The saliency score reflects the degree to which the image block attracts people's attention. The higher the saliency score of an image patch, the greater its impact on human judgment.
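Formula (1) averages the per-pixel saliency values over the block. A minimal sketch, with a synthetic 32×32 saliency block standing in for a GBVS output:

```python
import numpy as np

def saliency_score(sal_block):
    """Mean saliency over an M x N block, as in formula (1)."""
    return sal_block.mean()

block = np.full((32, 32), 128.0)
block[:16] = 255.0                     # top half maximally salient
print(saliency_score(block))           # 191.5
```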

[0051] 202: Perform feature extraction on ...

Embodiment 3

[0068] The feasibility of the schemes in Embodiments 1 and 2 is verified below in conjunction with concrete experiments; see the following description for details:



Abstract

The invention discloses a no-reference image quality monitoring method based on channel recombination and feature fusion. The method comprises the steps of: preprocessing an image and performing feature extraction on the processed image; recombining and segmenting the extracted features in the channel dimension; fusing the segmented features and predicting their quality under the guidance of a saliency model; and taking the average of the resulting quality scores as the quality prediction score of the image, which is applied in the field of image quality monitoring to control image quality. The method effectively simulates the visual perception process and solves the problems of insufficient data samples and results inconsistent with human perception in image quality monitoring.

Description

Technical field

[0001] The invention relates to the field of image quality monitoring, in particular to a no-reference image quality monitoring method based on channel reorganization and feature fusion.

Background technique

[0002] In the past few decades, digital image technology and multimedia applications have developed rapidly, and more and more related products have entered people's lives. However, digital images are inevitably distorted during collection, compression, transmission, and storage, which degrades people's visual experience. It is therefore very necessary to monitor image quality.

[0003] To monitor the quality of an image, it is first necessary to evaluate it. Image Quality Assessment (IQA) refers to the comprehensive evaluation of image quality by subjects from various perspectives such as comfort and sensory experience. The specific indicators include: image quality, image clarit...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/00; G06T7/11; G06K9/46; G06K9/62; G06N3/04; G06N3/08
CPC: G06T7/0002; G06T7/11; G06N3/08; G06T2207/20081; G06T2207/20084; G06T2207/30168; G06V10/40; G06N3/045; G06F18/253
Inventors: Shen Lili (沈丽丽), Zhang Chuhe (张楚河), Hou Chunping (侯春萍)
Owner TIANJIN UNIV