
Non-reference image quality evaluation method based on convolutional self-coding network

A convolutional self-encoding and quality evaluation technology, applied in the field of no-reference image quality evaluation. It addresses the problems of low sensitivity of extracted features to image quality, failure to consider the integrity of image semantic content, and low accuracy of model evaluation results, and achieves accurate, reliable results.

Active Publication Date: 2019-01-25
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

The disadvantage of this method is that only image blocks are used as the input of the network, without considering the integrity of the image's semantic content, so the accuracy of the trained model's evaluation results is low.
The disadvantage of this method is that it performs score fitting with manually extracted natural scene statistics (NSS) features, which are not very sensitive to image quality, so the evaluation results do not conform well to human subjective perception.




Detailed Description of the Embodiments

[0036] The present invention will be further described below in conjunction with the accompanying drawings and simulation experiments.

[0037] With reference to Figure 1, the specific steps of the present invention are described in further detail.

[0038] Step 1. Build a convolutional autoencoder network.

[0039] Build a 17-layer convolutional autoencoder network and set the parameters of each layer of the convolutional autoencoder network. Its structure is as follows: input layer → 1st convolutional layer → 1st pooling layer → 2nd convolutional layer → 2nd pooling layer → 3rd convolutional layer → 3rd pooling layer → 4th convolutional layer → 5th convolutional layer → 1st deconvolutional layer → 2nd deconvolutional layer → 1st unpooling layer → 3rd deconvolutional layer → 2nd unpooling layer → 4th deconvolutional layer → 3rd unpooling layer → 5th deconvolutional layer;

[0040] Set the parameters of each layer of the convolutional autoencoder network as follows:

[0041] Set the number of c...
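A minimal sketch of this encoder-decoder in PyTorch may help fix the layer ordering of paragraph [0039] in the reader's mind. The ordering follows the patent text; the kernel sizes, channel widths, and activations below are illustrative assumptions only, since the parameter settings of paragraph [0041] are truncated in this excerpt.

import torch
import torch.nn as nn

class ConvAutoencoder(nn.Module):
    """Sketch of the 17-layer convolutional autoencoder of [0039].

    Encoder: 5 conv layers interleaved with 3 max-pooling layers.
    Decoder: 5 deconv layers interleaved with 3 unpooling layers.
    Channel widths (32/64/128/256) are assumed, not from the patent.
    """
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 32, 3, padding=1)
        self.pool1 = nn.MaxPool2d(2, return_indices=True)
        self.conv2 = nn.Conv2d(32, 64, 3, padding=1)
        self.pool2 = nn.MaxPool2d(2, return_indices=True)
        self.conv3 = nn.Conv2d(64, 128, 3, padding=1)
        self.pool3 = nn.MaxPool2d(2, return_indices=True)
        self.conv4 = nn.Conv2d(128, 256, 3, padding=1)
        self.conv5 = nn.Conv2d(256, 256, 3, padding=1)
        self.deconv1 = nn.ConvTranspose2d(256, 256, 3, padding=1)
        self.deconv2 = nn.ConvTranspose2d(256, 128, 3, padding=1)
        self.unpool1 = nn.MaxUnpool2d(2)
        self.deconv3 = nn.ConvTranspose2d(128, 64, 3, padding=1)
        self.unpool2 = nn.MaxUnpool2d(2)
        self.deconv4 = nn.ConvTranspose2d(64, 32, 3, padding=1)
        self.unpool3 = nn.MaxUnpool2d(2)
        self.deconv5 = nn.ConvTranspose2d(32, 3, 3, padding=1)
        self.act = nn.ReLU()

    def forward(self, x):
        # Encoder: input height/width must be divisible by 8.
        x = self.act(self.conv1(x))
        x, i1 = self.pool1(x)
        x = self.act(self.conv2(x))
        x, i2 = self.pool2(x)
        x = self.act(self.conv3(x))
        x, i3 = self.pool3(x)
        x = self.act(self.conv4(x))
        code = self.act(self.conv5(x))      # latent encoding, later used as features
        # Decoder: unpooling reuses the pooling indices at matching scales.
        x = self.act(self.deconv1(code))
        x = self.act(self.deconv2(x))
        x = self.unpool1(x, i3)
        x = self.act(self.deconv3(x))
        x = self.unpool2(x, i2)
        x = self.act(self.deconv4(x))
        x = self.unpool3(x, i1)
        x = torch.sigmoid(self.deconv5(x))  # reconstruction in [0, 1]
        return code, x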



Abstract

The invention discloses a no-reference image quality evaluation method based on a convolutional self-encoding network. The concrete steps of the invention are as follows: constructing a convolutional self-encoding network; constructing a fully connected neural network; generating a pre-training set, a training set, and a test set; training the convolutional self-encoding network and the fully connected neural network; and performing quality evaluation of the distorted images in the test set. The convolutional self-encoding network is used to encode no-reference images and their image blocks respectively; global semantic features and local distortion features are extracted from these encodings by the fully connected neural network; the two kinds of features are fused; and the fused features are mapped to perceptual quality scores by the fully connected neural network. The evaluation results are more consistent with people's subjective perception.
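A minimal sketch of the fusion and scoring stage described above, again in PyTorch: the encoding of the whole image stands in for the global semantic features, a pooled encoding of its blocks for the local distortion features, and a fully connected head maps their fusion to a quality score. The concatenation-style fusion, the averaging over blocks, and all layer widths are assumptions not specified in this excerpt.

import torch
import torch.nn as nn

class QualityRegressor(nn.Module):
    """Sketch of the fully connected fusion/regression network (dims assumed)."""
    def __init__(self, global_dim=256, local_dim=256, hidden=128):
        super().__init__()
        self.global_fc = nn.Sequential(nn.Linear(global_dim, hidden), nn.ReLU())
        self.local_fc = nn.Sequential(nn.Linear(local_dim, hidden), nn.ReLU())
        self.head = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),  # scalar perceptual quality score
        )

    def forward(self, global_code, local_codes):
        g = self.global_fc(global_code)             # global semantic features
        l = self.local_fc(local_codes.mean(dim=1))  # average local distortion features over blocks
        return self.head(torch.cat([g, l], dim=-1))

Here global_code would be the autoencoder's encoding of the whole image (e.g., spatially average-pooled to a 256-dimensional vector), and local_codes a stack of pooled encodings of its blocks with shape (batch, num_blocks, 256).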

Description

Technical Field

[0001] The invention belongs to the technical field of image processing, and further relates to a no-reference image quality evaluation method based on a convolutional self-encoding network in the technical field of digital image processing. The invention can be applied to objectively evaluate the perceptual quality of digital images when no original reference image is available, so as to ensure the validity and accuracy of acquired digital image data.

Background Technique

[0002] During imaging, transmission, and storage, digital images are affected by the optical system, compression for transmission, and other factors, so the image obtained at the terminal suffers from various quality degradations such as compression distortion, Gaussian noise, and blur. The perceived quality of images is an important index for comparing the performance of various digital image processing algorithms and the parameters of digital image imaging ...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06T7/00G06K9/62
CPCG06T7/0002G06T2207/20081G06T2207/20084G06T2207/30168G06F18/253
Inventor 高新波何维佺路文
Owner XIDIAN UNIV