No-reference image quality evaluation method based on convolutional neural network

A convolutional-neural-network and reference-image technology, applied in the field of generative adversarial networks for image quality evaluation. It addresses the low performance of existing no-reference evaluation methods and achieves fast training speed and good experimental results.

Pending Publication Date: 2021-05-07
HANGZHOU DIANZI UNIV

AI Technical Summary

Problems solved by technology

[0005] Most existing no-reference quality evaluation methods rely on known subjective quality scores. Such methods usually require a large number of training sample images and corresponding subjective scores to train th...



Examples


Example Embodiment

[0019] The present invention will be further described below.

[0020] A no-reference image quality evaluation method based on a convolutional neural network; the specific implementation steps are as follows:

[0021] Step 1: Preprocess the distortion map and the natural map to obtain a similarity map.

[0022] 1-1. Calculate the brightness comparison l(x, y):

[0023] For the distortion map x and the natural map y, μ_x denotes the brightness information of the distortion map x and μ_y denotes the brightness information of the natural map y:

[0024] μ_x = (1/N) Σ_{i=1}^{N} x_i,   μ_y = (1/N) Σ_{i=1}^{N} y_i

[0025] where x_i and y_i are the pixel values of the two images and N is the number of pixels.

[0026] The brightness comparison of the distortion map x and the natural map y can then be expressed as:

[0027] l(x, y) = (2·μ_x·μ_y + C1) / (μ_x² + μ_y² + C1)

[0028] where C1 is a small constant set to prevent the denominator from becoming zero.
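Paragraphs [0023]–[0028] follow the usual brightness (luminance) term of an SSIM-style comparison. Below is a minimal illustrative sketch, assuming NumPy arrays of equal size and an arbitrary small value for C1 (the excerpt does not state the constant); the function name is hypothetical.

```python
import numpy as np

C1 = 1e-4  # assumed small constant; the patent excerpt does not give its value

def brightness_comparison(x: np.ndarray, y: np.ndarray) -> float:
    """l(x, y) for a distortion map x and a natural map y (paragraph [0027])."""
    mu_x = x.mean()  # mean pixel value of the distortion map
    mu_y = y.mean()  # mean pixel value of the natural map
    return (2 * mu_x * mu_y + C1) / (mu_x ** 2 + mu_y ** 2 + C1)
```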

[0029] 1-2. Calculate the contrast comparison c(x, y):

[0030] σ_x denotes the contrast information of the distortion map x, and σ_y denotes the contrast information of the natural map y:

[0031] σ_x = ( (1/(N−1)) Σ_{i=1}^{N} (x_i − μ_x)² )^{1/2},   σ_y = ( (1/(N−1)) Σ_{i=1}^{N} (y_i − μ_y)² )^{1/2}

[0032] The contrast comparison of the distortion map x and the natural map ...
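The excerpt is cut off before the contrast-comparison formula. In the standard SSIM formulation, which this preprocessing step appears to follow, the contrast term would be c(x, y) = (2·σ_x·σ_y + C2) / (σ_x² + σ_y² + C2) with a second stabilizing constant C2; that form and the constant are assumptions here, not a quotation of the patent. A short sketch under that assumption:

```python
import numpy as np

C2 = 1e-4  # assumed stabilizing constant, analogous to C1

def contrast_comparison(x: np.ndarray, y: np.ndarray) -> float:
    """c(x, y) using the standard SSIM-style contrast term (assumed form)."""
    sigma_x = x.std(ddof=1)  # sample standard deviation of the distortion map
    sigma_y = y.std(ddof=1)  # sample standard deviation of the natural map
    return (2 * sigma_x * sigma_y + C2) / (sigma_x ** 2 + sigma_y ** 2 + C2)
```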



Abstract

The invention discloses a no-reference image quality evaluation method based on a convolutional neural network. The method comprises the steps of: preprocessing a distortion map and a natural map to obtain a similarity map, and then constructing a neural network from the distortion map and the similarity map; on the basis of the adversarial generation concept of the GAN framework, integrating, in the generation network part, the skip-connection characteristic of the U-Net architecture with the dense-block structure of the DenseNet architecture; adopting a simple classification network in the discrimination network part; and finally training the constructed neural network. The method absorbs and combines the characteristics of the GAN, U-Net and DenseNet networks to construct a more effective neural network, realizes image-to-image conversion and transfer more effectively, obtains better results in image-to-image translation, and yields predicted quality scores that are strongly correlated with the real quality scores and have smaller errors.
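To make the architecture described in the abstract easier to picture, the following is a heavily simplified PyTorch sketch of a generator that combines U-Net-style skip connections with DenseNet-style dense blocks, plus a simple classification discriminator. All class names, layer counts, channel widths, and the assumption of even input spatial dimensions are illustrative choices, not the patent's actual network.

```python
import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    """DenseNet-style block: each layer sees the concatenation of all earlier outputs."""
    def __init__(self, in_ch, growth=16, layers=3):
        super().__init__()
        self.layers = nn.ModuleList()
        ch = in_ch
        for _ in range(layers):
            self.layers.append(nn.Sequential(
                nn.Conv2d(ch, growth, 3, padding=1), nn.ReLU(inplace=True)))
            ch += growth
        self.out_ch = ch  # channels after the final concatenation

    def forward(self, x):
        feats = [x]
        for layer in self.layers:
            feats.append(layer(torch.cat(feats, dim=1)))
        return torch.cat(feats, dim=1)

class Generator(nn.Module):
    """Toy encoder-decoder with one U-Net-style skip connection and dense blocks."""
    def __init__(self):
        super().__init__()
        self.enc = DenseBlock(3)                                   # encoder features
        self.down = nn.Conv2d(self.enc.out_ch, 64, 3, stride=2, padding=1)
        self.mid = DenseBlock(64)
        self.up = nn.ConvTranspose2d(self.mid.out_ch, 64, 2, stride=2)
        # skip connection: concatenate encoder features with upsampled features
        self.out = nn.Conv2d(64 + self.enc.out_ch, 3, 3, padding=1)

    def forward(self, x):
        e = self.enc(x)                          # full-resolution encoder features
        d = self.up(self.mid(self.down(e)))      # downsample, dense block, upsample
        return self.out(torch.cat([d, e], dim=1))

class Discriminator(nn.Module):
    """Simple classification network that scores an image as real or generated."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 1))

    def forward(self, x):
        return self.net(x)
```

In a GAN-style training loop, the generator would map a distortion map towards its similarity map while the discriminator learns to distinguish generated maps from the preprocessed ones, consistent with the image-to-image conversion described in the abstract.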

Description

Technical field

[0001] The invention belongs to the field of image processing, provides an image quality evaluation method, and relates to the application of generative adversarial networks in deep learning to image quality evaluation.

Background technique

[0002] Nowadays, with the rapid development of Internet and communication technology, digital images have become an important carrier of information in people's daily lives. According to statistics, the total number of digital photos produced worldwide since 2011 has reached tens of billions, and this number is still increasing year by year. However, images are susceptible to different kinds of distortion during acquisition, storage, compression, and transmission, resulting in reduced image quality. Therefore, how to evaluate image quality accurately and reliably has become an important research hotspot. Usually, most images are viewed by people, s...


Application Information

IPC (8): G06T7/00, G06K9/46, G06K9/62, G06N3/04, G06N3/08
CPC: G06T7/0002, G06N3/08, G06T2207/30168, G06T2207/20081, G06T2207/20084, G06V10/60, G06V10/462, G06N3/045, G06F18/2411, G06F18/22
Inventor: 颜成钢, 陈子阳, 张继勇, 孙垚棋, 张勇东
Owner: HANGZHOU DIANZI UNIV