
No-reference image quality evaluation method based on convolutional neural network

A convolutional-neural-network technology for no-reference image quality evaluation, applying a generative adversarial network to image quality assessment. It addresses the scarcity and low performance of existing no-reference evaluation methods, and achieves fast training speed and good experimental results.

Pending Publication Date: 2021-05-07
HANGZHOU DIANZI UNIV

AI Technical Summary

Problems solved by technology

[0005] Most existing no-reference quality evaluation methods assume known subjective quality scores: they typically require a large number of training sample images with corresponding subjective scores to train a quality prediction model. By contrast, no-reference evaluation methods that work when subjective quality scores are unknown are still few, and the performance of existing ones is not yet comparable to that of methods trained on known subjective scores.




Embodiment Construction

[0019] The present invention will be further described below.

[0020] A no-reference image quality evaluation method based on convolutional neural network, the specific implementation steps are as follows:

[0021] Step 1: Preprocess the distorted image and the natural image to obtain a similarity map;

[0022] 1-1. Calculate the luminance comparison: l(X,Y)

[0023] For the distorted image X and the natural image Y, use \mu_x to denote the luminance information of the distorted image X, and \mu_y to denote the luminance information of the natural image Y:

[0024] \mu_x = \frac{1}{N}\sum_{i=1}^{N} x_i, \qquad \mu_y = \frac{1}{N}\sum_{i=1}^{N} y_i

[0025] where x_i, y_i are the pixel values of the images and N is the number of pixels.

[0026] Then the luminance comparison between the distorted image X and the natural image Y can be expressed as:

[0027] l(X,Y) = \frac{2\mu_x\mu_y + C_1}{\mu_x^2 + \mu_y^2 + C_1}

[0028] where C_1 is an extremely small constant set to prevent the denominator from being 0.
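As an illustrative sketch only (the patent provides no code), the luminance comparison above can be computed with NumPy; the value of C_1 here is an assumption, since the text only states that it is an extremely small number:

```python
import numpy as np

def luminance_comparison(x, y, c1=1e-6):
    """SSIM-style luminance term l(X, Y) between two images.

    c1 is an assumed small stabilizing constant; the patent only says
    it is 'extremely small' to keep the denominator from being 0.
    """
    mu_x = np.mean(x)  # mean luminance of the distorted image X
    mu_y = np.mean(y)  # mean luminance of the natural image Y
    return (2 * mu_x * mu_y + c1) / (mu_x ** 2 + mu_y ** 2 + c1)

# identical images yield a comparison value of 1
img = np.full((8, 8), 0.5)
print(luminance_comparison(img, img))  # ≈ 1.0
```

For identical inputs the numerator and denominator coincide, so the term equals 1; it decreases toward 0 as the mean luminances diverge.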

[0029] 1-2. Calculate the contrast comparison: c(X,Y)

[0030] Use \sigma_x to denote the contrast information of the distorted image X, and \sigma_y to denote the contrast inf...
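The contrast comparison being defined here is, in standard SSIM form, built from the standard deviations σ_x and σ_y; the following is a minimal sketch under that assumption (the stabilizing constant C_2 is hypothetical, mirroring C_1 above):

```python
import numpy as np

def contrast_comparison(x, y, c2=1e-6):
    """SSIM-style contrast term c(X, Y); c2 is an assumed stabilizer."""
    sigma_x = np.std(x)  # contrast (standard deviation) of image X
    sigma_y = np.std(y)  # contrast (standard deviation) of image Y
    return (2 * sigma_x * sigma_y + c2) / (sigma_x ** 2 + sigma_y ** 2 + c2)

rng = np.random.default_rng(0)
patch = rng.random((8, 8))
print(contrast_comparison(patch, patch))  # ≈ 1.0 for identical images
```

As with the luminance term, the value is 1 for identical images and falls toward 0 as the contrast of the two images diverges.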



Abstract

The invention discloses a no-reference image quality evaluation method based on a convolutional neural network. The method comprises the steps of: preprocessing a distorted image and a natural image to obtain a similarity map, and then constructing a neural network from the distorted image and the similarity map; following the adversarial-generation concept of the GAN framework, integrating in the generator the skip-connection characteristic of the U-net architecture and the dense-block structure of the densenet architecture; adopting a simple classification network as the discriminator; and finally training the constructed neural network. By absorbing and combining the characteristics of the GAN, U-net and densenet networks, the method builds a more effective neural network that realizes image-to-image conversion and migration more effectively, obtains better image-to-image results, and yields simulated quality scores with strong correlation to, and small error against, the real quality scores.
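The generator described in the abstract mixes U-net skip connections with densenet-style dense blocks, but the patent text gives no layer sizes. The following is only a structural sketch in NumPy (toy 1×1 convolutions stand in for real convolution layers; all shapes and the growth rate are assumptions) showing the dense-connectivity pattern, where each layer receives the concatenation of all previous feature maps, followed by a U-net-style skip concatenation:

```python
import numpy as np

def conv1x1(x, w):
    """Toy 1x1 convolution: x is (C_in, H, W), w is (C_out, C_in)."""
    return np.tensordot(w, x, axes=([1], [0]))  # -> (C_out, H, W)

def dense_block(x, weights):
    """Densenet-style block: every layer sees all earlier feature maps."""
    features = [x]
    for w in weights:
        inp = np.concatenate(features, axis=0)       # channel-wise concat
        features.append(np.maximum(conv1x1(inp, w), 0))  # conv + ReLU
    return np.concatenate(features, axis=0)

# assumed toy shapes: 2 input channels, growth rate 2, two layers
rng = np.random.default_rng(0)
x = rng.standard_normal((2, 8, 8))
weights = [rng.standard_normal((2, 2)), rng.standard_normal((2, 4))]
out = dense_block(x, weights)
print(out.shape)  # (6, 8, 8): 2 input channels + 2 + 2 grown channels

# U-net-style skip connection: concatenate encoder features with the
# block output before further decoding
skip = np.concatenate([x, out], axis=0)
print(skip.shape)  # (8, 8, 8)
```

The design point being illustrated: dense connectivity reuses all earlier feature maps inside a block, while the U-net skip carries encoder features directly to the decoder, which is what the abstract credits for the effective image-to-image conversion.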

Description

technical field

[0001] The invention belongs to the field of image processing, proposes an image quality evaluation method, and relates to the application of generative adversarial networks in deep learning to image quality evaluation.

Background technique

[0002] Nowadays, with the rapid development of Internet and communication technology, digital images have become an important medium of information transmission in daily life. According to statistics, since 2011 the total number of digital photos produced worldwide has reached tens of billions, and this number is still increasing year by year. However, images are susceptible to different kinds of distortion during acquisition, storage, compression, and transmission, resulting in reduced image quality. Therefore, how to evaluate image quality accurately and reliably has become an important research hotspot. Usually, most images are viewed by people, s...


Application Information

IPC(8): G06T7/00; G06K9/46; G06K9/62; G06N3/04; G06N3/08
CPC: G06T7/0002; G06N3/08; G06T2207/30168; G06T2207/20081; G06T2207/20084; G06V10/60; G06V10/462; G06N3/045; G06F18/2411; G06F18/22
Inventor: 颜成钢, 陈子阳, 张继勇, 孙垚棋, 张勇东
Owner HANGZHOU DIANZI UNIV