
No-reference image quality map generation method based on adversarial generative network

A quality-map generation and no-reference evaluation technology, applied in the field of image processing, which addresses the scarcity and limited performance of existing no-reference quality evaluation methods.

Active Publication Date: 2020-05-08
HANGZHOU DIANZI UNIV

AI Technical Summary

Problems solved by technology

[0005] Most existing no-reference quality evaluation methods rely on known subjective quality scores: they typically require a large number of training sample images together with their corresponding subjective scores to train a quality prediction model. In contrast, no-reference evaluation methods that do not require known subjective quality scores are still few, and the performance of existing ones is not yet comparable to that of methods trained with known subjective quality scores.




Embodiment Construction

[0086] The present invention will be further described below.

[0087] As shown in Figure 1, in the no-reference image quality map generation method based on an adversarial generative network, the distorted image is first preprocessed to obtain its corresponding similarity maps, SSIM_MAP and FSIM_MAP; a neural network framework based on the U-net network is then trained. Feeding a distorted image into the trained network yields its similarity map, and the corresponding quality score is obtained from the similarity map (a minimal inference sketch is given after this paragraph). The specific implementation steps are as follows:
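The following is a minimal inference sketch of this flow, written in PyTorch under assumptions not stated in the patent: the trained generator is available as a module `netG`, inputs are resized to 256×256 and normalized to [-1, 1], and the quality score is taken as the mean of the predicted similarity map. All names and the pooling step are illustrative.

```python
# Minimal inference sketch (assumed names: `netG` is the trained generator;
# input size, normalization, and mean pooling are illustrative choices).
import torch
from PIL import Image
import torchvision.transforms as T

def predict_quality(image_path: str, netG: torch.nn.Module, device: str = "cpu") -> float:
    """Run the trained generator on a distorted image and pool the
    predicted similarity map into a single quality score."""
    preprocess = T.Compose([
        T.Resize((256, 256)),                  # U-net-friendly input size (assumed)
        T.ToTensor(),                          # scale to [0, 1]
        T.Normalize([0.5] * 3, [0.5] * 3),     # map to [-1, 1]
    ])
    x = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0).to(device)
    netG = netG.to(device).eval()
    with torch.no_grad():
        sim_map = netG(x)                      # predicted SSIM/FSIM-style similarity map
    # Average pooling of the map as the quality score; the patent derives the
    # score from the similarity map, so the exact pooling may differ.
    return sim_map.mean().item()
```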

[0088] Step 1: Preprocessing to obtain the similarity maps

[0089] 1-1. Calculate the similarity map SSIM_MAP

[0090] 1-1-1. Calculate the luminance comparison:

[0091] Given the known distorted image X and the natural image Y, use $\mu_X$ and $\mu_Y$ to represent the luminance information of the two images:

[0092] $\mu_X = \frac{1}{N}\sum_{i=1}^{N} x_i, \qquad \mu_Y = \frac{1}{N}\sum_{i=1}^{N} y_i$

[0093] where $x_i$, $y_i$ are the pixel values of the known distorted image X and the natural image Y, and N is the number of pixels.
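As a concrete illustration of this luminance term, the sketch below computes local means and the standard SSIM luminance comparison $l(X,Y) = (2\mu_X\mu_Y + C_1)/(\mu_X^2 + \mu_Y^2 + C_1)$ over sliding windows. The 8×8 uniform window and the constant $C_1 = (K_1 L)^2$ are conventional SSIM choices assumed here, not values taken from the patent text.

```python
# Sketch of the SSIM luminance comparison computed over local windows.
# Window size and C1 follow common SSIM practice and are assumptions.
import numpy as np
from scipy.ndimage import uniform_filter

def luminance_map(X: np.ndarray, Y: np.ndarray, win: int = 8,
                  L: float = 255.0, K1: float = 0.01) -> np.ndarray:
    """Per-window luminance comparison l(X, Y) between a distorted image X
    and the natural (reference) image Y, both given as grayscale arrays."""
    X = X.astype(np.float64)
    Y = Y.astype(np.float64)
    mu_x = uniform_filter(X, size=win)          # local mean of X
    mu_y = uniform_filter(Y, size=win)          # local mean of Y
    C1 = (K1 * L) ** 2                          # stabilizing constant
    return (2.0 * mu_x * mu_y + C1) / (mu_x ** 2 + mu_y ** 2 + C1)
```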


Abstract

The invention discloses a no-reference image quality map generation method based on an adversarial generative network. In the method, the generation network part adopts a U-net framework with eight down-sampling and eight up-sampling stages; the discrimination network part adopts a classification network; and the loss function combines the cross entropy of the discriminator with an L1-norm loss. The generative network model is trained iteratively; the trained model produces a similarity map for an input distorted image, and the corresponding quality score is obtained from that similarity map. The method performs no-reference quality evaluation: the trained neural network framework evaluates a distorted image without access to the natural (reference) image, and it avoids the weighting problem involved in computing a quality score from a similarity map. Based on an adversarial generative network and U-net, image-to-image conversion and transfer are realized more effectively. Experimental results show good image-to-image performance, and the predicted quality scores correlate strongly with the real quality scores, with small errors.
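A hedged sketch of the loss described above, assuming a pix2pix-style conditional setup in PyTorch: `netG` and `netD` are the generator and discriminator, BCE-with-logits stands in for the discriminator cross entropy, and the L1 weight `lambda_l1 = 100` is an assumed value, not one given in the patent.

```python
# Loss sketch: discriminator cross entropy plus an L1 term between the
# generated similarity map and the ground-truth SSIM/FSIM map.
# `netD(distorted, map)` denotes an assumed conditional discriminator interface.
import torch
import torch.nn as nn

bce = nn.BCEWithLogitsLoss()   # cross entropy on discriminator logits
l1 = nn.L1Loss()
lambda_l1 = 100.0              # assumed L1 weighting (pix2pix-style)

def generator_loss(netD, distorted, fake_map, real_map):
    """Adversarial term (fool the discriminator) + L1 reconstruction term."""
    pred_fake = netD(distorted, fake_map)
    adv = bce(pred_fake, torch.ones_like(pred_fake))
    return adv + lambda_l1 * l1(fake_map, real_map)

def discriminator_loss(netD, distorted, fake_map, real_map):
    """Cross entropy separating real similarity maps from generated ones."""
    pred_real = netD(distorted, real_map)
    pred_fake = netD(distorted, fake_map.detach())
    return 0.5 * (bce(pred_real, torch.ones_like(pred_real)) +
                  bce(pred_fake, torch.zeros_like(pred_fake)))
```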

Description

Technical field

[0001] The invention belongs to the field of image processing and provides a no-reference image quality map generation method based on an adversarial generative network. It relates to image quality evaluation methods and, in particular, to the application of generative adversarial networks, a deep learning technique, to image quality evaluation.

Background technique

[0002] Nowadays, with the rapid development of Internet technology and communication technology, digital images have become an important carrier of information in people's daily life. According to statistics, the total number of digital photos produced worldwide since 2011 has reached tens of billions, and this number is still increasing year by year. However, images are susceptible to different kinds of distortion during acquisition, storage, compression, and transmission, resulting in reduced image quality. Therefore, how to evaluate image quality accurately and reliably has become...


Application Information

IPC(8): G06T11/20, G06T3/40, G06T5/00, G06K9/62, G06N3/04, G06N3/08
CPC: G06T11/206, G06T3/4038, G06N3/08, G06T2207/10004, G06T2207/20081, G06T2207/20084, G06N3/045, G06F18/22, G06F18/2411, G06T5/92, Y02T10/40
Inventor: 颜成钢, 陈子阳, 谷文玉, 孙垚棋, 张继勇, 张勇东
Owner HANGZHOU DIANZI UNIV