
Universal no-reference image quality evaluation method based on multi-task convolutional neural network

A multi-task convolutional neural network technology applied in the field of general no-reference image quality evaluation, achieving accurate image representation and good performance

Inactive Publication Date: 2019-08-30
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

However, it requires a large amount of subjectively evaluated image data for training




Embodiment Construction

[0041] In order to make the object, technical solution and advantages of the present invention clearer, the present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, and do not limit the protection scope of the present invention.

[0042] As shown in Figure 1, the no-reference image quality evaluation method based on a multi-task convolutional neural network provided in this embodiment includes the following steps:

[0043] Step 1: Build training data

[0044] The data set used for training the multi-task neural network proposed by the present invention is the LIVE data set.

[0045] The LIVE dataset contains 29 original images and 792 corresponding degraded images. The degradations fall into five categories: JPEG, JPEG2000, Noise, Blur, and Fast Fading. Each contaminated image has a human-...
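
As a rough illustration of how such a training set could be assembled from the LIVE images, the sketch below cuts each image into fixed-size blocks and attaches the two labels described above (degradation category and degradation degree) to every block. The patch size, the label encoding, and all function names are assumptions for illustration only; the excerpt does not specify them.

```python
import numpy as np

# Hypothetical label set mirroring the five LIVE distortion types named in
# paragraph [0045]; the numeric encoding itself is an assumption.
CATEGORIES = ["JPEG", "JPEG2000", "Noise", "Blur", "FastFading"]

def extract_patches(image, patch_size=32, stride=32):
    """Cut an H x W (x C) image into non-overlapping fixed-size blocks."""
    h, w = image.shape[:2]
    patches = []
    for y in range(0, h - patch_size + 1, stride):
        for x in range(0, w - patch_size + 1, stride):
            patches.append(image[y:y + patch_size, x:x + patch_size])
    return patches

def build_training_set(images, category_labels, degree_labels, patch_size=32):
    """Every patch inherits two labels from its source image:
    the degradation category and the degradation degree."""
    xs, y_cat, y_deg = [], [], []
    for img, cat, deg in zip(images, category_labels, degree_labels):
        for patch in extract_patches(img, patch_size):
            xs.append(patch)
            y_cat.append(CATEGORIES.index(cat))
            y_deg.append(deg)
    return np.stack(xs), np.array(y_cat), np.array(y_deg)

# Toy usage with random arrays standing in for two LIVE images.
imgs = [np.random.rand(96, 96, 3) for _ in range(2)]
X, y_cat, y_deg = build_training_set(imgs, ["JPEG", "Blur"], [0.42, 0.77])
print(X.shape, y_cat.shape, y_deg.shape)  # (18, 32, 32, 3) (18,) (18,)
```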



Abstract

The invention discloses a no-reference image quality evaluation method based on a multi-task convolutional neural network, and belongs to the field of image perception. The method specifically comprises the following steps: step 1, extracting a plurality of fixed-size image blocks from each image of a manually labeled image quality data set, each image block corresponding to two labels, namely the degradation category of the image and the degradation degree of the image, so as to form a training set; step 2, constructing a convolutional neural network model based on dictionary learning; step 3, training the constructed convolutional neural network model with the training set, and fixing the parameters of the model once training is finished; and step 4, during application, inputting the polluted image to be scored into the trained convolutional neural network model to obtain the corresponding image quality score. Compared with traditional methods, the method shows higher consistency with subjective evaluation in the field of no-reference image quality evaluation, and key indexes such as the Spearman rank correlation coefficient and the Pearson linear correlation coefficient are obviously improved.
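
To make the multi-task structure in steps 2-4 more concrete, here is a minimal sketch, assuming a shared convolutional trunk with a classification head for the degradation category and a regression head for the degradation degree. The layer sizes, the loss weighting, and the omission of the dictionary-learning component are all assumptions; the abstract does not give those details.

```python
import torch
import torch.nn as nn

class MultiTaskIQANet(nn.Module):
    """Shared convolutional trunk with two heads: one classifies the
    degradation category, the other regresses the degradation degree.
    Layer sizes are illustrative; the patent's dictionary-learning
    component is not reproduced here."""
    def __init__(self, num_categories=5, patch_size=32):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        feat_dim = 64 * (patch_size // 4) ** 2
        self.category_head = nn.Linear(feat_dim, num_categories)  # classification
        self.degree_head = nn.Linear(feat_dim, 1)                 # regression

    def forward(self, x):
        feats = torch.flatten(self.trunk(x), 1)
        return self.category_head(feats), self.degree_head(feats).squeeze(1)

# Joint training objective: cross-entropy for the category task plus an MSE
# term for the degradation degree (equal weighting is an assumption).
model = MultiTaskIQANet()
patches = torch.randn(8, 3, 32, 32)
cat_logits, degree = model(patches)
loss = nn.CrossEntropyLoss()(cat_logits, torch.randint(0, 5, (8,))) \
     + nn.MSELoss()(degree, torch.rand(8))
loss.backward()
print(cat_logits.shape, degree.shape)  # torch.Size([8, 5]) torch.Size([8])
```

At application time (step 4), patch-level predictions from such a network would be pooled into an image-level quality score; the pooling rule is likewise not specified in this excerpt.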

Description

Technical Field

[0001] The invention belongs to the fields of image perception and artificial intelligence, and in particular relates to a general no-reference image quality evaluation method based on a multi-task convolutional neural network.

Background Technique

[0002] Images play an irreplaceable role in video communication, entertainment, and social networking. During acquisition, processing, transmission, and storage, images are inevitably polluted by various kinds of noise, which degrades people's perceptual experience. Evaluating image quality is therefore valuable for improving user experience; for example, a video provider can adjust the image compression rate based on user feedback on image quality so as to make better use of network transmission bandwidth. Although human beings are the ultimate recipients of images and subjective evaluation is therefore the best option, the process of subjective evaluation is time-consuming and labor-intensive, ...


Application Information

IPC (8): G06T7/00, G06K9/62, G06N3/04, G06N3/08
CPC: G06T7/0002, G06N3/084, G06T2207/30168, G06N3/045, G06F18/214
Inventor: 陈耀武 (Chen Yaowu), 黄余格 (Huang Yuge), 田翔 (Tian Xiang), 蒋荣欣 (Jiang Rongxin)
Owner: ZHEJIANG UNIV