
Saturation based quality evaluation method for colored multi-exposure fusion image

A technology integrating image fusion and quality evaluation, used in image enhancement, image analysis, and image data processing. It addresses problems such as information loss, ignored background information, and neglect of image texture information and human-eye perception characteristics, and achieves good preservation of image information.

Inactive Publication Date: 2017-06-23
BEIJING INSTITUTE OF TECHNOLOGY

AI Technical Summary

Problems solved by technology

[0003] At present, quality evaluation algorithms for fusion images fall into four main categories:

1) Methods based on image information, which mainly use the mutual information between the images before and after fusion to evaluate the quality of the fused image. These methods consider only the overall information of the image.

2) Methods based on image features, which mainly use edge features in the spatial domain and in the wavelet domain to evaluate the quality of the fused image. These methods ignore the texture information of the image.

3) Methods based on structural similarity, inspired by the SSIM algorithm, which compute the structural similarity between the image before fusion and the image after fusion in a way that resembles human vision. However, most existing structural-similarity-based evaluation algorithms operate on grayscale images and ignore the color information of the image. In practice, multi-exposure fusion images are color images, and their color information changes after fusion, so these existing evaluation algorithms are not suitable for color images.

4) Methods based on human-eye perception, which mainly use the salient information of the image to perform the evaluation. These methods often ignore the background information of the image, resulting in information loss.
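The structural-similarity family of methods in category 3) can be illustrated with a minimal sketch. The snippet below computes a simplified, whole-image SSIM between two grayscale images; the function name and the global (non-windowed) formulation are illustrative simplifications, not the patent's method or the exact SSIM reference implementation, which uses local sliding windows.

```python
import numpy as np

def ssim_global(x, y, c1=6.5025, c2=58.5225):
    """Simplified global SSIM between two grayscale images (values in [0, 255]).

    Computes the luminance/contrast/structure comparison over the whole
    image rather than in local windows; this is enough to illustrate the
    idea behind structural-similarity-based quality evaluation.
    """
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    mx, my = x.mean(), y.mean()          # mean luminance of each image
    vx, vy = x.var(), y.var()            # contrast (variance)
    cov = ((x - mx) * (y - my)).mean()   # structure (covariance)
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx**2 + my**2 + c1) * (vx + vy + c2))

# identical images should score (essentially) 1
img = np.random.default_rng(0).integers(0, 256, size=(32, 32))
print(ssim_global(img, img))
```

Because such scores are computed on grayscale intensities, they carry no color information, which is exactly the limitation the present invention targets.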



Embodiment Construction

[0027] The present invention will be described in detail below with reference to the accompanying drawings and examples.

[0028] The invention provides a saturation-based method for evaluating the quality of a colored multi-exposure fusion image; the specific process is shown in Figure 1.

[0029] Step 1. Using the multi-exposure images and their fusion images in the MEF (Multi-Exposure Image Fusion) database as training samples, texture, structure, and color information are extracted from the multi-exposure images and the fusion images based on saturation and wavelet coefficients. From the texture, structure, and color information of the images before and after fusion, the texture similarity, structure similarity, and color similarity are calculated respectively. The texture, structure, and color similarities are used as feature values, combined with the given e...
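The feature extraction described in Step 1 can be sketched as follows. This is a hedged illustration, not the patent's exact formulas: the HSI-style saturation definition, the one-level Haar decomposition standing in for the wavelet coefficients, and the ratio-style `similarity` measure are all assumptions chosen to make the pipeline concrete.

```python
import numpy as np

def saturation(rgb):
    """Per-pixel saturation of an RGB image with values in [0, 1], using the
    HSI-style definition S = 1 - 3 * min(R, G, B) / (R + G + B)."""
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=2)
    return 1.0 - 3.0 * rgb.min(axis=2) / np.where(total == 0, 1.0, total)

def haar_detail_energy(img):
    """One-level 2-D Haar decomposition; returns the total energy of the
    three detail sub-bands (LH, HL, HH) as a crude texture descriptor."""
    a = img.astype(np.float64)
    a = a[: a.shape[0] // 2 * 2, : a.shape[1] // 2 * 2]  # crop to even dims
    tl, tr = a[0::2, 0::2], a[0::2, 1::2]
    bl, br = a[1::2, 0::2], a[1::2, 1::2]
    lh = (tl + tr - bl - br) / 4.0   # horizontal detail
    hl = (tl - tr + bl - br) / 4.0   # vertical detail
    hh = (tl - tr - bl + br) / 4.0   # diagonal detail
    return float((lh**2 + hl**2 + hh**2).sum())

def similarity(fa, fb, eps=1e-12):
    """SSIM-style ratio similarity of two non-negative scalar features,
    equal to 1 when the features match and approaching 0 as they diverge."""
    return (2 * fa * fb + eps) / (fa**2 + fb**2 + eps)

# toy source / fused pair (random data stands in for MEF database images)
src = np.random.default_rng(1).random((16, 16, 3))
fused = np.clip(src * 0.9 + 0.05, 0.0, 1.0)

color_sim = similarity(saturation(src).mean(), saturation(fused).mean())
texture_sim = similarity(haar_detail_energy(src.mean(axis=2)),
                         haar_detail_energy(fused.mean(axis=2)))
print(color_sim, texture_sim)
```

In the patent's pipeline, similarity values like these would form the feature vector fed to the learner together with the MOS labels.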



Abstract

The invention discloses a saturation-based quality evaluation method for a colored multi-exposure fusion image, which can be used to solve the problem that evaluation of colored multi-exposure fusion images is inaccurate. A multi-exposure image in the MEF database and its fusion image serve as a training sample, and texture, structure, and color information is extracted from the training sample based on saturation and wavelet coefficients; the texture, structure, and color similarities are then calculated respectively. The obtained similarities, as feature values, are input together with a given MOS value into an ELM (Extreme Learning Machine) learner for training. Finally, the texture, structure, and color similarities of a multi-exposure image and its corresponding fused image are input into the trained ELM learner to obtain the evaluation result.
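The ELM learner in the abstract admits a compact sketch: an Extreme Learning Machine fixes a random hidden layer and solves only the output weights in closed form via the Moore-Penrose pseudo-inverse. The code below is a generic ELM regressor on synthetic data; the three-feature input and the linear "MOS" target are invented for illustration and are not the patent's training data.

```python
import numpy as np

def train_elm(X, y, hidden=50, seed=0):
    """Train a basic ELM regressor: random fixed hidden weights, tanh
    activations, output weights solved by least squares (pseudo-inverse)."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], hidden))  # random input weights
    b = rng.standard_normal(hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                         # hidden-layer activations
    beta = np.linalg.pinv(H) @ y                   # closed-form output weights
    return W, b, beta

def predict_elm(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# toy example: 3 similarity features -> a synthetic quality score
rng = np.random.default_rng(42)
X = rng.random((200, 3))             # texture / structure / color similarities
y = X @ np.array([2.0, 1.5, 1.0])    # synthetic stand-in for MOS labels
model = train_elm(X, y)
print(np.abs(predict_elm(model, X) - y).max())  # training error
```

The closed-form solve is what makes ELM training fast compared with iterative backpropagation, which is presumably why it suits a feature-based quality metric like this one.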

Description

Technical field

[0001] The invention belongs to the field of image quality evaluation methods, and in particular relates to a saturation-based quality evaluation method for colored multi-exposure fusion images.

Background technique

[0002] The brightness range that current ordinary display devices can show is far smaller than that of real scenes and that which the human eye can perceive, so a real scene as observed by the human eye cannot be displayed on an ordinary device. High Dynamic Range (HDR) image display technology can present as much of the light and dark information of a real scene on common devices as possible, conforming as closely as possible to the real perception of the human eye. With the rapid development of the high-definition digital industry, HDR display technology has become one of the current research hotspots in the digital field. Multi-exposure image fusion technology is a simple HDR display technology, which can directly ...


Application Information

IPC(8): G06T7/00; G06N99/00
CPC: G06T7/0002; G06N20/00; G06T2207/20064; G06T2207/30168
Inventors: 赵保军, 李震, 王水根, 韩煜祺, 邓宸伟
Owner: BEIJING INSTITUTE OF TECHNOLOGY