
Video image fusion performance evaluation method based on structure similarity and human vision

A technology based on structural similarity for video images, applied in the field of image processing, which addresses problems such as large discrepancies from subjective evaluation results, lack of consideration for human visual perception characteristics, and susceptibility to noise.

Publication Date: 2011-11-02 (status: Inactive)
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

However, this metric is based on gradient information and is therefore susceptible to noise.
Moreover, the metric was designed without considering the visual perception characteristics of the human eye, even though the human eye is usually the final receiver of fused video images; as a result, its evaluation results differ considerably from subjective evaluation results.




Embodiment Construction

[0038] The present invention will be described in further detail below with reference to the accompanying drawings.

[0039] Referring to Figure 1, and taking two input videos Va and Vb and the fused video Vf as an example, the implementation steps are as follows:

[0040] In the first step, each frame of the input videos and of the fused video is taken as the processing object, and a single-frame spatial performance evaluation index is calculated.

[0041] The following takes the t-th frame of each video as an example:

[0042] (1.1) For the t-th frame of the fused video Vf and of the input videos Va and Vb, define a local window w_{m,n,t} centered at spatial position (m, n); a window of size 7×7 is used in the present invention;

[0043] (1.2) Within the current window w_{m,n,t}, calculate the local structural similarity values between the fused video image Vf and each input video image Va and Vb, denoted SSIM(Va, Vf | w_{m,n,t}) and SSIM(Vb, Vf | w_{m,n,t}):
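A minimal sketch of this local computation is given below, assuming the standard structural similarity definition of Wang et al. (the exact formula and constants used by the patent are not reproduced in this extract). The constants C1 and C2 and the helper names window and local_ssim are assumptions; only the 7×7 window size comes from step (1.1).

    # A minimal sketch of steps (1.1)-(1.2), assuming the standard SSIM definition;
    # the constants C1 and C2 and all helper names are assumptions, only the 7x7
    # window size comes from the text.
    import numpy as np

    def window(frame, m, n, size=7):
        """Extract the size-by-size local window w_{m,n,t} centered at (m, n) from one frame."""
        r = size // 2
        return frame[m - r:m + r + 1, n - r:n + r + 1]

    def local_ssim(x, y, C1=(0.01 * 255) ** 2, C2=(0.03 * 255) ** 2):
        """Standard structural similarity between two equally sized windows x and y."""
        x = x.astype(np.float64)
        y = y.astype(np.float64)
        mu_x, mu_y = x.mean(), y.mean()
        var_x, var_y = x.var(), y.var()
        cov_xy = ((x - mu_x) * (y - mu_y)).mean()
        return ((2 * mu_x * mu_y + C1) * (2 * cov_xy + C2)) / (
            (mu_x ** 2 + mu_y ** 2 + C1) * (var_x + var_y + C2))

    # Hypothetical usage for the t-th frames Va_t, Vb_t, Vf_t (2-D arrays):
    # ssim_af = local_ssim(window(Va_t, m, n), window(Vf_t, m, n))  # SSIM(Va, Vf | w_{m,n,t})
    # ssim_bf = local_ssim(window(Vb_t, m, n), window(Vf_t, m, n))  # SSIM(Vb, Vf | w_{m,n,t})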

[0...



Abstract

The invention discloses a video image fusion performance evaluation method based on structural similarity and human vision, which is mainly intended to solve the problem that evaluation results obtained by the prior art do not accord with subjective evaluation results. The method is implemented through the following steps: constructing a spatial performance evaluation index from the structural similarity between each frame of the fused video and the corresponding frame of each input video; constructing a temporal performance evaluation index from the structural similarity between each inter-frame difference image of the fused video and the corresponding difference image of each input video; combining the spatial and temporal performance evaluation indices to obtain a spatio-temporal performance evaluation index; and setting the parameters required by the index from the spatial contrast and temporal motion information of the video images on the basis of human visual perception characteristics. The method produces accurate evaluation results that accord with subjective human visual evaluation and can be used to evaluate the performance of video image fusion algorithms.
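A rough sketch of this overall structure is shown below, assuming scikit-image's structural_similarity for the frame-wise comparison. The simple per-frame averaging and the fixed weight lam are assumptions, since the patent sets these parameters from the spatial contrast and temporal motion information of the video images rather than from a fixed constant.

    # A hedged sketch of the pipeline described in the abstract; the averaging scheme
    # and the weight `lam` are assumptions, not the patent's actual parameter-setting
    # rule, which is derived from spatial contrast and temporal motion information.
    import numpy as np
    from skimage.metrics import structural_similarity

    def frame_ssim(x, y):
        """Whole-frame SSIM; data_range is passed explicitly because frames may be float."""
        x = np.asarray(x, dtype=np.float64)
        y = np.asarray(y, dtype=np.float64)
        rng = max(x.max() - x.min(), y.max() - y.min(), np.finfo(float).eps)
        return structural_similarity(x, y, data_range=rng)

    def spatial_index(Va, Vb, Vf):
        """Mean similarity between each fused frame and the corresponding input frames."""
        return float(np.mean([(frame_ssim(a, f) + frame_ssim(b, f)) / 2
                              for a, b, f in zip(Va, Vb, Vf)]))

    def temporal_index(Va, Vb, Vf):
        """The same measure applied to inter-frame difference images."""
        dVa, dVb, dVf = (np.diff(np.asarray(V, dtype=np.float64), axis=0)
                         for V in (Va, Vb, Vf))
        return spatial_index(dVa, dVb, dVf)

    def spatiotemporal_index(Va, Vb, Vf, lam=0.5):
        """Weighted combination of the spatial and temporal indices (lam is assumed)."""
        return lam * spatial_index(Va, Vb, Vf) + (1 - lam) * temporal_index(Va, Vb, Vf)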

Description

Technical Field

[0001] The invention relates to the field of image processing, and in particular to a video fusion performance evaluation method for comprehensively evaluating the performance of various fusion algorithms from the two aspects of spatial information extraction and temporal consistency and stability.

Technical Background

[0002] Image fusion technology has been widely used in machine vision, digital cameras, target recognition and other fields. However, most current image fusion metrics are designed for static image fusion, and there are few studies on multi-sensor video image fusion. In practical applications such as security surveillance and target detection and recognition in battlefield environments, video images from multiple sensors often need to be fused. Video image fusion must not only meet the basic requirements of general image fusion in terms of spatial performance, that is, after fusion each frame of the video image should re...


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04N17/00
Inventors: 张强 (Zhang Qiang), 陈闵利 (Chen Minli), 王龙 (Wang Long)
Owner: XIDIAN UNIV