
Image background clutter measurement method based on intrinsic derivation mechanism

A background clutter measurement technology, applied in the field of image processing, which can solve the problems of inaccurate representation and description of complex clutter images, incomparable clutter quantization results, and unstable calculation results.

Active Publication Date: 2020-05-22
中国人民解放军63891部队

AI Technical Summary

Problems solved by technology

Schmieder et al. proposed the statistical variance model (SV), which is simple to calculate and easy to implement, but its representation and description of complex clutter images is not accurate enough. Biberman et al. proposed the edge probability clutter measure (POE), based on the sensitivity of the human eye to image edge features, which measures clutter strength by the number of edge points in the image; however, this method is excessively dependent on the image edge-detection threshold, for which there is no fixed selection standard, so the clutter quantification results obtained by different users are not comparable. Chang et al., noting that the human eye is highly adaptive to image structure features, measured clutter by calculating the similarity between the background and the target in terms of brightness, contrast and structure, and proposed the target structure similarity clutter measure (TSSIM); this method depends heavily on the stability constant in the denominator, which makes its clutter calculation results unstable. Xu Dejiang et al., building on the high adaptability of the human eye to image structure and on the image structural similarity measure SSIM widely used in image quality evaluation, calculated the structural similarity between the clutter and the target, weighted the structural similarity using the principle of visual salience, and proposed the image structure difference clutter measure (VSD). Xiao Chuanmin et al., considering that the human eye is sensitive to image structure, used the histogram of oriented gradients to represent the target structure feature and the Bhattacharyya coefficient to measure the difference between the background clutter and the target in the oriented-gradient-histogram space, and proposed the background clutter measurement method ESSIM, which introduces the gradient distribution feature; this method emphasizes the gradient structure characteristics of the target while weakening its brightness information.
[0005] To sum up, the background clutter measurement methods based on human visual characteristics in the prior art do not make full use of human visual perception models, and they suffer from problems such as inaccurate clutter representation and description, unstable calculation results, strong dependence on specific image types, and the mutual independence of the individual methods.
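For reference, the SV measure cited above reduces to the root mean square of per-block grey-level variances. A minimal NumPy sketch is given below; the function name, the tiling convention and the "roughly twice the target size" block dimensions are conventions of the SV literature rather than details taken from this patent.

```python
import numpy as np

def sv_clutter(image, block_h, block_w):
    """Schmieder-Weathersby statistical variance (SV) clutter metric:
    the root mean square of per-block grey-level variances, with the
    image tiled into non-overlapping blocks (conventionally about twice
    the target size in each dimension)."""
    img = np.asarray(image, dtype=np.float64)
    H, W = img.shape
    variances = [img[r:r + block_h, c:c + block_w].var()
                 for r in range(0, H - block_h + 1, block_h)
                 for c in range(0, W - block_w + 1, block_w)]
    return float(np.sqrt(np.mean(variances)))
```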



Examples


Detailed Description of the Embodiments

[0054] The technical solution of the present invention will be described in further detail below in conjunction with the accompanying drawings and embodiments.

[0055] As shown in Figure 1, the image background clutter measurement method based on the intrinsic derivation mechanism includes the following specific steps:

[0056] S1. Input the original image and extract the target area: extract the target area T from the input image and let its size be m×n;

[0057] Divide the background image into blocks: in the input original image, partition the background into N background units C_i, each equal in size to the target area in both the horizontal and vertical directions, with no overlap between the background units in either direction;

[0058] Use an autoregressive model to simulate the intrinsic derivation mechanism of the brain, decomposing the target area T and each background unit C_i ...
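A minimal sketch of how the blocking step [0057] and the decomposition step [0058] could be realized is shown below, assuming that a least-squares 2-D autoregressive predictor over a 3×3 neighbourhood stands in for the intrinsic derivation mechanism; the function names, the neighbourhood radius and the border handling are illustrative assumptions, not the patent's own implementation.

```python
import numpy as np

def split_background(image, m, n):
    """Tile the input image into non-overlapping m x n background units C_i
    (border pixels that do not fill a complete unit are dropped)."""
    img = np.asarray(image, dtype=np.float64)
    H, W = img.shape
    return [img[r:r + m, c:c + n]
            for r in range(0, H - m + 1, m)
            for c in range(0, W - n + 1, n)]

def ar_decompose(patch, radius=1):
    """Decompose a patch into an 'ordered' part (linear autoregressive
    prediction of each pixel from its neighbourhood) and a 'disordered'
    part (the prediction residual)."""
    p = np.asarray(patch, dtype=np.float64)
    H, W = p.shape
    offsets = [(dy, dx) for dy in range(-radius, radius + 1)
               for dx in range(-radius, radius + 1) if (dy, dx) != (0, 0)]
    samples, targets = [], []
    for y in range(radius, H - radius):
        for x in range(radius, W - radius):
            samples.append([p[y + dy, x + dx] for dy, dx in offsets])
            targets.append(p[y, x])
    # Fit one set of AR coefficients to the patch by least squares.
    coeffs, *_ = np.linalg.lstsq(np.asarray(samples),
                                 np.asarray(targets), rcond=None)

    ordered = p.copy()                      # border pixels keep their original values
    for y in range(radius, H - radius):
        for x in range(radius, W - radius):
            neigh = np.array([p[y + dy, x + dx] for dy, dx in offsets])
            ordered[y, x] = neigh @ coeffs  # predicted ("ordered") component
    disordered = p - ordered                # residual ("disordered") component
    return ordered, disordered
```

With these helpers, each background unit C_i returned by split_background would be decomposed in the same way as the target area, e.g. ordered_T, disordered_T = ar_decompose(T).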



Abstract

The invention discloses an image background clutter measurement method based on an intrinsic derivation mechanism. The method comprises the following steps: employing an autoregressive model to simulate the intrinsic derivation mechanism of the brain, and decomposing the target image into an ordered part and a disordered part; for the ordered part, measuring the difference between the ordered part of the target image and that of the background clutter image using structural similarity; for the disordered part, analyzing the structural uncertainty of the disordered part with a brightness-adaptive local binary pattern (ILBP), quantifying that uncertainty by the ILBP entropy, and then measuring the difference between the disordered parts of the target image and the background clutter image in the ILBP entropy space; and combining the ordered-part and disordered-part clutter measures with a standard-variance-weighted combination strategy to obtain the final image background clutter measurement value. The target acquisition performance predicted by the method is highly consistent with actual subjective target acquisition performance data, and the method is superior to existing clutter measurement methods in terms of correlation and root-mean-square error.
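To make the abstract concrete, the sketch below shows one way the disordered-part analysis and the final combination might be computed. The exact brightness-adaptive ILBP code and the "standard-variance-weighted" strategy are not specified in this extract, so the 9-bit mean-thresholded code and the standard-deviation weighting used here are assumptions rather than the patented formulas.

```python
import numpy as np

def ilbp_entropy(patch):
    """Entropy of a 9-bit ILBP-like code: every 3x3 window (centre pixel
    included) is thresholded against its own mean, and the Shannon entropy
    of the resulting code histogram summarizes structural uncertainty."""
    p = np.asarray(patch, dtype=np.float64)
    H, W = p.shape
    codes = []
    for y in range(1, H - 1):
        for x in range(1, W - 1):
            win = p[y - 1:y + 2, x - 1:x + 2].ravel()
            bits = (win >= win.mean()).astype(int)         # brightness-adaptive threshold
            codes.append(int("".join(map(str, bits)), 2))  # 9-bit pattern code
    prob = np.bincount(codes, minlength=512).astype(np.float64)
    prob /= prob.sum()
    prob = prob[prob > 0]
    return float(-(prob * np.log2(prob)).sum())

def combine_clutter(ordered_diffs, disordered_diffs):
    """Weight the mean ordered-part and disordered-part differences by the
    standard deviation of their per-unit scores before summing -- one
    possible reading of a standard-variance-weighted combination."""
    o = np.asarray(ordered_diffs, dtype=np.float64)
    d = np.asarray(disordered_diffs, dtype=np.float64)
    w_o, w_d = o.std(), d.std()
    total = w_o + w_d
    if total == 0:
        total = 1.0
    return float((w_o * o.mean() + w_d * d.mean()) / total)
```

Here ordered_diffs and disordered_diffs would hold, for every background unit C_i, the SSIM-based difference of the ordered parts and the ILBP-entropy difference of the disordered parts, respectively.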

Description

Technical field

[0001] The present invention relates to the technical field of image processing, and in particular to a method for measuring the background clutter of visible-light images based on an intrinsic derivation mechanism. It is mainly used for evaluating the impact of background clutter on the target acquisition performance of photoelectric imaging systems, for the design and evaluation of camouflage patterns, and for quantitatively evaluating the impact of clutter on target detection and recognition.

Background technique

[0002] With the improvement of detector technology and advances in manufacturing processes, the sensitivity and resolution of photoelectric imaging systems have been greatly improved and have reached or approached the background limit, which makes the background an important factor limiting the target acquisition performance of photoelectric imaging systems. Therefore,...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/00; G06F17/18
CPC: G06T7/0002; G06F17/18; Y02A90/10
Inventor: 苗锡奎, 张恒伟, 王非, 康大勇, 康华超, 李武周, 侯兆飞, 刘小虎, 陈育斌
Owner: 中国人民解放军63891部队