Background clutter quantizing method based on contrast ratio function

A background clutter quantization technique based on the contrast sensitivity function, applied in the field of image processing. It addresses problems such as clutter measures that violate the physical nature of background clutter, cannot reasonably reflect the background's influence, and make it difficult to predict and evaluate the field performance of photoelectric imaging systems, and it achieves the effect of more accurate prediction.

Status: Inactive; Publication Date: 2013-04-17
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

However, the statistical variance metric SV is based purely on statistical processing of the photoelectric image and does not take the visual perception characteristics of the human eye into account, while the probability-of-edge metric POE uses only background information as its reference, which contradicts the physical nature of background clutter as a quantity defined relative to the target.
As a result, the clutter measures established by SV and POE cannot reasonably reflect the influence of the background on the target acquisition process, and it is difficult to use them to accurately predict and evaluate the field performance of a photoelectric imaging system.
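
For context, the following is a minimal sketch of the conventional SV metric criticized above, in the commonly cited Schmieder-Weathersby form (tile the background into square cells, compute the gray-level variance per cell, and pool by root mean square); the function name, cell size, and pooling details are illustrative assumptions, not taken from this patent.

```python
import numpy as np

def statistical_variance_clutter(background: np.ndarray, cell_size: int) -> float:
    """Sketch of the classic SV clutter metric (Schmieder-Weathersby style).

    The background image is tiled into square cells (conventionally about
    twice the target size), the gray-level variance is computed in each
    cell, and the metric is the root mean square over all cells.
    """
    h, w = background.shape
    variances = []
    for r in range(0, h - cell_size + 1, cell_size):
        for c in range(0, w - cell_size + 1, cell_size):
            cell = background[r:r + cell_size, c:c + cell_size]
            variances.append(cell.var())
    return float(np.sqrt(np.mean(variances)))
```

Note that the target never enters this computation, which is exactly the relativity problem the invention sets out to address.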



Embodiment Construction

[0025] Referring to figure 1, the background clutter quantification method based on the contrast sensitivity function according to the present invention is realized through the following steps:

[0026] Step 1. Normalize and nonlinearize the target brightness image and the background brightness image to obtain the target brightness matrix x and the background brightness matrix y, respectively. The specific steps are as follows:

[0027] (1a) Divide the brightness value L_t at each position of the target brightness image by its mean L̄_t to normalize the target brightness image, then take the cube root of the result to apply the nonlinearization, which gives the target brightness matrix x:

[0028] x = (L_t / L̄_t)^(1/3);

[0029] (1b) Use the brightness ...
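
As a minimal sketch of Step 1, sub-steps (1a) and (1b) both amount to dividing a brightness image by its mean and taking the cube root; the helper below is illustrative only (array and function names are not from the patent).

```python
import numpy as np

def brightness_matrix(brightness: np.ndarray) -> np.ndarray:
    """Step 1 sketch: normalize a brightness image by its mean value and
    apply the cube-root nonlinearity, i.e. x = (L / L_mean) ** (1/3)."""
    return np.cbrt(brightness / brightness.mean())

# Illustrative usage for the two images of Step 1:
# x = brightness_matrix(target_brightness)       # target brightness matrix x
# y = brightness_matrix(background_brightness)   # background brightness matrix y
```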



Abstract

The invention discloses a background clutter quantization method based on the contrast sensitivity function. It mainly solves the problems that conventional clutter measures neither accord with the relative, target-referenced nature of background clutter nor fully reflect the characteristics of human vision. The method comprises the steps of: normalizing and nonlinearizing the background and target brightness images to obtain the background and target brightness matrices; segmenting the background brightness matrix into a plurality of small units of the same size; carrying out Fourier transformation on the target brightness matrix and on the small background brightness units, and filtering the resulting frequency-domain target and background matrices with the contrast sensitivity function to obtain the target and background perception matrices; and using the mean of the sums of squared differences between the target perception matrix and all the background perception matrices as the overall background clutter measure. The method improves the consistency between the predicted target detection probability and the subjectively measured detection probability, and thereby improves the accuracy of predicting and evaluating the target acquisition performance of a photoelectric imaging system.
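
For orientation only, the sketch below strings the abstract's steps together in Python; the particular CSF formula (a Mannos-Sakrison-style approximation), the cycles-per-degree scaling, and the choice of target-sized background units are assumptions made for illustration rather than the patent's disclosed parameters.

```python
import numpy as np

def csf(f_cpd: np.ndarray) -> np.ndarray:
    """Illustrative contrast sensitivity function (Mannos-Sakrison-style
    approximation); the patent's exact CSF form is not reproduced here."""
    f = np.maximum(f_cpd, 1e-6)
    return 2.6 * (0.0192 + 0.114 * f) * np.exp(-(0.114 * f) ** 1.1)

def perception_matrix(block: np.ndarray, cycles_per_degree: float) -> np.ndarray:
    """Fourier-transform the block, weight each frequency by the CSF,
    and return the filtered (perceived) block."""
    h, w = block.shape
    fy = np.fft.fftfreq(h)[:, None] * cycles_per_degree
    fx = np.fft.fftfreq(w)[None, :] * cycles_per_degree
    radial = np.sqrt(fx ** 2 + fy ** 2)  # radial spatial frequency
    return np.real(np.fft.ifft2(np.fft.fft2(block) * csf(radial)))

def background_clutter(target: np.ndarray, background: np.ndarray,
                       cycles_per_degree: float = 30.0) -> float:
    """Overall pipeline from the abstract: CSF-filter the target matrix and
    each target-sized background unit, then average the sums of squared
    differences between the target and background perception matrices."""
    th, tw = target.shape
    t_perc = perception_matrix(target, cycles_per_degree)
    diffs = []
    for r in range(0, background.shape[0] - th + 1, th):
        for c in range(0, background.shape[1] - tw + 1, tw):
            unit = background[r:r + th, c:c + tw]
            b_perc = perception_matrix(unit, cycles_per_degree)
            diffs.append(np.sum((t_perc - b_perc) ** 2))
    return float(np.mean(diffs))
```

The FFT, multiply-by-CSF, inverse-FFT pattern is the standard way to apply a radially symmetric frequency-domain weighting, which is all the perception-matrix step requires.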

Description

Technical field
[0001] The invention belongs to the technical field of image processing, and in particular relates to a background clutter quantification method based on the contrast sensitivity function, which conforms to the spatial-frequency characteristics of the human eye. It can be used not only for predicting and evaluating the target acquisition performance of photoelectric imaging systems, but also to guide military camouflage schemes and the design and improvement of image processing algorithms.

Background technique
[0002] The characterization of target and background characteristics is an important research topic in the field of photoelectric imaging systems, and it is the basis for establishing accurate and reasonable target acquisition performance models of imaging systems and for accurately predicting their field test performance. Especially with the introduction of new materials and the progress of manufacturing technology, while greatly improving the res...

Claims


Application Information

Patent Type & Authority Patents(China)
IPC IPC(8): G06K9/62G06T7/00
Inventor 杨翠毛维李倩张建奇
Owner XIDIAN UNIV