Sparse representation-based background clutter quantification method

A background clutter quantification technology applied in the field of image processing. It addresses the problems that existing clutter scales do not consider the characteristics of human visual perception, violate the physical essence of background clutter, and cannot reasonably reflect the scale of clutter, thereby achieving a predicted target detection probability consistent with subjective results and accurate prediction.

Publication Date: 2012-12-05 (Inactive)
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

However, the statistical variance scale SV is based purely on statistical processing of the photoelectric image and does not consider the visual perception characteristics of the human eye, while the probability-of-edge scale POE uses only background information as a reference, which violates the physical essence of background clutter being relative to the target. As a result, neither clutter scale can reasonably reflect the influence of the background on the target acquisition process, and both make it difficult to accurately predict and evaluate the field performance of a photoelectric imaging system.




Embodiment Construction

[0028] Referring to figure 1, the implementation steps of the sparse representation-based background clutter quantization method of the present invention are as follows:

[0029] Step 1: Take the pixel values of the target image as elements and stack them column by column, in order of increasing column number, to form the target vector

[0030] x = {t_{1,1}, t_{2,1}, t_{3,1}, ..., t_{C,1}, t_{1,2}, ..., t_{C,D}}^T

[0031] where t_{i,j} denotes the target pixel value at position (i, j), C and D denote the numbers of rows and columns of the target image, respectively, and T denotes the vector transpose.
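As a concrete illustration of this column-wise stacking, here is a minimal Python sketch (the array names are illustrative; NumPy's column-major "F" order matches the ordering above):

```python
# Step 1 sketch: vectorize a C-by-D target image column by column,
# matching x = {t_{1,1}, t_{2,1}, ..., t_{C,1}, t_{1,2}, ..., t_{C,D}}^T.
import numpy as np

target = np.arange(6).reshape(2, 3)   # toy 2x3 target image (C=2, D=3)
x = target.flatten(order="F")         # column-major ("Fortran") order
# x holds [t11, t21, t12, t22, t13, t23] for this toy image
```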

[0032] Step 2: Divide the background image to be quantized into N small units of equal size, where the size of each unit in both the horizontal and the vertical direction is twice the corresponding size of the target.

[0033] N is determined by the size A×B of the background image to be quantized and the size 2C×2D of each small unit, namely N = (A×B)/(4M) with M = C×D, where A and B denote the numbers of rows and columns of the background image to be quantized, respectively.
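A sketch of this partitioning in the same spirit, assuming the units tile the image without overlap and that edge pixels which do not fill a whole unit are dropped (the excerpt does not settle either point; partition_background and Y are illustrative names):

```python
# Step 2 sketch: cut the A-by-B background into 2C-by-2D units and stack
# each vectorized unit as one column of a background matrix Y.
import numpy as np

def partition_background(bg, C, D):
    A, B = bg.shape
    h, w = 2 * C, 2 * D                      # each unit is twice the target size
    cols = []
    for i in range(0, A - h + 1, h):         # non-overlapping tiling
        for j in range(0, B - w + 1, w):
            cols.append(bg[i:i + h, j:j + w].flatten(order="F"))
    return np.stack(cols, axis=1)            # shape (4*C*D, N)

Y = partition_background(np.random.rand(128, 128), C=8, D=8)
print(Y.shape)                               # (256, 64): N = (128*128)/(4*8*8)
```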



Abstract

The invention discloses a sparse representation-based background clutter quantification method, which mainly solves the problems that conventional clutter scales do not accord well with the physical essence of background clutter being relative to the target and cannot fully reflect the properties of human vision. The method comprises the following implementation steps: partitioning the gray background image to be quantified into a number of small equal-size units and combining these units into a background matrix; extracting the main characteristics of the target vector and of the background matrix to obtain a target characteristic vector and a background characteristic matrix; normalizing the target characteristic vector and the background characteristic matrix; and calculating the sparsest representation of the normalized target characteristic vector in the normalized background characteristic matrix, the sum of the absolute values of the sparsest representation being taken as the background clutter scale of the entire image. The method makes full use of two major characteristics of human visual search, strengthens the consistency between the predicted target detection probability and the subjectively measured target detection probability, and can be used for predicting and evaluating the target acquisition performance of a photoelectric imaging system.
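Reading the abstract's steps literally, the pipeline can be sketched end to end. The sketch below makes several assumptions the excerpt does not fix: the target vector x and the unit columns of Y have already been brought to a common length d, the "main characteristics" are taken to be PCA projections, and the sparsest representation is approximated with L1-penalized regression (scikit-learn's Lasso) rather than an exact basis-pursuit solver; clutter_scale is an illustrative name, not the patent's.

```python
# Hedged end-to-end sketch of the abstract's pipeline, not the patent's exact
# algorithm: PCA features -> L2 normalization -> L1 sparse coding, with the
# clutter scale taken as the sum of absolute coefficients.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Lasso

def clutter_scale(x, Y, n_feat=32, alpha=1e-3):
    """x: length-d target vector; Y: d-by-N background matrix."""
    # 1) principal-feature extraction on the background units (columns of Y),
    #    projecting both the units and the target into the same feature space
    k = min(n_feat, *Y.shape)
    pca = PCA(n_components=k)
    Yf = pca.fit_transform(Y.T).T            # (k, N) background feature matrix
    xf = pca.transform(x[None, :]).ravel()   # (k,)  target feature vector

    # 2) normalize each feature vector to unit L2 norm
    Yf = Yf / np.linalg.norm(Yf, axis=0, keepdims=True)
    xf = xf / np.linalg.norm(xf)

    # 3) sparse coding: coefficients a with Yf @ a ~ xf and small ||a||_1
    lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=10000)
    lasso.fit(Yf, xf)

    # 4) the background clutter scale is the sum of absolute coefficients
    return np.abs(lasso.coef_).sum()
```

Lasso solves a penalized approximation of the sparsest-representation problem; where an exact representation Yf·a = xf is required, a basis-pursuit linear program could be substituted.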

Description

Technical field

[0001] The invention belongs to the technical field of image processing and relates in particular to a background clutter quantization method based on sparse representation, which can be used for predicting and evaluating the target acquisition performance of photoelectric imaging systems.

Background technique

[0002] The target acquisition performance of a photoelectric imaging system is an important concept in military tasks such as photoelectric countermeasures, reconnaissance, early warning and camouflage. In recent years, with the introduction of new materials and advances in manufacturing processes, photodetectors have reached or approached the background limit, and background factors have become a key factor limiting the performance of photoelectric imaging systems. Using a reasonable and accurate background clutter quantization scale in the target acquisition performance characterization model of a photoelectric imaging system can make its prediction result...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/64
Inventors: 杨翠 (Yang Cui), 李倩 (Li Qian), 吴洁 (Wu Jie), 张建奇 (Zhang Jianqi)
Owner: XIDIAN UNIV