
Sparse reconstruction-based KA-STAP (Knowledge Assistance-Space-Time Adaptive Processing) clutter and noise covariance matrix high-accuracy estimation method

A KA-STAP covariance matrix technology, applied in radio wave measurement systems, instruments, etc. It addresses problems such as STAP performance degradation, and achieves improved clutter suppression and target detection performance with a small computational load and easy engineering implementation.

Active Publication Date: 2019-01-15
HOHAI UNIV

AI Technical Summary

Problems solved by technology

[0004] Aiming at the STAP performance degradation caused by the short-range clutter non-stationarity of airborne non-side-looking array radar, the present invention proposes a high-precision estimation method for the KA-STAP clutter-plus-noise covariance matrix based on sparse reconstruction.




Detailed Description of the Embodiments

[0058] The technical scheme of the present invention will be further described below in conjunction with the accompanying drawings:

[0059] As shown in Figure 1, a high-precision estimation method for the KA-STAP clutter-plus-noise covariance matrix based on sparse reconstruction specifically includes the following steps:

[0060] S1. Analyze the airborne radar echo data and obtain a high-resolution two-dimensional space-time spectrum by sparse reconstruction;
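As an illustration of S1, the sketch below recovers a sparse angle-Doppler spectrum from a single space-time snapshot using orthogonal matching pursuit over a grid of space-time steering vectors. The array size (4 elements, 4 pulses), the 21 x 21 grid, and the choice of OMP as the solver are assumptions for demonstration; the patent does not fix a particular sparse-reconstruction algorithm in this summary.

```python
import numpy as np

def steering(fs, fd, N=4, M=4):
    """Unit-norm space-time steering vector: Kronecker product of the
    temporal (Doppler fd) and spatial (frequency fs) phase ramps."""
    a = np.exp(2j * np.pi * fs * np.arange(N))
    b = np.exp(2j * np.pi * fd * np.arange(M))
    return np.kron(b, a) / np.sqrt(N * M)

def omp(x, D, k):
    """Orthogonal matching pursuit: greedy k-sparse fit of x over dictionary D."""
    resid, support = x.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(D.conj().T @ resid))))
        Ds = D[:, support]
        coef, *_ = np.linalg.lstsq(Ds, x, rcond=None)
        resid = x - Ds @ coef
    amp = np.zeros(D.shape[1], dtype=complex)
    amp[support] = coef
    return amp

# dictionary over a 21 x 21 angle-Doppler grid (illustrative resolution)
fs_grid = np.linspace(-0.5, 0.5, 21)
fd_grid = np.linspace(-0.5, 0.5, 21)
D = np.column_stack([steering(fs, fd) for fd in fd_grid for fs in fs_grid])

# synthetic snapshot: two clutter patches on the side-looking ridge fd = fs, plus noise
rng = np.random.default_rng(0)
x = 3 * steering(0.25, 0.25) + 2 * steering(-0.25, -0.25)
x = x + 0.01 * (rng.standard_normal(16) + 1j * rng.standard_normal(16))

# sparse high-resolution two-dimensional space-time spectrum
spectrum = np.abs(omp(x, D, k=2)).reshape(len(fd_grid), len(fs_grid))
```

Because the grid is much finer than the Rayleigh resolution of a 4 x 4 aperture, the sparse solution localizes the two clutter patches to single pixels, which is the "high resolution" the method relies on.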

[0061] S2. Screen the pixels on the two-dimensional space-time spectrum and calculate the weight corresponding to each pixel;
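A minimal sketch of the screening step, assuming pixels are kept when they lie within a fixed dynamic range of the spectrum peak and are weighted by their normalized power; the -10 dB gate and power-share weighting are stand-ins, not the patent's exact rule.

```python
import numpy as np

def screen_pixels(spectrum, db_threshold=-10.0):
    """Keep pixels within db_threshold dB of the peak of the space-time
    spectrum; weight each kept pixel by its share of the retained power."""
    power = np.abs(spectrum) ** 2
    keep = power >= power.max() * 10.0 ** (db_threshold / 10.0)
    idx = np.argwhere(keep)                   # grid coordinates of kept pixels
    weights = power[keep] / power[keep].sum()
    return idx, weights

# toy spectrum: two strong clutter pixels and one weak sidelobe pixel
spec = np.zeros((8, 8))
spec[2, 2], spec[5, 5], spec[0, 7] = 3.0, 2.0, 0.1
idx, w = screen_pixels(spec)   # the 0.1 pixel falls below the -10 dB gate
```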

[0062] S3. Fit the clutter trajectory using the weighted least squares method;
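The weighted least squares fit of S3 can be sketched in closed form. The linear ridge model fd ≈ b0 + b1·fs below is an assumption for illustration (it holds for a side-looking array); for a non-side-looking array the trajectory is curved and a higher-order polynomial would be fitted the same way.

```python
import numpy as np

# screened pixel coordinates (spatial frequency fs, Doppler fd) and their weights
fs = np.array([-0.30, -0.10, 0.00, 0.20, 0.35])
fd = np.array([-0.29, -0.11, 0.01, 0.19, 0.36])   # noisy samples near fd = fs
w = np.array([0.10, 0.20, 0.40, 0.20, 0.10])      # weights from the spectrum

# weighted least squares: solve (A^T W A) beta = A^T W fd for fd = b0 + b1 * fs
A = np.column_stack([np.ones_like(fs), fs])
W = np.diag(w)
beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ fd)
b0, b1 = beta   # fitted intercept and slope of the clutter trajectory
```

Weighting by spectrum power lets strong, reliable clutter pixels dominate the fit while weak, noise-contaminated pixels contribute little.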

[0063] S4. Estimate the noise power from the sparsely reconstructed space-time spectrum and construct the prior clutter-plus-noise covariance matrix;
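A sketch of S4 under stated assumptions: clutter points are sampled along the fitted trajectory (the side-looking ridge fd = fs here), each contributes a rank-one term with an illustrative gain, and the noise power sigma2, which in the method would be read off the sparse spectrum floor, is set by hand.

```python
import numpy as np

def steering(fs, fd, N=4, M=4):
    """Space-time steering vector for spatial frequency fs and Doppler fd."""
    a = np.exp(2j * np.pi * fs * np.arange(N))
    b = np.exp(2j * np.pi * fd * np.arange(M))
    return np.kron(b, a)

ridge_fs = np.linspace(-0.4, 0.4, 9)   # samples along the fitted clutter trajectory
gains = np.ones_like(ridge_fs)         # illustrative per-point clutter powers
sigma2 = 0.1                           # noise power (would come from the spectrum floor)

# prior clutter-plus-noise covariance: sum of rank-one clutter terms plus noise
R = sigma2 * np.eye(16, dtype=complex)
for f, g in zip(ridge_fs, gains):
    v = steering(f, f)
    R += g * np.outer(v, v.conj())
```

By construction R is Hermitian and positive definite, so it can be inverted directly for adaptive filtering without any training-sample support.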

[0064] S5. Perform adaptive filtering and target detection on the cell under test using the prior clutter-plus-noise covariance matrix and reduced-dimension STAP.
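The filtering in S5 can be illustrated with an MVDR-style weight built from the prior covariance. For brevity this sketch is full-dimension rather than reduced-dimension, and the 4-element, 4-pulse sizes and ridge samples are assumptions; the point is that the weight keeps unit gain on the target hypothesis while strongly attenuating the clutter ridge.

```python
import numpy as np

def steering(fs, fd, N=4, M=4):
    """Unit-norm space-time steering vector."""
    a = np.exp(2j * np.pi * fs * np.arange(N))
    b = np.exp(2j * np.pi * fd * np.arange(M))
    return np.kron(b, a) / np.sqrt(N * M)

# prior clutter-plus-noise covariance from points on the ridge fd = fs
R = 0.01 * np.eye(16, dtype=complex)
for f in np.linspace(-0.4, 0.4, 5):
    v = steering(f, f)
    R += np.outer(v, v.conj())

# adaptive weight with unit gain on the target hypothesis (fs, fd) = (0.2, -0.3)
s = steering(0.2, -0.3)
w = np.linalg.solve(R, s)
w = w / (s.conj() @ w)

target_gain = abs(w.conj() @ s)                     # constrained to one
clutter_gain = abs(w.conj() @ steering(0.2, 0.2))   # response on the clutter ridge
```

A reduced-dimension variant would first project the data onto a small set of beams and Doppler channels and apply the same weight computation in the reduced space, which is where the small computational load comes from.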



Abstract

The invention discloses a sparse reconstruction-based KA-STAP (Knowledge Assistance-Space-Time Adaptive Processing) clutter-plus-noise covariance matrix high-accuracy estimation method, which comprises the following steps: S1, analyzing airborne radar echo data and acquiring a high-resolution two-dimensional space-time spectrum by sparse reconstruction; S2, screening pixels on the two-dimensional space-time spectrum and calculating the weights corresponding to the pixels; S3, fitting the clutter trajectory by the weighted least squares method; S4, estimating the noise power from the sparsely reconstructed space-time spectrum and constructing a prior clutter-plus-noise covariance matrix; and S5, carrying out adaptive filtering and target detection on the cell under test using the prior clutter-plus-noise covariance matrix and reduced-dimension STAP. Compared with conventional STAP algorithms, the disclosed method has a relatively small computational load, effectively improves the clutter suppression and target detection performance of a STAP system in a non-stationary clutter environment, and is favorable for engineering implementation.

Description

Technical Field

[0001] The invention belongs to the technical field of knowledge-aided space-time adaptive processing (KA-STAP), and in particular relates to a high-precision estimation method for the KA-STAP clutter-plus-noise covariance matrix based on sparse reconstruction.

Background Technique

[0002] STAP (Space-Time Adaptive Processing) is an important technical means for current airborne radar to suppress ground clutter and detect ground moving targets. The conventional STAP method rests on the assumption that clutter samples in range cells adjacent to the cell under test are statistically stationary with the clutter in that cell, and uses the maximum likelihood estimate of the clutter covariance matrix to solve for the adaptive weights. To limit the output signal-to-clutter-plus-noise ratio (SCNR) loss relative to optimal STAP processing to within 3 dB, the training samples used to estimate the clutter covariance need to meet the indepen...
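The conventional baseline described above, sample matrix inversion (SMI) with the maximum-likelihood covariance estimate, can be sketched as follows. The snapshot statistics, the dimensions, and the RMB guideline of roughly twice the space-time dimension in homogeneous training samples are textbook values used for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
NM = 16            # space-time dimension: N elements x M pulses
K = 2 * NM         # RMB guideline: ~2NM homogeneous samples keep SCNR loss near 3 dB

# i.i.d. complex Gaussian training snapshots (whitened clutter-plus-noise for brevity)
X = (rng.standard_normal((NM, K)) + 1j * rng.standard_normal((NM, K))) / np.sqrt(2)
R_hat = X @ X.conj().T / K   # maximum-likelihood sample covariance

# SMI adaptive weight for a toy target steering vector, with diagonal loading
s = np.ones(NM, dtype=complex) / np.sqrt(NM)
w = np.linalg.solve(R_hat + 1e-3 * np.eye(NM), s)
```

When short-range clutter is non-stationary across range, these K snapshots are no longer identically distributed, the estimate R_hat degrades, and STAP performance drops — which is the problem the knowledge-aided prior covariance is built to avoid.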

Claims


Application Information

IPC(8): G01S7/41
CPC: G01S7/41, Y02A90/10
Inventor: 沈明威, 张琪, 李建峰, 王冠, 汪晨辉
Owner HOHAI UNIV