Remote sensing target detection method based on sparse guidance and significant drive

A target detection and saliency technology, applied to instruments, character and pattern recognition, scene recognition, etc. It addresses problems such as complex computation, category recognition of non-salient areas, and extraction of target areas, achieving comprehensive category information and a clear recognition effect.

Active Publication Date: 2016-03-09
BEIHANG UNIV


Problems solved by technology

[0003] In summary, traditional saliency detection methods are not ideal for remote sensing images with complex backgrounds and environmental interference. Computing saliency directly from low-level features is simple but tends to lose the boundary or internal information of the salient region, while task-driven top-down methods are computationally complex and time-consuming. At present, some saliency models learn high-level features from low-level features to compute image saliency, which greatly…

Method used


Examples


Example Embodiment

[0026] As shown in Figure 1, the present invention provides a remote sensing target detection method based on sparse guidance and saliency driving. The specific implementation steps are as follows:

[0027] Step 1: Divide the input remote sensing image into several sub-blocks; extract the color features of all sub-blocks and cluster them to form a global dictionary; extract the color features of the sub-blocks at the image boundary and cluster them to form a background dictionary;

[0028] (1) Construct a global dictionary

[0029] Divide the input remote sensing image into T sub-blocks, where T is an integer greater than 1, and represent each sub-block by its color features in the LAB color space to form a sub-block matrix. For the global set, the matrix formed by unrolling the color features of the i-th sub-block is defined as G_lab(i), 1≤i≤T, where G is the name of the matrix, lab denotes the color space, and i indexes a sub-block, ...
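The dictionary-construction step above can be sketched in plain numpy. This is a minimal illustration, not the patent's exact procedure: it assumes the mean color of each sub-block as its feature (the patent unrolls LAB color features, whose exact layout is not given here) and uses a small hand-rolled k-means as the clustering step; the function names `block_features`, `kmeans_dictionary`, and `boundary_indices` are all hypothetical.

```python
import numpy as np

def block_features(img, block):
    """Split an H x W x 3 image into block x block sub-blocks and return
    one mean-color feature vector per sub-block (T x 3, row-major order)."""
    H, W, _ = img.shape
    bh, bw = H // block, W // block
    feats = []
    for r in range(block):
        for c in range(block):
            patch = img[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw]
            feats.append(patch.reshape(-1, 3).mean(axis=0))
    return np.array(feats)

def boundary_indices(block):
    """Indices of the sub-blocks lying on the image border."""
    return [r * block + c for r in range(block) for c in range(block)
            if r in (0, block - 1) or c in (0, block - 1)]

def kmeans_dictionary(X, k, iters=50, seed=0):
    """Cluster feature vectors with k-means; the k centroids are the
    dictionary atoms (one row per atom)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        d = ((X[:, None] - centers[None]) ** 2).sum(-1)  # squared distances
        labels = d.argmin(1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(0)
    return centers
```

Usage: the global dictionary clusters all T sub-block features, while the background dictionary clusters only the features at `boundary_indices(block)`, matching the two dictionaries of Step 1.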



Abstract

The invention provides a remote sensing target detection method based on sparse guidance and saliency driving. The method comprises the steps of:
(1) dividing an input remote sensing image into sub-blocks, clustering global color features to form a global dictionary, and clustering the color features of the image-boundary sub-blocks to form a background dictionary;
(2) sparsely representing all input image sub-blocks with the global and background dictionaries to obtain global and background sparse representation coefficients;
(3) clustering the sparse representation coefficients obtained in step (2) to generate global and background saliency maps;
(4) smoothing and denoising the saliency maps obtained in step (3), then applying Bayesian fusion to obtain the final saliency map and the salient target region;
(5) extracting color and texture features from the salient target region detected in step (4) and from collected training samples, and sparsely representing them with a maximum-constrained sparse coding model;
(6) identifying the target category of the salient target region using the sparse representation coefficients obtained in step (5).
With this method, a remote sensing target of interest can be detected and identified accurately and rapidly against a complex background, with outstanding effect.
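Step (2)'s sparse representation can be illustrated with a small orthogonal matching pursuit over a dictionary. Note the simplification: the patent derives saliency by clustering the sparse coefficients and fusing maps (steps (3)-(4)), whereas this sketch uses the common proxy of reconstruction error over the background dictionary, so blocks that the background atoms reconstruct well score low. The names `omp` and `background_saliency` are illustrative, not from the patent.

```python
import numpy as np

def omp(D, x, n_nonzero):
    """Greedy orthogonal matching pursuit: sparse-code x over dictionary D
    (atoms in columns), using at most n_nonzero atoms."""
    residual = x.astype(float).copy()
    support = []
    coef = np.zeros(D.shape[1])
    for _ in range(n_nonzero):
        j = int(np.abs(D.T @ residual).argmax())  # most correlated atom
        if j not in support:
            support.append(j)
        sol, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ sol
    coef[support] = sol
    return coef

def background_saliency(features, D_bg, n_nonzero=2):
    """Per-block saliency as normalized reconstruction error over the
    background dictionary: background-like blocks get low saliency."""
    errs = np.array([np.linalg.norm(x - D_bg @ omp(D_bg, x, n_nonzero))
                     for x in features])
    span = errs.max() - errs.min()
    return (errs - errs.min()) / span if span > 0 else np.zeros_like(errs)
```

The same coding step run against the global dictionary yields the global coefficients of step (2); the two resulting maps would then be smoothed and combined by Bayesian fusion as the abstract describes.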

Description

Technical Field

[0001] The invention belongs to the field of computer vision and image processing, and relates to a sparsity-driven saliency detection method for remote sensing images and a target recognition method based on a maximum-constrained sparse model. The method first uses sparse representation to mine category information among image sub-blocks and extracts salient regions of the remote sensing image; these salient regions contain potential targets and narrow the range of candidate regions for subsequent target detection or recognition. It then uses the maximum-constrained sparse coding model to recognize the categories of the detected salient regions. This not only quickly locates salient targets in the remote sensing image but also further distinguishes the target category, which both reduces the algorithm's target search time and improves remote sensing target detection performance.

Background Technique

[0002] The target d...
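The recognition stage described above can be sketched as sparse-representation classification: code the candidate region's features over each class's sub-dictionary and assign the class with the smallest reconstruction residual. This is a generic sketch, not the patent's maximum-constrained model, whose constraint details are not given in this excerpt; plain least squares stands in for the sparsity-constrained solver, and `src_classify` and the class labels are hypothetical.

```python
import numpy as np

def src_classify(class_dicts, x):
    """Sparse-representation-style classification: reconstruct x from each
    class sub-dictionary (atoms in columns) and return the label whose
    dictionary gives the smallest reconstruction residual."""
    best_label, best_err = None, np.inf
    for label, D in class_dicts.items():
        coef, *_ = np.linalg.lstsq(D, x, rcond=None)
        err = np.linalg.norm(x - D @ coef)
        if err < best_err:
            best_label, best_err = label, err
    return best_label
```

Here the dictionaries would be built from the color and texture features of the collected training samples, one sub-dictionary per target category.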

Claims


Application Information

IPC(8): G06K9/00, G06K9/46, G06K9/62
CPC: G06V20/13, G06V10/462, G06F18/251
Inventors: 赵丹培, 王佳佳, 马媛媛, 张杰, 姜志国
Owner BEIHANG UNIV