A Remote Sensing Target Detection Method Based on Sparse Guidance and Salient Drive

A target detection and saliency technology, applied in the fields of instruments, character and pattern recognition, scene recognition, etc. It addresses problems such as computational complexity, the inability to recognize the categories of salient regions, and incomplete extraction of target regions, achieving comprehensive category information and a clear recognition effect.

Active Publication Date: 2018-07-17
BEIHANG UNIV

AI Technical Summary

Problems solved by technology

[0003] In summary, traditional saliency detection methods are not ideal for remote sensing images with complex backgrounds and environmental interference.
Computing saliency solely from low-level features is simple, but it easily loses the boundary or internal information of salient regions, while task-driven top-down methods are computationally complex and time-consuming.
Some current saliency models learn high-level features on top of low-level features to compute image saliency, a substantial improvement over earlier saliency maps built from low-level features alone, but they still cannot extract target regions comprehensively and accurately from remote sensing images with complex backgrounds.
In addition, saliency detection only extracts the salient regions of a remote sensing image and cannot identify their categories; when multiple salient regions are detected in an image, saliency detection alone cannot distinguish the category information of those regions.



Examples


Embodiment Construction

[0026] As shown in Figure 1, the present invention provides a remote sensing target detection method based on sparse guidance and saliency driving; its specific implementation steps are as follows:

[0027] Step 1: Divide the input remote sensing image into several sub-blocks, extract the color features of all sub-blocks and cluster them to form a global dictionary; at the same time, extract the color features of the sub-blocks on the image boundary and cluster them to form a background dictionary;

[0028] (1) Construct a global dictionary

[0029] The input remote sensing image is divided into T sub-blocks, where T is an integer greater than 1, and each sub-block is represented by its color features in the LAB color space, forming a sub-block matrix for the global set. The matrix formed by expanding the color features of the i-th sub-block is defined as G_lab(i), 1 ≤ i ≤ T, where G is the name of the matrix and lab is the name of t...
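
A minimal sketch of this dictionary-construction step, assuming square sub-blocks, the mean LAB color of each sub-block as its feature, and k-means as the clustering method; the block size and cluster counts (block, k_global, k_background) are illustrative choices, not values specified in the patent.

```python
# Sketch of Step 1: build a global dictionary from all sub-blocks and a background
# dictionary from boundary sub-blocks. Assumes mean LAB color per sub-block as the
# feature and k-means clustering; parameters are illustrative, not from the patent.
import numpy as np
from skimage import io, color
from sklearn.cluster import KMeans

def build_dictionaries(image_path, block=16, k_global=32, k_background=16):
    lab = color.rgb2lab(io.imread(image_path))        # LAB color space, as in the patent
    H, W, _ = lab.shape
    feats, positions = [], []
    for y in range(0, H - block + 1, block):
        for x in range(0, W - block + 1, block):
            patch = lab[y:y + block, x:x + block]
            feats.append(patch.reshape(-1, 3).mean(axis=0))   # color feature of one sub-block
            positions.append((y, x))
    feats = np.array(feats)

    # Global dictionary: cluster the features of all T sub-blocks
    global_dict = KMeans(n_clusters=k_global, n_init=10).fit(feats).cluster_centers_

    # Background dictionary: cluster only the sub-blocks touching the image boundary
    border = np.array([y == 0 or x == 0 or y + 2 * block > H or x + 2 * block > W
                       for (y, x) in positions])
    background_dict = KMeans(n_clusters=k_background, n_init=10).fit(feats[border]).cluster_centers_
    return feats, positions, global_dict, background_dict
```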


Abstract

A remote sensing target detection method based on sparse guidance and saliency driving. The steps are: ① divide the input remote sensing image into sub-blocks, extract and cluster global color features to form a global dictionary, and extract and cluster the color features of image-boundary sub-blocks to form a background dictionary; ② use the global dictionary and the background dictionary to sparsely represent all sub-blocks of the input image, obtaining global and background sparse representation coefficients; ③ cluster the sparse representation coefficients obtained in step ② to generate global and background saliency maps; ④ smooth and denoise the global and background saliency maps from step ③, then apply Bayesian fusion to obtain the final saliency map and the salient target regions; ⑤ extract color and texture features from the salient target regions detected in step ④ and from the collected training samples, and sparsely represent them with the maximum-constrained sparse coding model; ⑥ use the sparse representation coefficients obtained in step ⑤ to identify the target category of each salient target region. The invention can accurately and quickly detect and recognize remote sensing targets of interest against complex backgrounds, with outstanding results.
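
A minimal sketch of steps ②-④ under simplifying assumptions: each sub-block is sparsely coded against a dictionary with orthogonal matching pursuit, the reconstruction error stands in for the coefficient clustering of step ③ (which this page does not detail), and the two block-level saliency maps are fused with a simple Bayesian-style product rule. Function names and parameters are hypothetical.

```python
# Sketch of steps 2-4: sparse-code each sub-block over the global and background
# dictionaries, turn the codes into block-level saliency scores, and fuse the two
# maps. Reconstruction error and the product-rule fusion are illustrative choices.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

def sparse_saliency(feats, dictionary, n_nonzero=2):
    """Per-block saliency proxy: normalized reconstruction error over a dictionary."""
    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero, fit_intercept=False)
    errors = []
    for f in feats:
        omp.fit(dictionary.T, f)                  # atoms are the columns of dictionary.T
        recon = dictionary.T @ omp.coef_
        errors.append(np.linalg.norm(f - recon))
    errors = np.array(errors)
    return (errors - errors.min()) / (errors.max() - errors.min() + 1e-12)

def bayesian_fusion(s_global, s_background):
    """Fuse two saliency maps with a Bayesian-style product rule (illustrative)."""
    s_g = np.clip(s_global, 1e-6, 1 - 1e-6)
    s_b = np.clip(s_background, 1e-6, 1 - 1e-6)
    return s_g * s_b / (s_g * s_b + (1 - s_g) * (1 - s_b))

# Usage with the dictionaries from Step 1 (hypothetical variable names):
# s_g = sparse_saliency(feats, global_dict)
# s_b = sparse_saliency(feats, background_dict)
# saliency = bayesian_fusion(s_g, s_b)   # threshold this map to obtain salient regions
```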

Description

technical field [0001] The invention belongs to the field of computer vision and image processing, and relates to a sparsity-driven saliency detection method for remote sensing images and a target recognition method based on a maximum-constrained sparse model. The method first uses sparse representation to mine the category information among image sub-blocks and extract the salient regions of a remote sensing image; these salient regions contain potential targets and narrow the range of candidate regions for subsequent target detection or recognition. It then uses the maximum-constrained sparse coding model to recognize the categories of the detected salient regions. This not only quickly locates salient targets in the remote sensing image but also distinguishes their categories, which reduces the algorithm's target search time and improves remote sensing target detection performance. Background technique [0002] The target d...
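
A hedged sketch of the recognition stage: plain sparse-representation classification, which assigns a salient region to the class whose training atoms reconstruct its feature with the smallest residual, stands in here for the patent's maximum-constrained sparse coding model, whose exact constraint is not reproduced on this page. The dictionary layout and parameters are assumptions.

```python
# Sketch of the recognition stage: class-wise residual classification over a
# dictionary of training features. This is ordinary sparse-representation
# classification, standing in for the patent's maximum-constrained model.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

def classify_region(feature, class_dicts, n_nonzero=5):
    """feature: color+texture descriptor of a salient region.
    class_dicts: {label: (n_atoms, feat_dim) array of training features per class}."""
    labels = sorted(class_dicts)
    atoms = np.vstack([class_dicts[c] for c in labels])        # stacked class dictionaries
    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero, fit_intercept=False)
    coef = omp.fit(atoms.T, feature).coef_                     # sparse code over all atoms

    best_label, best_err, start = None, np.inf, 0
    for c in labels:
        k = class_dicts[c].shape[0]
        part = np.zeros_like(coef)
        part[start:start + k] = coef[start:start + k]          # keep only class-c coefficients
        err = np.linalg.norm(feature - atoms.T @ part)         # class-wise reconstruction error
        if err < best_err:
            best_label, best_err = c, err
        start += k
    return best_label
```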


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/00; G06K9/46; G06K9/62
CPC: G06V20/13; G06V10/462; G06F18/251
Inventor: 赵丹培, 王佳佳, 马媛媛, 张杰, 姜志国
Owner: BEIHANG UNIV