
Remote sensing target detection method based on sparse guidance and saliency driving

A target detection and saliency technology, applied to instruments, character and pattern recognition, scene recognition, and similar fields. It addresses problems such as complex computation, the inability to recognize the category of salient regions, and incomplete extraction of target regions, achieving comprehensive category information and a clear, effective recognition result.

Active Publication Date: 2016-03-09
BEIHANG UNIV
Cites: 3 · Cited by: 20
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0003] In summary, traditional saliency detection methods do not perform well on remote sensing images with complex backgrounds and environmental interference.
Computing saliency from low-level features alone is simple, but it easily loses the boundary or internal information of salient regions; task-driven, top-down methods, on the other hand, are computationally complex and time-consuming.
Some current saliency models learn high-level features on top of low-level features to compute image saliency, a clear improvement over earlier purely low-level saliency maps, but they still cannot extract target regions comprehensively and accurately from remote sensing images with complex backgrounds.
Moreover, saliency detection only extracts the salient regions of a remote sensing image and cannot identify their categories; when multiple salient regions are detected in an image, saliency detection alone cannot distinguish the category information of those regions.

Method used



Embodiment Construction

[0026] As shown in Figure 1, the present invention provides a remote sensing target detection method based on sparse guidance and saliency driving; its specific implementation steps are as follows:

[0027] Step 1: divide the input remote sensing image into several sub-blocks, extract the color features of all sub-blocks and cluster them to form a global dictionary; at the same time, extract the color features of the sub-blocks at the image boundary (the background) and cluster them to form a background dictionary;

[0028] (1) Construct a global dictionary

[0029] The input remote sensing image is divided into T sub-blocks, where T is an integer greater than 1. Each sub-block is represented by its color features in the Lab color space, and these features form a sub-block matrix for the global set. The matrix formed by expanding the color features of the i-th sub-block is defined as G lab (i), 1 ≤ i ≤ T, where G is the name of the matrix and lab is the name of t...
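The dictionary construction of Step 1 can be sketched roughly as follows. This is a hypothetical illustration, not the patent's exact procedure: the block size, the number of dictionary atoms, and the use of k-means as the clustering step are assumptions, and each sub-block is represented simply by its expanded Lab color values as described above.

```python
# Hypothetical sketch of Step 1 (block_size, n_atoms and k-means are
# assumptions, not taken from the patent): split the image into non-overlapping
# sub-blocks, expand each block's Lab color values into a feature vector, and
# cluster the vectors into a global dictionary and a background dictionary.
import numpy as np
from skimage.color import rgb2lab
from sklearn.cluster import KMeans

def block_features(image_rgb, block_size=16):
    """Return one expanded Lab feature vector per non-overlapping sub-block."""
    lab = rgb2lab(image_rgb)
    h, w, _ = lab.shape
    feats, positions = [], []
    for y in range(0, h - block_size + 1, block_size):
        for x in range(0, w - block_size + 1, block_size):
            block = lab[y:y + block_size, x:x + block_size]
            feats.append(block.reshape(-1))      # expanded Lab color features
            positions.append((y, x))
    return np.asarray(feats), positions

def build_dictionaries(image_rgb, block_size=16, n_atoms=32):
    """Cluster all sub-blocks (global dictionary) and boundary sub-blocks
    (background dictionary) into n_atoms cluster centers each."""
    feats, positions = block_features(image_rgb, block_size)
    h, w, _ = image_rgb.shape
    is_border = np.array([y == 0 or x == 0 or
                          y + 2 * block_size > h or x + 2 * block_size > w
                          for y, x in positions])
    global_dict = KMeans(n_clusters=n_atoms, n_init=10).fit(feats).cluster_centers_
    background_dict = KMeans(n_clusters=n_atoms, n_init=10).fit(feats[is_border]).cluster_centers_
    return global_dict, background_dict
```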



Abstract

The invention provides a remote sensing target detection method based on sparse guidance and saliency driving. The method comprises the following steps:
(1) divide the input remote sensing image into sub-blocks, extract and cluster the global color features to form a global dictionary, and extract and cluster the color features of the image-edge sub-blocks to form a background dictionary;
(2) sparsely represent all input image sub-blocks with the global dictionary and the background dictionary to obtain global and background sparse representation coefficients;
(3) cluster the sparse representation coefficients obtained in step (2) to generate global and background saliency maps;
(4) smooth and denoise the global and background saliency maps obtained in step (3), then obtain the final saliency map by Bayesian fusion and extract the salient target regions;
(5) extract color and texture features from the salient target regions detected in step (4) and from the collected training samples, and perform sparse representation with a maximum-constrained sparse coding model;
(6) identify the target category of each salient region using the sparse representation coefficients obtained in step (5).
With this method, remote sensing targets of interest can be detected and identified accurately and rapidly against a complex background, and the effect is outstanding.
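As a rough illustration of steps (2) and (3), the fragment below sparse-codes each sub-block feature over the background dictionary with orthogonal matching pursuit and uses the reconstruction residual as a per-block saliency cue, so that blocks the background dictionary explains poorly score as salient. This is a simplification under stated assumptions: the patent clusters the coefficient vectors and fuses global and background maps with Bayesian fusion, neither of which is reproduced here, and the function and parameter names are illustrative.

```python
# Simplified illustration of steps (2)-(3): sparse-code sub-block features
# over the background dictionary and treat the reconstruction residual as a
# saliency cue.  The patent instead clusters the coefficient vectors and
# fuses global/background maps with Bayesian fusion; that is omitted here.
import numpy as np
from sklearn.linear_model import orthogonal_mp

def background_saliency(features, background_dict, n_nonzero=5):
    """features: (T, d) sub-block features; background_dict: (K, d) atoms."""
    D = background_dict.T                                        # (d, K)
    D = D / (np.linalg.norm(D, axis=0, keepdims=True) + 1e-12)   # unit-norm atoms
    X = features.T                                               # (d, T) signals
    coefs = orthogonal_mp(D, X, n_nonzero_coefs=n_nonzero)       # (K, T) codes
    residual = np.linalg.norm(X - D @ coefs, axis=0)             # per-block error
    rng = residual.max() - residual.min()
    return (residual - residual.min()) / (rng + 1e-12)           # high = salient
```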

Description

technical field
[0001] The invention belongs to the field of computer vision and image processing, and relates to a sparsity-driven saliency detection method for remote sensing images and a target recognition method based on a maximum-constrained sparse model. The method first uses sparse representation to mine the category information shared between image sub-blocks and extracts the salient regions of a remote sensing image; these salient regions contain the potential targets and narrow the range of candidate regions for subsequent target detection or recognition. A maximum-constrained sparse coding model then recognizes the categories of the detected salient regions. The method can therefore not only quickly locate the salient targets in a remote sensing image but also further distinguish their categories, which reduces the target search time of the algorithm and improves the detection performance for remote sensing targets.
Background technique
[0002] The target d...
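The recognition stage builds on sparse-representation-based classification. The sketch below shows only the generic scheme, coding a test region's feature vector over a dictionary of labelled training features and assigning the class whose atoms reconstruct it best; the patent's maximum-constrained sparse coding model and its specific color and texture features are not reproduced, and all names are illustrative.

```python
# Generic sparse-representation classification (SRC) sketch; the patent's
# maximum-constrained sparse coding model is NOT reproduced here.
import numpy as np
from sklearn.linear_model import orthogonal_mp

def src_classify(test_feat, train_feats, train_labels, n_nonzero=10):
    """test_feat: (d,), train_feats: (N, d), train_labels: (N,)."""
    D = train_feats.T                                            # (d, N) atoms
    D = D / (np.linalg.norm(D, axis=0, keepdims=True) + 1e-12)   # unit-norm
    coef = orthogonal_mp(D, test_feat, n_nonzero_coefs=n_nonzero)  # (N,)
    best_class, best_err = None, np.inf
    for c in np.unique(train_labels):
        coef_c = np.where(train_labels == c, coef, 0.0)  # keep class-c atoms only
        err = np.linalg.norm(test_feat - D @ coef_c)     # class reconstruction error
        if err < best_err:
            best_class, best_err = c, err
    return best_class
```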

Claims


Application Information

IPC (8): G06K9/00, G06K9/46, G06K9/62
CPC: G06V20/13, G06V10/462, G06F18/251
Inventor: 赵丹培, 王佳佳, 马媛媛, 张杰, 姜志国
Owner: BEIHANG UNIV