
Hyperspectral target detection method based on L1 regular constraint depth multi-example learning

A multi-instance learning and target detection technology applied in the field of image processing. It addresses the problems of large memory requirements, long solution processes, and time-consuming SVM classifier training, thereby enhancing generalization ability, avoiding over-fitting, and improving the detection effect.

Pending Publication Date: 2021-02-12
Applicant: XIDIAN UNIV

AI Technical Summary

Problems solved by technology

The disadvantage of this method is that when the training data are unbalanced, it is difficult for a standard SVM classifier to achieve good classification results, and when the amount of data is large, the solution process is slow.
The disadvantage of this method is that when the amount of data is large, the memory space required for the feature maps is huge, and training the SVM classifier takes a long time.




Embodiment Construction

[0037] The embodiments and effects of the present invention will be further described in detail below in conjunction with the accompanying drawings.

[0038] Referring to Figure 1, the implementation steps of the present invention are as follows:

[0039] Step 1. Construct multi-instance learning data.

[0040] (1.1) Input hyperspectral images, using 80% of the images as the training set and the remaining 20% of the images as the test set;

[0041] The data used in this example include a total of 5 hyperspectral images, of which the first, second, third, and fourth images are used as the training set and the fifth image is used as the test set;

[0042] (1.2) Define each pixel in the hyperspectral image as an example, with K examples forming a bag. Divide the training-set images into N bags, defining a bag that contains a target example as a positive bag and a bag that contains no target example as a negative bag (a code sketch of this bag construction is given after this step);

[0043] In this exa...
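As a rough illustration of the bag construction in step (1.2), a generic routine might look like the sketch below. The array shapes, the ground-truth target mask, and all function and variable names are assumptions chosen for illustration; they are not taken from the patent.

```python
# Sketch of step (1.2): grouping hyperspectral pixels (examples) into
# multi-instance learning bags of size K and labeling each bag.
import numpy as np

def build_bags(hsi_cube, target_mask, K):
    """hsi_cube:    (H, W, B) array, B spectral bands per pixel.
    target_mask: (H, W) boolean array, True where a pixel is a target.
    Returns a list of (bag_examples, bag_label) pairs; a bag is positive
    (label 1) if it contains at least one target example, else negative (0)."""
    H, W, B = hsi_cube.shape
    pixels = hsi_cube.reshape(-1, B)      # each pixel spectrum is one example
    labels = target_mask.reshape(-1)
    bags = []
    for start in range(0, len(pixels) - K + 1, K):
        examples = pixels[start:start + K]
        bag_label = int(labels[start:start + K].any())
        bags.append((examples, bag_label))
    return bags
```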



Abstract

The invention discloses a hyperspectral target detection method based on L1-regular-constrained deep multi-example learning, and mainly solves the problem of poor hyperspectral target detection in complex scenes in the prior art. The method comprises the following steps: 1) dividing the input images to obtain a training set and a test set; 2) constructing a deep multi-instance learning network W based on an L1 regular constraint; 3) iteratively training the network W with the training set, obtaining a preliminarily trained network Wi when the constructed loss function reaches a minimum, then feeding the training set into the preliminarily trained network Wi again and training up to a set maximum number of iterations to obtain the finally trained network W'; and 4) inputting each example point of the test set into the finally trained network W' for detection to obtain the detection result. The method improves detection results for inaccurately labeled hyperspectral targets, reduces over-fitting, and can be used for fine classification of explosives and crops.
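The abstract combines a bag-level classification loss with an L1 penalty on the network weights. A minimal PyTorch sketch of that general idea is shown below; the layer sizes, the max-pooling of instance scores into a bag score, and the regularization weight `lam` are illustrative assumptions, not the patent's actual network W.

```python
# Minimal sketch: deep multi-instance network with an L1-regularized loss.
import torch
import torch.nn as nn

class DeepMILNet(nn.Module):
    def __init__(self, n_bands, hidden=64):
        super().__init__()
        # instance-level scorer applied to every pixel spectrum in a bag
        self.instance_net = nn.Sequential(
            nn.Linear(n_bands, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid())

    def forward(self, bag):                # bag: (K, n_bands)
        scores = self.instance_net(bag)    # (K, 1) instance probabilities
        return scores.max()                # bag probability via max pooling

def l1_regularized_loss(model, bag_prob, bag_label, lam=1e-4):
    # bag-level binary cross-entropy plus L1 penalty on all weights
    bce = nn.functional.binary_cross_entropy(
        bag_prob, torch.tensor(float(bag_label)))
    l1 = sum(p.abs().sum() for p in model.parameters())
    return bce + lam * l1
```

Max pooling is one common choice for relating a bag label to its most target-like example; the L1 term encourages sparse weights, which is consistent with the stated goals of avoiding over-fitting and improving generalization.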

Description

Technical field

[0001] The invention belongs to the technical field of image processing, and in particular relates to a hyperspectral target detection method, which can be used in agriculture, for explosive detection, and for fingerprint detection at crime scenes.

Background technique

[0002] Hyperspectral imaging (HSI) is growing rapidly in remote sensing applications, with data acquired by a variety of hyperspectral sensors. Hyperspectral data contain hundreds of contiguous spectral bands and therefore offer better analysis capabilities than ordinary images. Spectral curves are acquired in many adjacent, very narrow spectral bands, making it possible to reconstruct an essentially continuous radiation spectrum; at the same time, the sensor captures the spatial information of the ground material, forming a unique three-dimensional data cube. In hyperspectral data, each ground material can be suitably represented by a single spectral curve, called a "spectral...
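For concreteness, the three-dimensional cube described above is typically indexed as rows, columns, and spectral bands, with one spectral curve per pixel. The file name and shape below are assumptions for illustration only.

```python
# Illustrative indexing of a hyperspectral data cube.
import numpy as np

hsi_cube = np.load("hsi_scene.npy")   # assumed shape (H, W, B): rows, cols, bands
spectral_curve = hsi_cube[10, 20, :]  # spectrum of the pixel at row 10, col 20
print(spectral_curve.shape)           # (B,): one value per spectral band
```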


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06K9/00G06K9/62G06N3/04
CPCG06V20/194G06V20/13G06N3/045G06F18/214
Inventor 缑水平白苑宁焦昶哲任子豪逯皓帆王秀秀牟金明任海洋
Owner XIDIAN UNIV