
Unsupervised hyperspectral video target tracking method based on spatial-spectral feature fusion

A feature fusion and target tracking technology, applied in the field of computer vision processing, which can solve the problem of few training samples.

Pending Publication Date: 2021-05-07
WUHAN UNIV


Problems solved by technology

The second is to design a correlation filter hyperspectral video target tracking framework with spatial-spectral feature fusion, which alleviates the problem of few training samples in hyperspectral video to a certain extent; at the same time, fusing RGB and hyperspectral features yields more robust and discriminative features.



Examples


Embodiment 1

[0069] An embodiment of the present invention provides an unsupervised hyperspectral video target tracking method based on spatial-spectral feature fusion, comprising the following steps:

[0070] Step 1: video data preprocessing. This step further includes:

[0071] Step 1.1: convert the video data into a sequence of continuous image frames X_i (RGB video frames or hyperspectral video frames).

[0072] Step 1.2: resize all unlabeled video image frames X_i to 200×200 pixels, obtaining video image frames Y_i.
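Step 1 above can be sketched in a few lines; this is a minimal NumPy-only sketch, where a nearest-neighbour resize stands in for a proper library resize (e.g. cv2.resize) and `X` is a hypothetical list of H×W×C frames, not data from the patent:

```python
import numpy as np

def resize_nn(frame, out_h, out_w):
    # Nearest-neighbour resize; a stand-in for a library call such as cv2.resize.
    h, w = frame.shape[:2]
    rows = np.arange(out_h) * h // out_h
    cols = np.arange(out_w) * w // out_w
    return frame[rows][:, cols]

def preprocess(frames, size=200):
    # Steps 1.1-1.2: treat the video as a list of frames X_i and resize each
    # unlabeled frame to a 200x200 frame Y_i.
    return [resize_nn(f, size, size) for f in frames]

# Toy usage with ten fake 360x480 3-channel frames.
X = [np.zeros((360, 480, 3), dtype=np.uint8) for _ in range(10)]
Y = preprocess(X)
print(Y[0].shape)  # (200, 200, 3)
```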

[0073] Step 2: randomly initialize the bounding box (BBOX). This step further includes:

[0074] On the basis of Step 1, randomly select a 90×90 pixel area in the unlabeled video frame Y_i (a 90×90 pixel area centered on the coordinates [x, y]) as the target to be tracked; this area is the initialized BBOX. Resize the 90×90 area to 125×125 pixels, obtaining Z_i. At the same time, randomly select two frames Y_{i+a} and Y_{i+b} from the 10 frames Y_{i+1} to Y_{i+10} (1...
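The random BBOX initialization and frame sampling of Step 2 can be sketched as follows. This is an illustrative sketch assuming the 200×200 frames Y_i from Step 1; the function names (`init_bbox`, `sample_pair`) and the clamping of the random centre so the 90×90 box stays inside the frame are assumptions, not details given in the patent:

```python
import numpy as np

rng = np.random.default_rng(0)

def init_bbox(frame, box=90, out=125):
    # Pick a random centre [x, y] such that the 90x90 box fits inside the frame,
    # crop the box, and nearest-neighbour resize it to 125x125 to obtain Z_i.
    h, w = frame.shape[:2]
    x = int(rng.integers(box // 2, w - box // 2))
    y = int(rng.integers(box // 2, h - box // 2))
    patch = frame[y - box // 2:y + box // 2, x - box // 2:x + box // 2]
    rows = np.arange(out) * box // out
    cols = np.arange(out) * box // out
    return patch[rows][:, cols], (x, y)

def sample_pair(i, horizon=10):
    # Randomly draw two distinct frames Y_{i+a}, Y_{i+b} from the next 10 frames.
    a, b = rng.choice(np.arange(1, horizon + 1), size=2, replace=False)
    return i + int(a), i + int(b)

Y_i = np.zeros((200, 200, 3), dtype=np.uint8)
Z_i, centre = init_bbox(Y_i)
print(Z_i.shape)  # (125, 125, 3)
```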



Abstract

The invention relates to an unsupervised hyperspectral video target tracking method based on spatial-spectral feature fusion. A deep-learning hyperspectral target tracking method is designed in combination with the cycle-consistency theory, so that a hyperspectral target tracking deep learning model can be trained in an unsupervised manner, saving the cost of manual labeling. On the basis of a Siamese tracking framework, an RGB branch (spatial branch) and a hyperspectral branch are designed; RGB video data is used to train the spatial branch, the trained RGB model is loaded into the network with fixed parameters, and the hyperspectral branch is trained at the same time, yielding fused features with higher robustness and discrimination capability; finally, the fused features are input into a correlation filter (DCF) to obtain the tracking result. The method can solve the problem of manually labeling hyperspectral video data and the problem of few hyperspectral training samples for deep learning model training, and can effectively improve the precision and speed of a hyperspectral video tracking model.
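The abstract's last step, feeding fused features into a correlation filter (DCF), can be illustrated with the classic single-channel ridge-regression filter learned in closed form in the Fourier domain. This is a generic DCF sketch, not the patent's exact formulation; the Gaussian label and random toy "feature map" are made-up inputs:

```python
import numpy as np

def train_dcf(z, y, lam=1e-2):
    # Closed-form DCF: W = conj(Z) * Y / (conj(Z) * Z + lambda), elementwise
    # in the Fourier domain (Z, Y are the 2-D FFTs of feature map and label).
    Z, Y = np.fft.fft2(z), np.fft.fft2(y)
    return np.conj(Z) * Y / (np.conj(Z) * Z + lam)

def respond(w_hat, x):
    # Correlation response map; its argmax is the predicted target location.
    return np.real(np.fft.ifft2(w_hat * np.fft.fft2(x)))

# Toy check: train on a random feature map with a Gaussian label centred at
# (32, 32); the response to the same map should peak at that centre.
np.random.seed(0)
n = 64
g = np.exp(-0.5 * (np.arange(n) - n // 2) ** 2 / 9.0)
label = np.outer(g, g)
feat = np.random.rand(n, n)
w_hat = train_dcf(feat, label)
peak = np.unravel_index(np.argmax(respond(w_hat, feat)), (n, n))
print(peak)  # (32, 32)
```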

Description

Technical Field

[0001] The present invention belongs to the field of computer vision processing, and in particular relates to an unsupervised hyperspectral video target tracking method based on spatial-spectral feature fusion.

Background Technique

[0002] Hyperspectral video (high spatial resolution, high temporal resolution, hyperspectral resolution) target tracking is an emerging direction, which aims to use the target information of a given initial frame in a hyperspectral video to predict the state of the target in subsequent frames. Compared with RGB video target tracking, hyperspectral video target tracking can provide spectral information to distinguish different materials in addition to spatial information. Even if two targets have the same shape, as long as their materials differ, the target can still be tracked using hyperspectral video, which is an advantage that RGB video target tracking does not have. Therefore, hyperspectral video target tracking c...

Claims


Application Information

IPC(8): G06K9/00, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/04, G06N3/084, G06V20/48, G06V20/13, G06F18/253, Y02A40/10
Inventors: 王心宇, 刘桢杞, 钟燕飞
Owner WUHAN UNIV