Adaptive anti-occlusion infrared target tracking method based on multi-layer depth feature fusion

A technique combining deep features and infrared target tracking, applied in the field of video processing, which addresses the problems of non-robust re-tracking, loss of the tracked target, and reduced stability of target tracking.

Active Publication Date: 2018-10-16
XIDIAN UNIV
Cites: 8 · Cited by: 31

AI Technical Summary

Problems solved by technology

[0009] Current deep-learning-based target tracking methods focus on training a network model that distinguishes the target from background information and strongly suppresses target-like objects in the background. Consequently, when the target is occluded by a complex scene for a long time, the tracked target can be lost, which reduces the stability of target tracking and makes re-tracking after the target reappears non-robust.


Embodiment Construction

[0066] In order to make the object, technical solution and advantages of the present invention clearer, the present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit the present invention.

[0067] An embodiment of the present invention provides an adaptive anti-occlusion target tracking method based on multi-layer depth feature fusion. As shown in Figure 1, the method is implemented through the following steps:

[0068] Step 101: Obtain multi-layer depth feature representations of video image target candidate regions.

[0069] The VGG-Net-19 deep convolutional network is used as the core network, and multi-dimensional images are fed directly as network input, avoiding complex feature extraction and data reconstruction processes.

[0070] VGG-Net-19 is mainly composed of 5 sets ...
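
As an illustration of this step, the sketch below (Python with PyTorch/torchvision, assumed here since the patent names no implementation framework) extracts feature maps from several VGG-Net-19 layers for a candidate image patch and resizes them to a common spatial size. The tapped layers (conv3_4, conv4_4, conv5_4), the output size, and the assumption that the infrared patch has been converted to a 3-channel image are illustrative choices, not details taken from the patent.

```python
# Minimal sketch, not the patented implementation: multi-layer deep feature
# extraction with a pretrained VGG-Net-19 (torchvision). Layer choices and
# the common output size are assumptions for illustration.
import torch
import torch.nn.functional as F
from torchvision import models, transforms

vgg = models.vgg19(weights=models.VGG19_Weights.IMAGENET1K_V1).features.eval()

# Indices of the post-ReLU outputs of conv3_4, conv4_4 and conv5_4 (assumed).
LAYER_IDS = {17, 26, 35}

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def multilayer_features(patch_img, out_size=(56, 56)):
    """Return feature maps from several VGG-19 layers for one candidate
    patch (an RGB PIL.Image), bilinearly resized to a common size."""
    x = preprocess(patch_img).unsqueeze(0)  # shape: 1 x 3 x H x W
    feats = []
    with torch.no_grad():
        for i, layer in enumerate(vgg):
            x = layer(x)
            if i in LAYER_IDS:
                feats.append(F.interpolate(x, size=out_size,
                                           mode='bilinear',
                                           align_corners=False))
    return feats  # list of 1 x C x 56 x 56 tensors, ordered shallow to deep
```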



Abstract

The invention discloses an adaptive anti-occlusion infrared target tracking method based on multi-layer depth feature fusion. First, a series of multi-layer depth feature maps of the same size but from different network levels is obtained. The multi-layer depth feature maps are then converted from the time domain to the frequency domain by correlation filtering; a filter is trained and the response map is calculated using the fast Fourier transform. The multi-layer depth feature maps are merged and dimensionally reduced by weighted fusion of intra-layer features, so that feature response maps of different levels are constructed and the maximum correlation response value, which gives the estimated position of the target, is obtained. Finally, dense features of the target are extracted, and the response confidence of the target center position estimated from the deep convolutional features is obtained from the maximum response value of the features produced by correlation filtering. When the response confidence of the target center position is less than the re-detection threshold T0, the estimated target position is evaluated by online target re-detection and the position of the target is adaptively updated according to the evaluation result.
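
To make the pipeline described in the abstract concrete, here is a minimal NumPy sketch of a single-channel frequency-domain correlation filter, a weighted fusion of per-layer response maps, and a confidence test against the re-detection threshold T0. The regularization term, the layer weights, the value of T0, and the confidence definition are illustrative assumptions rather than the patented formulation.

```python
# Minimal sketch (MOSSE-style single-channel correlation filter), not the
# patented filter: LAMBDA, LAYER_WEIGHTS and T0 are assumed values.
import numpy as np

LAMBDA = 1e-4                      # ridge regularization term (assumed)
LAYER_WEIGHTS = [1.0, 0.5, 0.25]   # per-layer fusion weights (assumed)
T0 = 0.3                           # re-detection threshold (assumed)

def train_filter(feat, label):
    """Learn a correlation filter for one feature channel.
    feat:  2-D feature map of the target region.
    label: desired Gaussian-shaped response, same size as feat."""
    F_ = np.fft.fft2(feat)
    G = np.fft.fft2(label)
    return (np.conj(F_) * G) / (F_ * np.conj(F_) + LAMBDA)

def response(feat, H):
    """Apply a trained filter H to a new feature map; return the real
    spatial response map."""
    return np.real(np.fft.ifft2(np.fft.fft2(feat) * H))

def fused_response(feats, filters):
    """Weighted fusion of the per-layer response maps."""
    maps = [w * response(f, H)
            for w, f, H in zip(LAYER_WEIGHTS, feats, filters)]
    return np.sum(maps, axis=0)

def locate(feats, filters):
    """Estimated target position, response confidence, and whether the
    confidence falls below the re-detection threshold T0."""
    R = fused_response(feats, filters)
    peak = np.unravel_index(np.argmax(R), R.shape)   # (row, col) of maximum
    confidence = float(R.max())                      # simplified stand-in
    return peak, confidence, confidence < T0
```

When the last return value is true, the online re-detection step described in the abstract would be invoked to re-evaluate the estimated position and adaptively update the target location.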

Description

Technical Field

[0001] The invention belongs to the technical field of video processing and in particular relates to an adaptive anti-occlusion infrared target tracking method based on multi-layer depth feature fusion.

Background

[0002] Visual tracking is one of the research hotspots in computer vision and is widely used in civilian applications such as video surveillance and intelligent transportation. In recent years, with the rapid development of computer vision, the overall performance of tracking algorithms has improved significantly. At the same time, because an infrared imaging system detects and identifies a target from the energy the target itself emits, it offers passive, all-day detection capability and is widely used in target-perception equipment. Tracking a target of interest is the main task of an infrared detection system, so the t...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06T7/246G06K9/00G06N3/04
CPCG06T7/246G06T2207/10048G06V20/42G06N3/045
Inventor 秦翰林王婉婷王春妹延翔程文雄彭昕胡壮壮周慧鑫
Owner XIDIAN UNIV