
Deep learning target tracking method based on feature map segmentation and adaptive fusion

A target tracking technology based on deep learning, applied to neural learning methods, image analysis, image enhancement, etc., addressing the problems of tracking-target loss and degraded tracking performance.

Pending Publication Date: 2021-02-05
NORTHEASTERN UNIV

AI Technical Summary

Problems solved by technology

Compared with traditional tracking methods, deep-learning-based methods offer higher tracking accuracy. However, when the target is occluded or deformed, the tracked target is easily lost, which degrades overall tracking performance.

Method used



Examples


Embodiment Construction

[0127] To make the purpose, technical solution and advantages of the present invention clearer, the technical solution of the present invention is described in further detail below with reference to the drawings and embodiments.

[0128] As shown in Figure 1, the deep learning target tracking method based on feature map segmentation and adaptive fusion of the present invention includes:

[0129] (1) Preprocess the video in the training set to generate training sample pairs, each consisting of a template image and a search area image, and generate the corresponding response map labels;
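
This excerpt does not specify the exact form of the response map label (its size, positive radius, or label values), so the following is only a minimal sketch of one common choice for Siamese-style trackers: a map whose centre region is positive and whose remaining positions are negative. All names and parameter values here are hypothetical, not taken from the patent.

```python
import numpy as np

def make_response_label(size=17, radius=2, pos_value=1.0, neg_value=-1.0):
    """Hypothetical response-map label: positions within `radius` of the
    map centre are positive, the rest negative (the exact shape and values
    used by the patent are not given in this excerpt)."""
    ys, xs = np.mgrid[0:size, 0:size]
    centre = (size - 1) / 2.0
    dist = np.sqrt((xs - centre) ** 2 + (ys - centre) ** 2)
    label = np.where(dist <= radius, pos_value, neg_value)
    return label.astype(np.float32)

# Example: a 17x17 label paired with one (template, search-area) sample.
label = make_response_label()
print(label.shape)  # (17, 17)
```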

[0130] (2) Construct a deep learning network model based on feature map segmentation and adaptive fusion for target tracking; the model consists of a twin (Siamese) template and search-area feature extractor, a template feature map segmenter, a center feature map segmentation unit reconstructor, a connection response map generator, an adaptive fusion weight generator, and an adaptive fuser, as shown in Figure ...
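
The excerpt names the model's components but not their layer configurations, the number of feature-map segments, or how the adaptive fusion weights are computed. The PyTorch sketch below therefore only illustrates the general idea under assumed choices: a shared (twin) backbone, a 2x2 spatial split of the template feature map, per-segment cross-correlation with the search-area feature map, and softmax-normalized fusion weights. Class, layer, and parameter names are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SegmentedSiameseTracker(nn.Module):
    """Minimal sketch (assumed layer sizes): shared backbone, 2x2 split of the
    template feature map, per-segment cross-correlation with the search-area
    feature map, and softmax-weighted adaptive fusion of the response maps."""
    def __init__(self, channels=64):
        super().__init__()
        self.backbone = nn.Sequential(            # shared (twin) feature extractor
            nn.Conv2d(3, channels, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.weight_head = nn.Linear(4, 4)        # adaptive fusion weight generator

    def forward(self, template, search):
        zf = self.backbone(template)              # template feature map
        xf = self.backbone(search)                # search-area feature map
        h, w = zf.shape[-2:]
        segments = [zf[..., :h // 2, :w // 2], zf[..., :h // 2, w // 2:],
                    zf[..., h // 2:, :w // 2], zf[..., h // 2:, w // 2:]]
        responses = []
        b, c = xf.shape[:2]
        for seg in segments:                      # per-segment cross-correlation
            r = F.conv2d(xf.reshape(1, b * c, *xf.shape[-2:]),
                         seg.reshape(b * c, 1, *seg.shape[-2:]), groups=b * c)
            responses.append(r.reshape(b, c, *r.shape[-2:]).sum(dim=1, keepdim=True))
        stacked = torch.cat(responses, dim=1)     # (B, 4, H', W') response maps
        scores = stacked.flatten(2).max(dim=2).values
        weights = torch.softmax(self.weight_head(scores), dim=1)  # adaptive weights
        fused = (stacked * weights[..., None, None]).sum(dim=1, keepdim=True)
        return fused                              # adaptively fused response map
```

The intuition behind such a design is that each template segment can still match the search area when part of the target is occluded or deformed, and learned fusion weights can down-weight unreliable segments, which is consistent with the problem statement above.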



Abstract

A deep learning target tracking method based on feature map segmentation and adaptive fusion comprises the following steps: (1) preprocessing the videos in a training set to generate training sample pairs, each composed of a template image and a search area image, and generating response map labels; (2) constructing a deep learning network model based on feature map segmentation and adaptive fusion; (3) training the deep learning network model, and storing the model structure and trained model parameters on disk to obtain a target tracking model; (4) processing the video to be tracked to obtain a template image corresponding to the first frame and search area images at three scales for each subsequent frame to be tracked; and (5) loading the target tracking model, forming three sample pairs composed of the template image and the search area images, and inputting the three pairs into the target tracking model to obtain the target position in each subsequent frame of the video to be tracked.
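
Steps (4) and (5) describe a multi-scale search around the previous target position, but the abstract does not give the scale factors, the search-area size, or how a response-map peak is mapped back to image coordinates. The sketch below only illustrates the general per-frame loop under assumed values; `model_score_fn`, the scale factors, and the crop size are all hypothetical.

```python
import numpy as np

def track_frame(model_score_fn, frame, prev_box, scale_factors=(0.96, 1.0, 1.04)):
    """Hypothetical single-frame update: crop the search area at three scales
    around the previous box, score each crop with the tracking model, and keep
    the scale/position with the highest fused response.
    `model_score_fn(search_crop)` is assumed to return a 2-D response map."""
    cx, cy, w, h = prev_box
    best = None
    for s in scale_factors:
        sw, sh = w * s * 2, h * s * 2                      # assumed search-area size
        x0, y0 = int(cx - sw / 2), int(cy - sh / 2)
        crop = frame[max(y0, 0):y0 + int(sh), max(x0, 0):x0 + int(sw)]
        response = model_score_fn(crop)
        peak = np.unravel_index(np.argmax(response), response.shape)
        score = response[peak]
        if best is None or score > best[0]:
            # map the response-map peak back to (approximate) image coordinates
            ny = y0 + peak[0] * sh / response.shape[0]
            nx = x0 + peak[1] * sw / response.shape[1]
            best = (score, (nx, ny, w * s, h * s))
    return best[1]  # new (cx, cy, w, h)
```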

Description

Technical field

[0001] The invention belongs to the field of artificial intelligence, and in particular relates to a deep learning target tracking method based on feature map segmentation and adaptive fusion.

Background technique

[0002] Target tracking means that, for a given video and the tracking target in its first frame, the position of the target is continuously located in the subsequent frames. Target tracking has a wide range of applications in many fields such as security video surveillance, unmanned aerial vehicle reconnaissance, military target tracking, military strikes, patient supervision, and intelligent transportation. Traditional target tracking methods include methods based on mean shift, particle filter, sparse coding, and correlation filtering. With the continuous development of artificial intelligence technology and the success of deep learning ...

Claims


Application Information

IPC(8): G06T 7/246; G06T 7/11; G06N 3/04; G06N 3/08
CPC: G06T 7/246; G06T 7/11; G06N 3/084; G06T 2207/10016; G06T 2207/20221; G06N 3/045
Inventor: 林树宽, 李川皓, 乔建忠, 涂悦
Owner: NORTHEASTERN UNIV