Online multi-target tracking method of unified target motion perception and re-identification network

A multi-target tracking and motion-aware technology applied to the field of online multi-target tracking. It addresses the problems that interrupted trajectories cannot be associated and that target identities switch frequently, and achieves the effects of adding a re-identification branch, improving tracking performance, and enhancing robustness to occlusion.

Active Publication Date: 2021-08-27
XIAMEN UNIV

AI Technical Summary

Problems solved by technology

Such methods cannot associate interrupted trajectories, resulting in frequent identity switches.




Embodiment Construction

[0039] The following embodiments further illustrate the present invention with reference to the accompanying drawings. The embodiments are implemented on the premise of the technical solution of the present invention and provide detailed implementations and specific operation processes, but the protection scope of the present invention is not limited to the following embodiments.

[0040] Referring to Figure 1, the embodiment of the present invention includes the following steps:

[0041] A. Input the current frame image and the previous frame image into the backbone network to obtain feature maps of the two frames.

[0042] As shown in Figure 2, the backbone network is adapted from the DLA-34 network. The DLA-34 network is composed of an iterative deep aggregation module and a hierarchical deep aggregation module; all ordinary convolutional layers in the upsampling module of the DLA-34 network are replaced with deformable convolution...
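
A minimal sketch of how an ordinary convolution in an upsampling module can be swapped for a deformable convolution, assuming a PyTorch implementation with torchvision's DeformConv2d. The DeformableConv wrapper, the extract_feature_maps helper, and the channel sizes are illustrative assumptions, not the patent's exact DLA-34 configuration.

```python
# Minimal sketch (assumed PyTorch + torchvision); not the patent's exact architecture.
import torch
import torch.nn as nn
from torchvision.ops import DeformConv2d


class DeformableConv(nn.Module):
    """Drop-in replacement for an ordinary 3x3 convolution: a small conv predicts
    per-position sampling offsets, which the deformable convolution then uses."""

    def __init__(self, in_ch, out_ch, kernel_size=3, padding=1):
        super().__init__()
        # 2 offset values (dx, dy) per kernel sampling location
        self.offset_pred = nn.Conv2d(in_ch, 2 * kernel_size * kernel_size,
                                     kernel_size, padding=padding)
        self.deform = DeformConv2d(in_ch, out_ch, kernel_size, padding=padding)

    def forward(self, x):
        offsets = self.offset_pred(x)
        return self.deform(x, offsets)


def extract_feature_maps(backbone, prev_frame, cur_frame):
    """Step A (sketch): run both frames through the shared backbone."""
    return backbone(prev_frame), backbone(cur_frame)


if __name__ == "__main__":
    # Toy stand-in for the modified upsampling stage of the backbone.
    backbone = nn.Sequential(DeformableConv(3, 64), nn.ReLU())
    prev = torch.randn(1, 3, 256, 256)
    cur = torch.randn(1, 3, 256, 256)
    feat_prev, feat_cur = extract_feature_maps(backbone, prev, cur)
    print(feat_prev.shape, feat_cur.shape)
```

Note that torchvision's DeformConv2d expects the sampling offsets to be supplied by a separate convolution, which is why the wrapper pairs the two layers.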



Abstract

The invention discloses an online multi-target tracking method with a unified target motion perception and re-identification network, and relates to computer vision technology. The method comprises the following steps: A, inputting a current frame image and a previous frame image into a backbone network and obtaining feature maps of the two frames; B, sending the heatmap of the previous frame and the two feature maps from step A to a detection and tracking branch, and calculating the detection results of the current frame and the target tracking offsets; C, sending the feature maps of the two frames from step A to a re-identification branch, obtaining embedded feature vectors, and storing them in a re-identification feature vector pool; D, performing a first matching on the detection results obtained in step B according to the tracking offsets, and assigning the identity of the corresponding target to each matched detection; and E, performing a second matching on the detections left unmatched in step D by computing their similarity, one by one, with the embedded feature vectors obtained in step C, and assigning identities according to a preset threshold to obtain the final tracking result of the current frame.
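
The two-stage association in steps D and E can be illustrated with the following sketch, assuming NumPy/SciPy. The center-distance cost, the Hungarian assignment, the offset sign convention, and the threshold names DIST_THRESH and SIM_THRESH are illustrative assumptions; the patent's exact cost formulation and threshold values are not given here.

```python
# Minimal sketch of the two-stage matching (assumed NumPy/SciPy implementation).
import numpy as np
from scipy.optimize import linear_sum_assignment

DIST_THRESH = 30.0   # max center distance (pixels) for the motion-based match (assumed)
SIM_THRESH = 0.6     # min cosine similarity for the re-ID based match (assumed)


def first_match(track_centers, det_centers, offsets):
    """Step D (sketch): project each detection back with its tracking offset,
    then match to existing track centers by distance (sign convention assumed)."""
    predicted_prev = det_centers + offsets                         # (M, 2)
    cost = np.linalg.norm(predicted_prev[:, None, :] -
                          track_centers[None, :, :], axis=2)       # (M, N)
    det_idx, trk_idx = linear_sum_assignment(cost)
    matches = [(d, t) for d, t in zip(det_idx, trk_idx) if cost[d, t] < DIST_THRESH]
    matched_dets = {d for d, _ in matches}
    unmatched = [d for d in range(len(det_centers)) if d not in matched_dets]
    return matches, unmatched


def second_match(det_embeddings, feature_pool):
    """Step E (sketch): compare unmatched detections against the re-ID feature
    pool (one stored embedding per track identity) by cosine similarity."""
    assignments = {}
    for d, emb in enumerate(det_embeddings):
        best_id, best_sim = None, SIM_THRESH
        for track_id, ref in feature_pool.items():
            sim = float(np.dot(emb, ref) /
                        (np.linalg.norm(emb) * np.linalg.norm(ref) + 1e-12))
            if sim > best_sim:
                best_id, best_sim = track_id, sim
        assignments[d] = best_id   # None -> start a new track identity
    return assignments
```

In this sketch, a detection that fails both matches simply receives a new identity, which corresponds to starting a new trajectory in the final tracking result.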

Description

Technical Field

[0001] The invention relates to computer vision technology, and in particular to an online multi-target tracking method that unifies a target motion perception and re-identification network.

Background Technique

[0002] The visual system is a very important way for humans to obtain external information and provides rich information about the world. As a basic task in the field of computer vision, multi-target tracking aims to estimate the trajectories of targets of a specific category in a video sequence. In recent years, research on multi-target tracking algorithms has received more and more attention. However, in dense crowds or low-frame-rate videos, targets are prone to large motion offsets, mutual occlusion, and overlapping, which limits tracking performance. Therefore, studying target motion information and constructing a simple and effective re-identification network is of great significance for online multi-target tracking methods.

[0003] The e...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06T7/246G06T7/62G06T7/66G06T7/73G06K9/00G06K9/46G06K9/62G06N3/04G06N3/08
CPCG06T7/248G06T7/74G06T7/66G06T7/62G06N3/08G06T2207/10016G06T2207/20081G06T2207/20084G06T2207/30196G06V40/103G06V10/44G06V10/751G06N3/048G06N3/045G06F18/241
Inventor 王菡子王英
Owner XIAMEN UNIV