
Online multi-object tracking method based on unified object motion perception and re-identification network

A multi-target tracking and motion perception technology applied in the field of online multi-target tracking. It addresses the problems that interrupted trajectories cannot be associated and that identities switch frequently, and achieves the effects of adding a re-identification branch, improving tracking performance, and enhancing robustness to occlusion.

Active Publication Date: 2022-05-17
XIAMEN UNIV

AI Technical Summary

Problems solved by technology

Such methods cannot associate interrupted trajectories, resulting in frequent identity switches.




Detailed Description of the Embodiments

[0039] The following embodiment further illustrates the present invention with reference to the accompanying drawings. The embodiment is implemented on the premise of the technical solution of the present invention and gives a detailed implementation and a specific operating process, but the protection scope of the present invention is not limited to this embodiment.

[0040] Referring to Figure 1, the embodiment of the present invention includes the following steps:

[0041] A. Input the current frame image and the previous frame image into the backbone network to obtain the feature maps of the two frames.

[0042] As shown in Figure 2, the backbone network is a modified DLA-34 network. The DLA-34 network is composed of an iterative deep aggregation module and a hierarchical deep aggregation module; all ordinary convolutional layers in the upsampling module of the DLA-34 network are replaced with deformable convolutions...
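The deformable-convolution substitution can be sketched as follows. This is a minimal illustration under assumptions, not the patent's exact layer configuration: the block name DeformConvBlock and the channel sizes are hypothetical, and the sampling offsets feeding torchvision's DeformConv2d are predicted by an ordinary 3x3 convolution, which is the usual construction.

```python
import torch
import torch.nn as nn
from torchvision.ops import DeformConv2d

class DeformConvBlock(nn.Module):
    """Drop-in replacement for a plain 3x3 convolution: an ordinary
    convolution predicts (dx, dy) sampling offsets for every kernel
    position, and DeformConv2d samples the input at those offsets."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        # 2 offset values per position of the 3x3 kernel -> 18 channels
        self.offset_pred = nn.Conv2d(in_ch, 2 * 3 * 3, kernel_size=3, padding=1)
        self.deform = DeformConv2d(in_ch, out_ch, kernel_size=3, padding=1)

    def forward(self, x):
        return self.deform(x, self.offset_pred(x))

# Example: one such block applied to a feature map, as would happen
# for each of the two frames passed through the shared backbone (step A).
if __name__ == "__main__":
    block = DeformConvBlock(64, 128)
    frame_feat = block(torch.randn(1, 64, 76, 136))
    print(frame_feat.shape)  # torch.Size([1, 128, 76, 136])
```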



Abstract

An online multi-object tracking method that unifies object motion perception and re-identification networks, relating to computer vision technology. A. Input the current frame image and the previous frame image into the backbone network to obtain the feature maps of the two frames. B. Feed the heat map of the previous frame and the two feature maps from step A into the detection-and-tracking branch to compute the current-frame detection results and the target tracking offsets. C. Feed the two feature maps from step A into the re-identification branch to obtain embedded feature vectors, which are stored in a re-identification feature vector pool. D. Using the tracking offsets, perform a first matching of the detection results from step B, and assign each matched detection the identity of its corresponding target. E. Perform a second matching on the detections left unmatched in step D: compute their similarity, one by one, against the embedded feature vectors obtained in step C, and assign identities to the detections according to a set threshold, yielding the final tracking result for the current frame.
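The two-stage association in steps D and E can be illustrated with a short sketch. The code below is one interpretation under assumptions, not the patent's implementation: the function names, the distance gate max_dist, and the similarity threshold sim_thresh are all hypothetical. The first stage gates a Hungarian assignment on the distance between offset-propagated track centers and new detection centers; the second stage falls back to cosine similarity against the re-identification feature pool.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def first_match(track_centers, track_offsets, det_centers, max_dist=30.0):
    """Step D: propagate each track's previous center by its predicted
    tracking offset, then solve a bipartite assignment between the
    propagated centers and the current detections, gated by distance."""
    track_centers = np.asarray(track_centers, dtype=float)
    track_offsets = np.asarray(track_offsets, dtype=float)
    det_centers = np.asarray(det_centers, dtype=float)
    if track_centers.size == 0 or det_centers.size == 0:
        return [], list(range(len(det_centers)))
    predicted = track_centers + track_offsets                   # (T, 2)
    cost = np.linalg.norm(predicted[:, None] - det_centers[None], axis=2)
    rows, cols = linear_sum_assignment(cost)
    matches, used = [], set()
    for t, d in zip(rows, cols):
        if cost[t, d] <= max_dist:       # only keep plausible pairs
            matches.append((t, d))       # detection d inherits track t's ID
            used.add(d)
    unmatched = [d for d in range(len(det_centers)) if d not in used]
    return matches, unmatched

def second_match(unmatched_embs, pool_embs, sim_thresh=0.6):
    """Step E: compare each still-unmatched detection's embedding with
    the re-identification feature vector pool; reuse the identity of
    the most similar entry if it clears the threshold, else start a
    new identity (signalled here by -1)."""
    unmatched_embs = np.asarray(unmatched_embs, dtype=float)
    pool_embs = np.asarray(pool_embs, dtype=float)
    assignments = []
    for emb in unmatched_embs:
        sims = pool_embs @ emb / (
            np.linalg.norm(pool_embs, axis=1) * np.linalg.norm(emb) + 1e-8)
        best = int(np.argmax(sims))
        assignments.append(best if sims[best] >= sim_thresh else -1)
    return assignments
```

A detection that fails both stages (the -1 case above) would receive a fresh identity, consistent with the abstract's description of allocating identities according to the set threshold.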

Description

Technical Field

[0001] The invention relates to computer vision technology, and in particular to an online multi-target tracking method that unifies target motion perception and re-identification networks.

Background Art

[0002] The visual system is a very important channel through which humans obtain information about the external world, and it provides humans with rich information resources. As a fundamental task in computer vision, multi-target tracking aims to estimate the trajectories of targets of a specific category in a video sequence. In recent years, research on multi-target tracking algorithms has received increasing attention. However, in dense crowds or low-frame-rate videos, targets are prone to large motion offsets, mutual occlusion, and overlap, which limit tracking performance. It is therefore of great significance for online multi-target tracking to study target motion information and to construct a simple and effective re-identification network.

[0003] The e...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06T7/246; G06T7/62; G06T7/66; G06T7/73; G06V40/10; G06V10/44; G06V10/75; G06V10/764; G06V10/82; G06K9/62; G06N3/04; G06N3/08
CPC: G06T7/248; G06T7/74; G06T7/66; G06T7/62; G06N3/08; G06T2207/10016; G06T2207/20081; G06T2207/20084; G06T2207/30196; G06V40/103; G06V10/44; G06V10/751; G06N3/048; G06N3/045; G06F18/241
Inventors: 王菡子, 王英
Owner: XIAMEN UNIV