
Multi-target tracking method based on cross-task mutual learning, terminal equipment and medium

A multi-target tracking method in the field of neural learning, image analysis, and image enhancement. It addresses the misalignment between the features required for detection and those required for re-identification, and achieves efficient, high-accuracy multi-target tracking.

Active Publication Date: 2022-03-25
慧镕电子系统工程股份有限公司

AI Technical Summary

Problems solved by technology

However, this joint network still faces several problems: (1) the features required for detection and re-identification are misaligned: the former needs boundary-aware information while the latter needs identity-aware information, so ensuring that each task obtains the information it needs is very important; (2) for the re-identification task, information about other targets and the environment surrounding a target is crucial for enhancing the discriminability of the target itself; (3) detection and re-identification can complement each other at the task level, and how to achieve this complementarity is the main problem in building a more efficient multi-branch model.
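Problem (1) above, feature misalignment, motivates splitting the shared backbone features into two task-specific embeddings. A minimal numpy sketch of such a decoupling, using 1x1-convolution-style channel projections (the weights here are random placeholders; in the patent's method they are learned):

```python
import numpy as np

rng = np.random.default_rng(0)

def decouple_features(shared, w_det, w_reid):
    """Project a shared backbone feature map (C, H, W) into two
    task-specific embeddings via 1x1-conv-style channel mixes:
    a boundary-aware one for detection and an identity-aware one
    for re-identification. Weights are hypothetical placeholders."""
    c, h, w = shared.shape
    flat = shared.reshape(c, h * w)            # (C, HW)
    det = (w_det @ flat).reshape(-1, h, w)     # boundary-aware embedding
    reid = (w_reid @ flat).reshape(-1, h, w)   # identity-aware embedding
    return det, reid

# toy shared feature map: 64 channels on an 8x8 grid
shared = rng.standard_normal((64, 8, 8))
w_det = rng.standard_normal((64, 64)) * 0.1    # detection projection
w_reid = rng.standard_normal((128, 64)) * 0.1  # re-ID projection
det_emb, reid_emb = decouple_features(shared, w_det, w_reid)
print(det_emb.shape, reid_emb.shape)  # (64, 8, 8) (128, 8, 8)
```

The point of the split is that each downstream head then receives an embedding tuned to its own need, rather than competing over one shared representation.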

Method used




Detailed Description of the Embodiments

[0051] The present invention is further described below in conjunction with the accompanying drawings and specific embodiments, so that those skilled in the art can better understand and implement it; the examples given are not intended to limit the invention.

[0052] As shown in Figure 2, the overall framework of the multi-target tracking method based on cross-task mutual learning in an embodiment of the present invention takes a single frame of a video sequence as input, and the model comprises three parts: a feature extraction network, a mutual-learning head network, and data association. The feature decoupling module (CBD) acts on the feature extraction network, and the cross-direction Transformer (CDF) acts on the mutual-learning head network. The detection branch and the re-identification branch interact through the exchange of statistical information to obtain optimized outputs. ...
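The cross-direction Transformer mentioned above attends along the two spatial axes of the feature map. A minimal numpy stand-in, using axial self-attention with identity Q/K/V projections to keep the sketch short (the real CDF is a learned module; this only illustrates the horizontal-then-vertical attention pattern):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def axial_attention(feat):
    """Self-attention applied separately along the horizontal and
    vertical axes of a (C, H, W) feature map, a simplified stand-in
    for a cross-direction Transformer block."""
    c, h, w = feat.shape
    # horizontal pass: each row attends over its W positions
    rows = feat.transpose(1, 2, 0)                       # (H, W, C)
    attn_h = softmax(rows @ rows.transpose(0, 2, 1) / np.sqrt(c))
    out = attn_h @ rows                                  # (H, W, C)
    # vertical pass: each column attends over its H positions
    cols = out.transpose(1, 0, 2)                        # (W, H, C)
    attn_v = softmax(cols @ cols.transpose(0, 2, 1) / np.sqrt(c))
    return (attn_v @ cols).transpose(2, 1, 0)            # back to (C, H, W)

feat = np.random.default_rng(1).standard_normal((16, 4, 6))
out = axial_attention(feat)
print(out.shape)  # (16, 4, 6)
```

Splitting attention by axis keeps the cost linear in H + W per position instead of quadratic in H * W, which is why axial designs are common for dense prediction heads.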



Abstract

The invention discloses a multi-target tracking method based on cross-task mutual learning, comprising the following steps:
S1: read the current-frame RGB image of a video sequence;
S2: extract general features from the current-frame RGB image using a backbone network;
S3: perform feature decoupling on the general features to obtain a detection-aligned embedding and a re-identification-aligned embedding;
S4: obtain a basic detection output from the detection-aligned embedding;
S5: use a cross-direction Transformer to obtain a re-identification output from the re-identification-aligned embedding;
S6: perform cross-task interactive learning on the basic detection output and the re-identification output to obtain the post-interaction detection and re-identification outputs;
S7: obtain target bounding boxes on the current frame from the post-interaction detection output, and query re-identification embeddings from the post-interaction re-identification output at the box center points;
S8: associate the current frame with the previous frame using the re-identification embeddings.
According to the invention, efficient multi-target tracking can be realized in complex scenes.
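Step S8 associates detections across frames by comparing re-identification embeddings. A minimal sketch of one common realisation, greedy matching on cosine similarity (the threshold value and greedy strategy are assumptions for illustration, not the patent's stated association rule):

```python
import numpy as np

def associate(prev_emb, curr_emb, threshold=0.5):
    """Greedy inter-frame association on L2-normalised re-ID
    embeddings: match each current detection to the most similar
    unused previous track above a cosine-similarity threshold.
    Returns {current index: previous index}."""
    prev = prev_emb / np.linalg.norm(prev_emb, axis=1, keepdims=True)
    curr = curr_emb / np.linalg.norm(curr_emb, axis=1, keepdims=True)
    sim = curr @ prev.T                        # (N_curr, N_prev)
    matches, used = {}, set()
    for i in np.argsort(-sim.max(axis=1)):     # most confident first
        j = int(np.argmax(sim[i]))
        if sim[i, j] >= threshold and j not in used:
            matches[int(i)] = j
            used.add(j)
    return matches

# two tracks in the previous frame; the current frame sees them swapped
prev = np.array([[1.0, 0.0], [0.0, 1.0]])
curr = np.array([[0.1, 0.9], [0.9, 0.1]])
print(associate(prev, curr))  # {0: 1, 1: 0}
```

In practice, greedy matching is often replaced by Hungarian (optimal assignment) matching when track counts grow, since greedy choices can cascade into identity switches.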

Description

Technical Field
[0001] The invention relates to the technical field of computer vision, and in particular to a multi-target tracking method based on cross-task mutual learning.
Background
[0002] With extensive theoretical research on computer vision and growing business needs in practical scenarios, object tracking, and especially multi-object tracking, has gradually become an important field. Due to the complexity of real scenes and the variability of pedestrian motion, many problems in pedestrian multi-target tracking remain unsolved. At present, multi-target tracking methods are mainly divided into two-stage methods, which first detect targets and then extract target features, and single-stage methods, which extract detection and re-identification features simultaneously. Before multi-task joint learning became popular, most research on multi-target tracking was two-stage, but both stages of the two-stage method need to use deep neural networks to ...

Claims


Application Information

IPC(8): G06T7/246; G06N3/04; G06N3/08
CPC: G06T7/246; G06N3/08; G06T2207/10016; G06T2207/20081; G06T2207/20084; G06N3/048; G06N3/045
Inventors: 蒋敏, 周晨, 孔军
Owner 慧镕电子系统工程股份有限公司