Multi-camera multi-target tracking method

A multi-camera multi-target tracking technology, applied in the field of multi-camera multi-target tracking, which addresses the problems of large cumulative error and of low fusion accuracy caused by low camera resolution, so as to improve efficiency and effectiveness, ensure real-time performance, and optimize speed

Publication Date: 2019-09-10 (Inactive)
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

[0006] The present invention provides a multi-camera multi-target tracking method, which is used to solve the technical problems in existing cross-camera target tracking that a large cumulative error arises when single-camera track segment fusion is performed first and cross-camera track fusion is performed afterwards, and that fusion accuracy is low due to the low resolution of the cameras.


Examples


Embodiment 1

[0042] A multi-camera multi-target tracking method 100, as shown in Figure 1, includes:

[0043] Step 110, obtaining the initial trajectory segment set collected by multiple cameras;

[0044] Step 120, performing equalization processing on the trajectory segment similarity distribution of the initial trajectory segment set to obtain a preprocessed trajectory segment set;

[0045] Step 130, based on the pre-processed track segment set, fusing each pre-processed track segment to obtain the complete track of each target, completing the multi-camera multi-target tracking.
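The equalization in Step 120 is only named in the text shown here, not specified, so the following is a minimal sketch of one plausible realization: per-row standardization of a pairwise track-segment similarity matrix so that the similarities of segments belonging to a single target end up on a comparable scale across cameras. The function name equalize_similarity and the choice of z-score normalization are illustrative assumptions, not the patented procedure.

```python
import numpy as np

def equalize_similarity(sim: np.ndarray) -> np.ndarray:
    """Equalize a pairwise track-segment similarity matrix (cf. Step 120).

    sim[i, j] is the raw similarity between track segments i and j,
    pooled over all cameras. Per-row z-score normalization (an assumed
    stand-in for the equalization named in the patent text) puts every
    segment's similarities on a comparable scale, so the similarities of
    a single target's segments sit at the same level across cameras.
    """
    sim = sim.astype(np.float64)
    mean = sim.mean(axis=1, keepdims=True)
    std = sim.std(axis=1, keepdims=True) + 1e-8   # avoid division by zero
    equalized = (sim - mean) / std
    # Symmetrize so the result remains a valid pairwise similarity matrix.
    return 0.5 * (equalized + equalized.T)
```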

[0046] This embodiment adopts a one-step approach, but it differs from existing methods that fuse directly on image data: the present invention puts the trajectory segments from all cameras together for trajectory fusion, performing a global optimization that better handles the problem that an error in single-camera multi-target tracking would otherwise be further amplified in the subsequent cross-camera fusion.
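The global fusion of Step 130 is likewise only summarized in the text shown, so the sketch below is a simple stand-in: cluster all segments from all cameras in a single pass over the equalized similarity matrix and read each cluster as one target's complete cross-camera trajectory. The use of average-linkage agglomerative clustering and the cut threshold are illustrative assumptions, not the procedure disclosed in the claims.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

def fuse_segments(equalized_sim: np.ndarray, cut: float = 1.0) -> dict:
    """Fuse track segments from all cameras in one global pass (cf. Step 130).

    Converts the equalized similarity matrix into a distance matrix and
    applies average-linkage agglomerative clustering; each resulting
    cluster is interpreted as the complete cross-camera trajectory of one
    target. The clustering method and `cut` threshold are assumptions
    made for illustration only.
    """
    dist = equalized_sim.max() - equalized_sim       # similarity -> distance
    np.fill_diagonal(dist, 0.0)
    condensed = squareform(dist, checks=False)       # condensed form for linkage
    labels = fcluster(linkage(condensed, method="average"),
                      t=cut, criterion="distance")
    targets = {}
    for seg_idx, target_id in enumerate(labels):
        targets.setdefault(int(target_id), []).append(seg_idx)
    return targets   # {target id: indices of its segments across all cameras}
```

In this sketch the two pieces compose as fuse_segments(equalize_similarity(raw_sim)), where raw_sim would be built from whatever appearance and motion cues the single-camera tracker provides.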

Embodiment 2

[0101] A storage medium storing instructions which, when read by a computer, cause the computer to execute any one of the above multi-camera multi-target tracking methods.

[0102] The relevant technical solutions are the same as those in Embodiment 1, and will not be repeated here.



Abstract

The invention discloses a multi-camera multi-target tracking method. The method comprises the steps of: obtaining an initial track segment set collected by multiple cameras; carrying out equalization processing on the track segment similarity distribution of the initial track segment set to obtain a pre-processed track segment set; and, based on the pre-processed track segment set, fusing each pre-processed track segment to obtain the complete track of each target, completing the multi-camera multi-target tracking. By adopting a one-step approach, the invention puts the track segments from the multiple cameras together for track fusion; moreover, when the tracks are fused, equalization processing is carried out on the similarity distribution of all track segments, ensuring that the similarities of the track segments corresponding to a single target are at the same level. This improves the efficiency and effectiveness of multi-camera multi-target tracking and, to a certain extent, solves the cumulative-error problem that arises when single-camera track segment fusion is carried out first and cross-camera track fusion afterwards.

Description

Technical field

[0001] The invention belongs to the technical field of target tracking, and in particular relates to a multi-camera multi-target tracking method.

Background technique

[0002] Accurate target tracking results allow a system to analyze target trajectories and behaviors more accurately, and help identify and retrieve specific targets. Driven by the needs of actual large-scale surveillance scenarios, multi-camera-based multi-target tracking methods have attracted more and more researchers' attention over the past decade. Multi-camera target tracking refers to the joint analysis of data from multiple cameras to obtain the trajectories of all targets of interest across those cameras. It effectively solves the problem of the limited field of view of a single camera. At the same time, simultaneous tracking of multiple targets enables real-time automatic monitoring across the whole scene and analysis of the trajectory and ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/292, G06T7/277
CPC: G06T2207/10016, G06T7/277, G06T7/292
Inventor: 桑农, 史广亚, 高常鑫, 熊月
Owner: HUAZHONG UNIV OF SCI & TECH