
Cross-camera pedestrian track matching method

A cross-camera pedestrian trajectory matching technology, applied in image enhancement, image analysis, image data processing, etc. It addresses the problems of measurement error and low tracking accuracy, and achieves the effect of solving trajectory point alignment.

Active Publication Date: 2017-06-23
SUN YAT SEN UNIV
Cites: 5 | Cited by: 8

AI Technical Summary

Problems solved by technology

[0008] To overcome the drawbacks of the above methods, namely low tracking accuracy or measurement errors introduced by viewing-angle differences during cross-camera tracking, the present invention provides a cross-camera pedestrian trajectory matching method. Instead of matching pedestrians across cameras by their visual appearance features, it determines whether trajectories from different cameras belong to the same pedestrian by matching the most similar trajectories, thereby solving the cross-camera pedestrian tracking problem. Furthermore, because the global motion pattern features are shared, the weights of each trajectory on the global motion patterns can be used directly to compute similarity, which solves the trajectory point alignment problem.
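A minimal sketch of this last point (the weight vectors below are randomly generated placeholders, not output of the patented method): trajectories of different lengths are both reduced to fixed-length weight vectors over the shared global motion patterns, so similarity is computed on the weights directly and no trajectory-point alignment is needed.

```python
import numpy as np

# Hypothetical illustration: two trajectories of different lengths (say 37 and
# 52 points) are each summarised by a weight vector over the same K shared
# global motion patterns, so they can be compared directly.
K = 10                                   # number of shared global motion patterns
rng = np.random.default_rng(0)

weights_a = rng.dirichlet(np.ones(K))    # weights of a trajectory with 37 points
weights_b = rng.dirichlet(np.ones(K))    # weights of a trajectory with 52 points

# Cosine distance between the two fixed-length weight vectors.
cosine_distance = 1.0 - (weights_a @ weights_b) / (
    np.linalg.norm(weights_a) * np.linalg.norm(weights_b)
)
print(f"cosine distance between the two weight vectors: {cosine_distance:.4f}")
```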


Examples


Embodiment 1

[0036] As shown in Figure 1, the method provided by the invention specifically comprises the following steps:

[0037] A pedestrian trajectory is represented as {x} = (x_1, x_2, ..., x_L), where x_i is an observation and L is the length of the trajectory. Each observation x_i consists of the position of the current point and the motion direction between adjacent points: the position takes values on a 160×120 grid, and the motion direction between adjacent points is quantized into four directions (up, down, left, right), so each observation is a three-dimensional vector, x_i ∈ R^3.
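A minimal sketch of this observation encoding, under stated assumptions: the 160×120 position range and the four direction bins come from the paragraph above; treating the input points as already lying on that grid, dropping the direction for the first point, and the helper name `encode_trajectory` are assumptions made only for illustration.

```python
import numpy as np

# Direction bins between adjacent points: up, down, left, right (encoded 0..3).
UP, DOWN, LEFT, RIGHT = 0, 1, 2, 3

def encode_trajectory(points, width=160, height=120):
    """Turn a raw trajectory [(px, py), ...] into observations x_i.

    Each observation is a 3-d vector (grid_x, grid_y, direction): the position
    is clipped to a width x height grid (assumed to already be in that range),
    and the direction is the dominant motion between adjacent points,
    quantized to four bins.
    """
    observations = []
    for i in range(1, len(points)):
        px, py = points[i]
        dx = points[i][0] - points[i - 1][0]
        dy = points[i][1] - points[i - 1][1]
        if abs(dx) >= abs(dy):                    # horizontal motion dominates
            direction = RIGHT if dx >= 0 else LEFT
        else:                                     # vertical motion dominates
            direction = DOWN if dy >= 0 else UP   # image y grows downwards
        gx = int(np.clip(px, 0, width - 1))
        gy = int(np.clip(py, 0, height - 1))
        observations.append((gx, gy, direction))
    return np.asarray(observations)               # shape (L-1, 3), each x_i in R^3

# Example: a short trajectory moving right and then down.
traj = [(10, 20), (14, 21), (18, 22), (18, 27)]
print(encode_trajectory(traj))
```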

[0038] On the above basis, the method provided by the invention comprises the following steps:

[0039] S1. Extract a pedestrian trajectory of the target camera as the target trajectory, and take all trajectories of the remaining cameras within the same time period as candidate trajectories; ...
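A sketch of this candidate-selection step. The `Trajectory` record, its timestamp fields, and the time-overlap test are illustrative assumptions; the patent only states that the candidates are the other cameras' trajectories within the time period.

```python
from dataclasses import dataclass, field

@dataclass
class Trajectory:
    camera_id: int
    start_time: float
    end_time: float
    observations: list = field(default_factory=list)   # the x_i from paragraph [0037]

def select_candidates(target, all_trajectories):
    """Step S1 (sketch): keep every trajectory recorded by a camera other than
    the target's whose time span overlaps the target trajectory's time span."""
    return [
        t for t in all_trajectories
        if t.camera_id != target.camera_id
        and t.start_time <= target.end_time
        and t.end_time >= target.start_time
    ]
```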



Abstract

The invention relates to a cross-camera pedestrian track matching method. The method comprises the following steps: S1, extracting one pedestrian track of a target camera as the target track, and taking all pedestrian tracks of the other cameras within the same time period as candidate tracks; S2, training a hierarchical Dirichlet process using the Chinese restaurant franchise process, extracting the global motion pattern features of all tracks, and obtaining the feature weights of the target track and of each candidate track on the global motion patterns; and S3, calculating the cosine distance between the feature weights of the target track and those of each candidate track as the similarity measure, and selecting the candidate track with the smallest cosine distance as the matching track of the target track.
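A hedged sketch of step S3, assuming step S2 (training the hierarchical Dirichlet process via the Chinese restaurant franchise) has already produced one weight vector per track over the shared global motion patterns; only the cosine-distance matching is implemented here, and the example weight vectors are made up for illustration.

```python
import numpy as np
from scipy.spatial.distance import cosine   # cosine distance = 1 - cosine similarity

def match_trajectory(target_weights, candidate_weights):
    """Step S3 (sketch): return the index of the candidate track whose
    global-motion-pattern weight vector has the smallest cosine distance to
    the target track's weight vector, along with all distances."""
    distances = [cosine(target_weights, w) for w in candidate_weights]
    return int(np.argmin(distances)), distances

# Toy example with hypothetical 5-pattern weight vectors (assumed to be the
# precomputed output of the hierarchical Dirichlet process training, step S2).
target = np.array([0.60, 0.20, 0.10, 0.05, 0.05])
candidates = [
    np.array([0.10, 0.70, 0.10, 0.05, 0.05]),
    np.array([0.55, 0.25, 0.10, 0.05, 0.05]),   # closest to the target
    np.array([0.20, 0.20, 0.20, 0.20, 0.20]),
]
best_index, distances = match_trajectory(target, candidates)
print(best_index, [round(d, 3) for d in distances])
```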

Description

Technical field
[0001] The present invention relates to the technical field of image processing, and more specifically, to a cross-camera pedestrian trajectory matching method.
Background technique
[0002] With the development of computer technology and image processing technology, video-based intelligent monitoring systems have been widely used, among which pedestrian tracking technology plays a huge role in commercial passenger flow analysis, social public security monitoring and other fields. Due to the limited field of view of a single camera, in order to expand the field of view and realize long-distance tracking of pedestrians, pedestrian tracking technology based on multiple cameras has attracted much attention. A key technical problem in realizing multi-camera pedestrian tracking is matching and associating pedestrians within the field of view of different cameras.
[0003] The traditional pedestrian tracking technology mainly relies on the apparent features of ped...

Claims


Application Information

IPC(8): G06T7/292, G06T7/246, G06K9/62
CPC: G06T2207/30241, G06F18/22
Inventor: 潘子潇, 谢晓华, 尹冬生
Owner: SUN YAT SEN UNIV