
A Cross-camera Pedestrian Trajectory Matching Method

A cross-camera trajectory matching technology, applied in the field of cross-camera pedestrian tracking, which addresses problems such as measurement errors and low tracking accuracy and achieves the effect of solving trajectory point alignment.

Active Publication Date: 2019-10-15
SUN YAT SEN UNIV

AI Technical Summary

Problems solved by technology

[0008] In order to overcome the defects of the above methods, namely low tracking accuracy or the introduction of measurement errors caused by differences in viewing angles during cross-camera tracking, the present invention provides a cross-camera pedestrian trajectory matching method. Instead of matching pedestrians across cameras by their visual appearance features, the method determines whether trajectories from different cameras belong to the same pedestrian by matching the most similar trajectories, which solves the problem of cross-camera pedestrian tracking. Moreover, because the global motion pattern features are shared, the weights of each trajectory on the global motion patterns can be used directly to compute similarity, which solves the problem of trajectory point alignment.
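To illustrate why shared global motion patterns remove the need to align trajectory points, the following sketch (a deliberately simplified stand-in, not the patent's hierarchical Dirichlet process procedure) maps trajectories of different lengths onto fixed-length weight vectors over a common set of patterns, which can then be compared directly; the symbol encoding and histogram weighting are assumptions for illustration only.

```python
import numpy as np

# Simplified stand-in for shared global motion patterns: each "pattern" is one
# discrete observation symbol, and a trajectory's weight vector is its
# normalized histogram over those symbols. The real method learns the patterns
# with an HDP; this only shows why trajectories of different lengths become
# directly comparable once they share the same pattern set.

def pattern_weights(trajectory_symbols, num_patterns):
    """trajectory_symbols: list of ints in [0, num_patterns); returns a fixed-length weight vector."""
    hist = np.bincount(trajectory_symbols, minlength=num_patterns).astype(float)
    return hist / hist.sum()

# Two trajectories of different lengths, e.g. symbol sequences from two cameras.
w_a = pattern_weights([0, 0, 1, 2, 2, 2], num_patterns=4)   # length 6
w_b = pattern_weights([0, 1, 2, 2], num_patterns=4)         # length 4

# Both weight vectors have the same dimension, so no point-wise alignment is needed.
similarity = np.dot(w_a, w_b) / (np.linalg.norm(w_a) * np.linalg.norm(w_b))
print(similarity)
```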

Method used



Examples


Embodiment 1

[0036] As shown in Figure 1, the method provided by the present invention specifically comprises the following steps:

[0037] A pedestrian trajectory is represented as x = (x_1, x_2, ..., x_L), where x_i denotes an observation and L is the length of the trajectory. Each observation x_i is composed of the position of the current point and the motion direction between adjacent points: the position takes values in a 160×120 grid, and the motion direction between adjacent points is quantized into four directions (up, down, left, right), so each observation is a three-dimensional vector, x_i ∈ R^3.
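As a purely illustrative reading of this representation, the sketch below converts a raw pixel trajectory into the three-dimensional observations described in [0037]; the frame size, the grid mapping, the direction-quantization rule, and all names are assumptions, not taken from the patent text.

```python
import numpy as np

# Hypothetical conversion of a pixel trajectory into 3-D observations:
# (grid_x, grid_y) position in a 160x120 grid plus a motion direction
# quantized to {up, down, left, right}.
DIRECTIONS = {"right": 0, "left": 1, "up": 2, "down": 3}

def to_observations(points, frame_w=640, frame_h=480, grid_w=160, grid_h=120):
    """points: list of (px, py) pixel coordinates of one pedestrian over time."""
    obs = []
    for i in range(1, len(points)):
        px, py = points[i]
        # Quantize the position into the 160x120 grid mentioned in [0037].
        gx = min(int(px * grid_w / frame_w), grid_w - 1)
        gy = min(int(py * grid_h / frame_h), grid_h - 1)
        # Quantize the motion between adjacent points into 4 directions
        # (image coordinates: y grows downwards).
        dx = points[i][0] - points[i - 1][0]
        dy = points[i][1] - points[i - 1][1]
        if abs(dx) >= abs(dy):
            d = DIRECTIONS["right"] if dx >= 0 else DIRECTIONS["left"]
        else:
            d = DIRECTIONS["down"] if dy >= 0 else DIRECTIONS["up"]
        obs.append(np.array([gx, gy, d]))  # each observation x_i is in R^3
    return obs
```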

[0038] On the above basis, the method provided by the invention comprises the following steps:

[0039] S1. Extract a pedestrian trajectory of the target camera as the target trajectory, and then use all the trajectories of the remaining cameras within the time period as candidate trajectories; ...
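A minimal sketch of step S1, assuming each camera stores its tracked trajectories with start and end timestamps; the data layout and the time-overlap test are illustrative assumptions rather than details given in the patent.

```python
def select_candidates(target, other_cameras):
    """target: dict with 'start' and 'end' timestamps;
    other_cameras: list of trajectory lists, one list per remaining camera."""
    candidates = []
    for cam_tracks in other_cameras:
        for trk in cam_tracks:
            # Keep trajectories whose time span overlaps the target's time period.
            if trk["start"] <= target["end"] and trk["end"] >= target["start"]:
                candidates.append(trk)
    return candidates
```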



Abstract

The present invention relates to a cross-camera pedestrian trajectory matching method comprising the following steps: S1, extract a pedestrian trajectory from the target camera as the target trajectory, and take all pedestrian trajectories that appear in the other cameras within the same time period as candidate trajectories; S2, train a hierarchical Dirichlet process using the Chinese restaurant franchise sampling scheme to extract the global motion pattern features of all trajectories, and simultaneously obtain the feature weights of the target trajectory and of each candidate trajectory on the global motion patterns; S3, compute the cosine distance between the feature weights of the target trajectory and those of each candidate trajectory as the similarity measure, and take the candidate trajectory with the smallest cosine distance as the matching trajectory of the target trajectory.
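The sketch below covers only step S3 of the abstract: given the feature weights already produced by the trained hierarchical Dirichlet process (training is not shown here), it computes cosine distances and selects the closest candidate. The function names and input format are assumptions; the weight vectors share one dimensionality because they are defined over the same global motion patterns.

```python
import numpy as np

def cosine_distance(a, b):
    """1 - cosine similarity between two feature-weight vectors of equal length."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return 1.0 - np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def match_trajectory(target_weights, candidate_weights):
    """Return (index, distance) of the candidate trajectory closest to the target (step S3)."""
    distances = [cosine_distance(target_weights, w) for w in candidate_weights]
    best = int(np.argmin(distances))
    return best, distances[best]

# Example usage with made-up weight vectors over 4 global motion patterns.
idx, dist = match_trajectory([0.5, 0.3, 0.2, 0.0],
                             [[0.1, 0.1, 0.4, 0.4], [0.45, 0.35, 0.2, 0.0]])
print(idx, dist)  # the second candidate is the match
```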

Description

Technical field

[0001] The present invention relates to the technical field of image processing, and more specifically, to a cross-camera pedestrian trajectory matching method.

Background technique

[0002] With the development of computer technology and image processing technology, video-based intelligent monitoring systems have been widely used, among which pedestrian tracking technology plays a major role in fields such as commercial passenger-flow analysis and public security monitoring. Because the field of view of a single camera is limited, pedestrian tracking technology based on multiple cameras has attracted much attention as a way to expand the field of view and realize long-distance tracking of pedestrians. A key technical problem in realizing multi-camera pedestrian tracking is matching and associating pedestrians within the fields of view of different cameras.

[0003] Traditional pedestrian tracking technology mainly relies on the apparent features of ped...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06T7/292, G06T7/246, G06K9/62
CPC: G06T2207/30241, G06F18/22
Inventor: 潘子潇, 谢晓华, 尹冬生
Owner: SUN YAT SEN UNIV