A cross-camera pedestrian positioning method integrated with a space-time model

A space-time model combined with cross-camera technology, applied in the field of pedestrian positioning; it solves the problems of inaccurate modeling results and poor real-time performance in pedestrian re-identification, and achieves a complete and reliable model, increased fault tolerance, and increased accuracy

Active Publication Date: 2019-04-02
CHENGDU SOBEY DIGITAL TECH CO LTD

AI Technical Summary

Problems solved by technology

[0011] The purpose of the present invention is to solve the problem that existing cross-camera pedestrian positioning methods produce inaccurate modeling results, which leads to poor real-time performance in pedestrian re-identification. To this end, the present invention provides a cross-camera pedestrian positioning method integrated with a spatio-temporal model.



Examples


Embodiment 1

[0067] As shown in Figure 1, this embodiment provides a cross-camera pedestrian positioning method integrated with a spatio-temporal model, comprising the following steps:

[0068] S1. Establish the spatio-temporal model: carry out comprehensive indoor and outdoor scene modeling, as well as in-scene camera modeling, for the camera deployment area in the positioning space, so that the system has complete pedestrian perception and path-planning capabilities;
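The patent does not give concrete data structures for this model, so the following is only a minimal sketch, assuming the scene and camera models reduce to a camera adjacency graph over which routes can be enumerated; all names here (`SpatioTemporalModel`, `camera_graph`, `plan_paths`) are illustrative, not from the source.

```python
from collections import deque
from dataclasses import dataclass, field


@dataclass
class SpatioTemporalModel:
    # Camera adjacency derived from the indoor/outdoor scene geometry
    # (assumed representation; the patent only requires that the model
    # supports pedestrian perception and path planning).
    camera_graph: dict[str, list[str]] = field(default_factory=dict)

    def plan_paths(self, src: str, dst: str) -> list[list[str]]:
        """Enumerate simple camera-to-camera routes from src to dst,
        a stand-in for the path-planning capability required by S1."""
        paths, queue = [], deque([[src]])
        while queue:
            path = queue.popleft()
            if path[-1] == dst:
                paths.append(path)
                continue
            for nxt in self.camera_graph.get(path[-1], []):
                if nxt not in path:  # keep paths simple (no revisits)
                    queue.append(path + [nxt])
        return paths
```

For instance, `SpatioTemporalModel({"A": ["B", "C"], "B": ["D"], "C": ["D"]}).plan_paths("A", "D")` returns both two-hop routes, which is the kind of route enumeration the later steps consume.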

[0069] S2. Acquire the pedestrian trajectory: once a pedestrian is designated in the initial camera image, acquire the walking trajectory of the designated pedestrian within the initial camera image;
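A sketch of one way S2 could be realized, assuming an external per-frame pedestrian detector is available; `detect_pedestrians` and its box format are hypothetical placeholders, not the patent's method.

```python
def acquire_trajectory(frames, target_id, detect_pedestrians):
    """Collect the designated pedestrian's image positions frame by frame.

    detect_pedestrians(frame) is assumed to yield tuples
    (pedestrian_id, x, y, w, h) for each detected bounding box.
    """
    trajectory = []
    for t, frame in enumerate(frames):
        for pid, x, y, w, h in detect_pedestrians(frame):
            if pid == target_id:
                # Record the bottom-centre of the box as the foot point,
                # the point later projected into scene coordinates.
                trajectory.append((t, x + w / 2.0, y + h))
    return trajectory
```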

[0070] S3. Select the associated camera: after the designated pedestrian walks out of the initial camera's view, he or she appears in the image of a next camera, which is set as the associated camera; the associated camera is selected through different strategies according to the designated pedestrian's different travel routes; ...
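The step only says that different strategies are used for different travel routes; a minimal sketch of such a dispatch might look as follows, where the strategy table and the exit labels are assumptions for illustration.

```python
def select_associated_cameras(initial_cam, exit_label, exit_strategies, camera_graph):
    """Return candidate associated cameras for the pedestrian's travel route.

    exit_strategies: assumed mapping {camera_id: {exit_label: [camera_id, ...]}}
    built from the spatio-temporal model. Falls back to all cameras adjacent
    to the initial camera when no route-specific strategy is defined.
    """
    per_camera = exit_strategies.get(initial_cam, {})
    return per_camera.get(exit_label, camera_graph.get(initial_cam, []))
```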

Embodiment 2

[0075] This embodiment is a further optimization of Embodiment 1. Specifically, the comprehensive indoor and outdoor scene modeling in S1 includes outdoor scene modeling and indoor scene modeling, which together form a complete model of the building's interior and exterior; this model provides basic information for subsequent accurate pedestrian tracking and contains complete space-time information to support path planning (a data-structure sketch follows the two modeling descriptions below). Among them:

[0076] Outdoor scene modeling: use a general map service to extract the outdoor map model of the positioning space, including the GPS coordinates of each point in the map, the block types, and the block parameters;

[0077] Indoor scene modeling: use BIM modeling tools to build a BIM model of the building interior; the modeled objects include beams, columns, panels, walls, stairs, elevators, doors, etc.;
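Paragraphs [0076] and [0077] list what each scene model must contain; the dataclasses below are one illustrative encoding of exactly those items, with all field names being assumptions.

```python
from dataclasses import dataclass


@dataclass
class OutdoorBlock:
    """One block of the outdoor map model extracted from a map service."""
    block_type: str                          # e.g. "road", "lawn", "square"
    gps_polygon: list[tuple[float, float]]   # (latitude, longitude) vertices
    parameters: dict                         # block-specific parameters


@dataclass
class BimElement:
    """One modeled object in the indoor BIM model."""
    category: str      # "beam", "column", "panel", "wall", "stair", "elevator", "door"
    element_id: str
    geometry: dict     # placeholder for the element's BIM geometry
```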

[0078] As shown in Figure 2, the in-scene camera modeling in S1 includes describing the camera attributes and the visual-area mapping, where...
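The paragraph is truncated before the details, so the following only sketches the general idea of visual-area mapping under the common assumption that image points are related to ground-plane coordinates by a homography; nothing in this block is confirmed by the source.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class CameraAttributes:
    camera_id: str
    position: tuple[float, float, float]  # camera location in scene coordinates
    homography: np.ndarray                # assumed 3x3 image-to-ground mapping

    def image_to_ground(self, u: float, v: float) -> tuple[float, float]:
        """Map an image pixel onto the ground plane of the scene model."""
        p = self.homography @ np.array([u, v, 1.0])
        return (p[0] / p[2], p[1] / p[2])
```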



Abstract

The invention discloses a cross-camera pedestrian positioning method integrated with a space-time model, and relates to the technical field of pedestrian positioning. The method comprises the following steps: S1, building the space-time model; S2, acquiring a pedestrian trajectory; S3, selecting an associated camera; S4, planning the designated pedestrian's paths; S5, calculating the walking time of the designated pedestrian along each path planned in step S4; S6, pedestrian re-identification: detecting pedestrians within the time window of each associated camera, and inputting the designated pedestrian from the initial camera together with the pedestrians from the associated cameras into a pedestrian re-identification model; if a pedestrian with similarity greater than a threshold exists, the pedestrian is successfully positioned. The method comprehensively models the cameras and the indoor and outdoor scenes in which they are deployed, so the established model is complete and reliable; using this model, reasonable path planning is performed on the pedestrian's walking trajectory, which improves the accuracy of pedestrian path prediction.
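A compact sketch of how S5 and S6 could fit together: each planned path yields an expected arrival window, and only pedestrians detected inside that window are compared against the designated pedestrian. The cosine similarity, the 0.8 threshold, and the tolerance margin are assumptions; the abstract only requires "similarity greater than a threshold".

```python
import numpy as np


def time_window(walk_time_s: float, tolerance_s: float = 5.0) -> tuple[float, float]:
    """S5: expected arrival window (seconds after leaving the initial camera)
    for one planned path; the tolerance margin is an assumed value."""
    return (max(0.0, walk_time_s - tolerance_s), walk_time_s + tolerance_s)


def reidentify(query_feature, candidates, threshold: float = 0.8):
    """S6: compare the designated pedestrian's feature vector against the
    pedestrians detected in the associated camera's time window; return the
    best match whose similarity exceeds the threshold, or None."""
    best_id, best_sim = None, threshold
    q = query_feature / np.linalg.norm(query_feature)
    for pid, feature in candidates:
        sim = float(q @ (feature / np.linalg.norm(feature)))  # cosine similarity
        if sim > best_sim:
            best_id, best_sim = pid, sim
    return best_id
```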

Description

Technical field

[0001] The present invention relates to the technical field of pedestrian positioning, and more specifically to a cross-camera pedestrian positioning method integrated with a spatio-temporal model.

Background technique

[0002] In the field of security monitoring, multiple cameras are often deployed in an area, and the cameras' fields of view complement each other so that the entire area is monitored at the same time. When pedestrians walk through this area, they pass through the fields of view of different cameras. How to predict the path a pedestrian will take and the time at which the pedestrian will reach the next camera is a significant topic in pedestrian re-identification systems.

[0003] In the past, cross-camera pedestrian positioning methods generally relied only on the similarity of pedestrian features in the video to locate pedestrians. The range of cameras to be searched and the time window were large, so manual screening was also required. In terms of processing...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06T7/292
CPC: G06T7/292; G06T2207/30196; G06T2207/10016; G06V40/10; Y02T10/40
Inventors: 温序铭, 罗志伟, 管健, 王炜
Owner: CHENGDU SOBEY DIGITAL TECH CO LTD