People flow counting method and system

A technology relating to frame sequence numbers, applied in the field of object counting devices and computer-readable storage media, which addresses problems such as false detection and repeated counting, changes in human posture, environmental occlusion or scene asymmetry, and a large amount of computation

Active Publication Date: 2019-11-01
BEIJING BAIDU NETCOM SCI & TECH CO LTD

AI Technical Summary

Problems solved by technology

[0004] After deep-learning based people flow counting schemes were put into practice, the quality of image detection and tracking in real scenes improved greatly, and detection-and-tracking schemes that use convolutional neural networks to track the detection boxes have emerged. Although counting accuracy has improved, the amount of computation is much larger and the cost rises significantly.
[0005] In general, deep-learning based people counting schemes still face thorny technical problems such as false detection and repeated or invalid counting. False detections are usually caused by changes in human posture, environmental occlusion, or scene asymmetry, and in real-time video images the detections may exhibit the phenomenon of "flicke...



Examples


Embodiment 1

[0083] An embodiment of the present invention provides an object tracking method, the method comprising:

[0084] S1) Perform object detection on an image set having a frame sequence, in the order of the frame sequence; when a first image containing a first object set is detected in the image set, obtain, in the first image, a first local vector corresponding to the position of the first object set relative to the first image, and then obtain, in a second image of the image set containing a second object set, a second local vector corresponding to the position of the second object set relative to the second image, wherein the frame sequence number of the first image is smaller than that of the second image, and the first object set contains at least one object;
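
The patent does not pin down the exact form of a "local vector"; a minimal Python sketch of step S1, assuming the local vector of a detected object is its bounding-box centre normalised by the image size and stubbing out the detector itself, might look like this:

from typing import List, Tuple

Box = Tuple[float, float, float, float]   # (x1, y1, x2, y2) in pixels

def local_vectors(boxes: List[Box], width: int, height: int) -> List[Tuple[float, float]]:
    """Map each detection box to a position vector relative to the image (step S1)."""
    vectors = []
    for x1, y1, x2, y2 in boxes:
        cx = (x1 + x2) / 2.0 / width      # normalised centre x
        cy = (y1 + y2) / 2.0 / height     # normalised centre y
        vectors.append((cx, cy))
    return vectors

# Hypothetical detections from the first and second images of a 1280x720 stream.
first_vectors = local_vectors([(100, 200, 180, 400), (600, 150, 700, 420)], 1280, 720)
second_vectors = local_vectors([(110, 210, 190, 410), (620, 160, 720, 430)], 1280, 720)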

[0085] S2) Obtain a tracking value set according to the first local vector and the second local vector in combination with a preset tracking mapping relationship, and then obtain the tracking state set of the objects in the first object set according to the relationship between each tracking value in the tracking value set and a preset tracking threshold condition.
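
The claims leave the "tracking mapping relationship" and "tracking threshold condition" open. The hedged sketch below assumes the mapping is the Euclidean distance between a first and a second local vector, and that an object of the first set is "tracked" when its best match stays below a distance threshold; greedy nearest-neighbour matching stands in for whatever association the patent actually intends.

import math
from typing import Dict, List, Tuple

Vector = Tuple[float, float]

def tracking_values(first: List[Vector], second: List[Vector]) -> Dict[Tuple[int, int], float]:
    """Assumed mapping: Euclidean distance for every (first-object, second-object) pair."""
    return {(i, j): math.dist(a, b) for i, a in enumerate(first) for j, b in enumerate(second)}

def tracking_states(values: Dict[Tuple[int, int], float], threshold: float) -> Dict[int, str]:
    """Greedily match pairs in order of increasing distance; unmatched objects are 'lost'."""
    states: Dict[int, str] = {}
    used_second = set()
    for (i, j), value in sorted(values.items(), key=lambda item: item[1]):
        if i not in states and j not in used_second and value <= threshold:
            states[i] = "tracked"
            used_second.add(j)
    for i, _ in values:
        states.setdefault(i, "lost")
    return states

# Example: objects that moved only slightly between the two frames stay "tracked".
values = tracking_values([(0.1, 0.4), (0.5, 0.4)], [(0.11, 0.42), (0.52, 0.38)])
print(tracking_states(values, threshold=0.05))   # -> {0: 'tracked', 1: 'tracked'}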

Embodiment 2

[0137] Based on the object tracking method in Embodiment 1, the present invention also provides a method for counting objects by using the object tracking method, the method comprising:

[0138] S1) Acquire a tracking state set, wherein the tracking state set contains the tracking state of at least one object, a frame-sequence image set corresponding to the tracking state, and, for each frame image in the frame-sequence image set, the local vector corresponding to each object;
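
One possible concrete shape for this tracking state set, with field names chosen here for illustration rather than taken from the patent, is a per-object record that pairs a tracking state with the local vector the object had in every frame of the frame-sequence image set:

from dataclasses import dataclass, field
from typing import Dict, Tuple

Vector = Tuple[float, float]

@dataclass
class TrackedObject:
    object_id: int
    state: str                                 # e.g. "tracked" or "lost"
    vectors_by_frame: Dict[int, Vector] = field(default_factory=dict)  # frame index -> local vector

# A tracking state set with two objects observed over the first few frames.
tracking_state_set = [
    TrackedObject(0, "tracked", {0: (0.11, 0.42), 1: (0.12, 0.43), 2: (0.14, 0.45)}),
    TrackedObject(1, "tracked", {0: (0.51, 0.40), 1: (0.52, 0.38)}),
]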

[0139] S2) Select a partial image area of each frame image in the frame-sequence image set as a state identification area, and, in the frame order of the frame-sequence image set, compare the position of each local vector corresponding to each object in the tracking state set against the state identification area, so as to obtain an ordered set of object position information (a hedged sketch of this comparison appears after step S3);

[0140] S3) According to ...
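
Steps S2 and S3 are only partially visible here. Under the assumption that the state identification area is a horizontal counting line at a fixed normalised height, and that an object is counted once when its frame-ordered local vectors cross that line, one plausible sketch of the comparison and counting is:

from typing import Dict, Tuple

Vector = Tuple[float, float]
LINE_Y = 0.5                                   # assumed state identification area (normalised height)

def count_crossings(objects: Dict[int, Dict[int, Vector]]) -> Tuple[int, int]:
    """Return (entries, exits) by comparing each object's ordered local vectors with LINE_Y."""
    entries = exits = 0
    for vectors_by_frame in objects.values():
        ordered = [v for _, v in sorted(vectors_by_frame.items())]   # frame order
        for prev, curr in zip(ordered, ordered[1:]):
            if prev[1] < LINE_Y <= curr[1]:
                entries += 1                   # crossed the line moving down: counted as an entry
                break
            if prev[1] >= LINE_Y > curr[1]:
                exits += 1                     # crossed the line moving up: counted as an exit
                break
    return entries, exits

print(count_crossings({
    0: {0: (0.2, 0.30), 1: (0.2, 0.48), 2: (0.2, 0.62)},   # walks downwards: one entry
    1: {0: (0.7, 0.70), 1: (0.7, 0.55), 2: (0.7, 0.40)},   # walks upwards: one exit
}))   # -> (1, 1)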

Embodiment 3

[0148] Based on Embodiments 1 and 2, and as shown in Figure 1 and Figure 6, an embodiment of the present invention provides a system for counting objects by using the object tracking method. The system includes one or more servers and one or more detection ends; the server and the detection end exchange encrypted data over the Internet;

[0149] Figure 6 is a real image of an in-store shopping area taken from a people flow counting system built according to the present invention. The detection end is used to form the tracking state set data; the detection end has a charge-coupled device for image capture and a processor. The processor and the charge-coupled device may be located in the same device, such as a smart detection camera (with a certain amount of computing power); the processor may also be located in an independent detection-end server, with the charge-coupled device located in the camera and the camera transmitting image data to the detection-end se...
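
As a hedged illustration of this server / detection-end split, the snippet below has a detection end upload its tracking-state data to the counting server over an encrypted HTTPS connection. The endpoint URL, payload fields, and detector identifier are hypothetical; the embodiment only requires that the two sides exchange data over the Internet with encryption.

import json
import urllib.request

SERVER_URL = "https://counting-server.example.com/api/tracking-states"   # hypothetical endpoint

def upload_tracking_states(detector_id: str, states: list) -> int:
    """POST the detection end's tracking-state data to the server; TLS provides the encryption."""
    payload = json.dumps({"detector_id": detector_id, "states": states}).encode("utf-8")
    request = urllib.request.Request(
        SERVER_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status

# Example payload a smart detection camera might send each reporting cycle.
states = [{"object_id": 0, "state": "tracked", "vectors": [[0.11, 0.42], [0.12, 0.43]]}]
# upload_tracking_states("store-entrance-cam-01", states)   # needs a reachable server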



Abstract

The invention provides a people flow counting method and system, and belongs to the technical field of image processing. The method comprises the following steps: detecting objects in an image set with a frame sequence, according to the frame sequence; when a first image with a first object set is detected in the image set, obtaining, in the first image, a first local vector corresponding to the position of the first object set relative to the first image, and then obtaining, in a second image of the image set with a second object set, a second local vector corresponding to the position of the second object set relative to the second image; obtaining a tracking value set according to the first local vector and the second local vector in combination with a preset tracking mapping relationship; and obtaining a tracking state set of the objects in the first object set according to the relationship between each tracking value in the tracking value set and a preset tracking threshold condition. The invention achieves detection and tracking with a high recall rate, and on that basis realizes a high-precision, high-robustness people flow counting system.

Description

Technical Field
[0001] The invention relates to the technical field of image processing, in particular to an object tracking method, a method for counting objects using the object tracking method, a system for counting objects using the object tracking method, a device for counting objects, and a computer-readable storage medium.
Background Technique
[0002] With the continued maturing and deployment of computer vision, understanding human behavior in real scenes has increasingly become a focus for users. Taking security scenes as an example, counting the people passing by has become a core function; scenes such as retail stores, supermarkets and stations also need a passenger flow counting function, so as to provide feedback on the regularity with which customers arrive at a store and to measure queuing time at checkpoints. In actual people counting projects, there are often requirements that the solution...


Application Information

IPC(8): G06T7/246
CPC: G06T7/246; G06T2207/10016; G06T2207/30196; G06T2207/30242; G06T2207/30232
Inventor: 张成月
Owner: BEIJING BAIDU NETCOM SCI & TECH CO LTD