Kinect-based people counting method

A people-flow counting method applied in computation, image data processing, computer components, etc. It addresses the problems of reduced people-counting accuracy, difficult image matching, and loss of tracking targets, and achieves the effects of simple equipment and high counting accuracy.

Publication status: Inactive
Publication date: 2016-06-29
SHANGHAI UNIV

AI Technical Summary

Problems solved by technology

When the human body rotates or is occluded by other objects, the features of the human-body region cannot be obtained from the video image, and the tracking target is lost.

[0005] Human-body tracking and counting methods based on stereo vision acquired synchronously by multiple cameras track and count people using the spatial three-dimensional information of the human body. However, owing to factors such as the geometric shape of objects in the scene, noise interference, and camera characteristics, images of the same subject captured from different viewpoints can differ greatly, and it is difficult to match images from different viewpoints unambiguously. As a result, part of the spatial information of the 3D scene is lost or wrong, the features of all human-body regions cannot be obtained from the captured video images, tracking targets are lost, and people-counting accuracy declines.

Detailed Description of the Embodiments

[0018] The Kinect-based people counting method of the present invention will be described in further detail below in conjunction with the accompanying drawings.

[0019] As shown in Figure 1, the hardware used by the above Kinect-based people counting method consists of a Kinect depth camera, installed overhead so that it views the scene from top to bottom, and a counting host. As shown in Figures 2 and 3, the above Kinect-based people counting method comprises the following steps:

[0020] (1) Read in the current frame of the depth image through the OpenNI driver, traverse each pixel of the depth image, set a threshold T for segmenting the depth image, and segment the depth image according to this threshold to obtain a threshold segmentation map; then process the threshold segmentation map to eliminate noise and obtain the tracking-object map. The specific steps are as follows:

[0021] (1-1) As described in step (1) above, read in the depth image through the ...
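
The patent text is truncated here, but a minimal sketch of step (1) under stated assumptions may help: Python with OpenCV built against OpenNI2 for reading the Kinect depth stream, an illustrative threshold T and morphological kernel size, and a helper name segment_depth_frame that is not from the patent.

```python
import cv2
import numpy as np

def segment_depth_frame(depth_mm, T=1800, kernel_size=5):
    """Threshold and denoise one depth frame into a tracking-object map.

    depth_mm is a uint16 depth image in millimetres (0 = no reading).
    T is the segmentation threshold (illustrative value): with an overhead
    camera, pixels closer than T are kept as candidate head/body regions.
    """
    valid = depth_mm > 0                                  # discard missing readings
    mask = np.logical_and(valid, depth_mm < T).astype(np.uint8) * 255

    # Morphological opening removes speckle noise; closing fills small holes.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (kernel_size, kernel_size))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    return mask

# Reading frames through the OpenNI driver; this assumes OpenCV was built
# with OpenNI2 support so the Kinect depth stream can be opened directly.
cap = cv2.VideoCapture(cv2.CAP_OPENNI2)
while cap.grab():
    ok, depth = cap.retrieve(flag=cv2.CAP_OPENNI_DEPTH_MAP)  # uint16, millimetres
    if not ok:
        continue
    tracking_object_map = segment_depth_frame(depth)
    # ... subsequent steps: projection, contour extraction, head tracking, counting
```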

Abstract

The invention discloses a Kinect-based method for counting people flow. The steps of the method are: read in the current frame of the depth image and segment it with a threshold to obtain a threshold segmentation map; denoise the threshold segmentation map to obtain a tracking-object map; project the pixels to obtain a tracking-object projection map; extract the contour sequence of the connected regions of tracking objects in the projection map; obtain the connected regions of head tracking objects in the tracking-object map to form the set of track points of the current frame; determine which tracking object each track point belongs to, and update or create that object's track-point set; determine whether the track-point set of each tracking object satisfies the tracking-counting condition and update the in/out counters; and remove the track-point sets of tracking objects that have left the scene. The method is suitable for counting people at various passageways, has high counting accuracy and low equipment complexity, is not affected by changes in scene lighting, shadows, perspective effects, or occlusion, and is suitable for counting people in different environments.
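
The counting step summarized above (checking whether a tracking object's track-point set meets the tracking-counting condition and updating the in/out counters) is commonly realized as a virtual-line crossing test. The following is a minimal sketch under that assumption; the Track and FlowCounter names, the horizontal counting line at line_y, and the exact crossing rule are illustrative choices, not details taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class Track:
    """Track-point set for one tracked head (illustrative structure)."""
    points: list = field(default_factory=list)   # (x, y) centroids, oldest first
    counted: bool = False
    missed: int = 0                               # frames since the last match

class FlowCounter:
    """In/out counting from track-point sets, assuming a horizontal counting
    line at row line_y of the top-down depth image (illustrative rule)."""

    def __init__(self, line_y):
        self.line_y = line_y
        self.count_in = 0
        self.count_out = 0

    def update(self, track: Track):
        # A track meets the tracking-counting condition once it has crossed
        # the counting line; the crossing direction decides in vs. out.
        if track.counted or len(track.points) < 2:
            return
        y_first = track.points[0][1]
        y_last = track.points[-1][1]
        if y_first < self.line_y <= y_last:        # crossed downward
            self.count_in += 1
            track.counted = True
        elif y_first >= self.line_y > y_last:      # crossed upward
            self.count_out += 1
            track.counted = True
```

Track-point sets of objects that have left the scene can then be removed, for example by dropping tracks whose missed counter exceeds a small limit; the patent's exact removal condition is not given in this excerpt.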

Description

Technical field

[0001] The invention relates to a method for tracking and counting video objects, in particular to a method for counting people flow based on Kinect stereo vision. The method uses a Microsoft Kinect 3D stereoscopic camera as the video input to dynamically capture depth images and perform real-time head tracking and counting, and is especially suitable for counting people at the entrances and exits of public places.

Background technique

[0002] Machine-vision-based counting of people at a passageway entrance or exit starts from the scene image captured by a camera: the image is processed to detect the people in it, and the number of people passing through the entrance or exit is counted. For example, by installing cameras at the entrances and exits of an exhibition hall, the number of people in the hall can be estimated in real time, and the degree of congestion in the exhibition...

Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/00; G06K9/00
Inventors: 朱秋煜, 所文俊, 王锦柏, 陈波, 袁赛, 王国威, 徐建忠
Owner SHANGHAI UNIV