
People flow counting method on basis of Kinect

A people-flow counting method applied in computation, image data processing, computer components, etc. It addresses problems such as degraded people-counting accuracy, ambiguous image matching, and the inability to obtain human-body region features, achieving the effects of simple equipment and high counting accuracy.

Inactive Publication Date: 2014-01-22
SHANGHAI UNIV

AI Technical Summary

Problems solved by technology

When the human body rotates or is occluded by other objects, its region features cannot be extracted from the video image, and the tracking target is lost.

[0005] Human-body tracking and counting methods based on stereo vision acquired synchronously by multiple cameras track and count using the spatial three-dimensional information of the body. However, owing to the geometric shapes of objects in the scene, noise interference, camera characteristics, and other factors, images of the same subject captured from different viewpoints differ greatly, making it difficult to match images from different viewpoints unambiguously. Part of the spatial information of the 3D scene is therefore lost or erroneous, not all human-body region features can be obtained from the captured video, and this leads to lost tracking targets and reduced people-counting accuracy.



Examples


Embodiment Construction

[0018] The Kinect-based people-flow counting method of the present invention is described in further detail below with reference to the accompanying drawings.

[0019] As shown in Figure 1, the hardware used by the above Kinect-based counting method consists of a Kinect depth camera mounted overhead and a counting host. As shown in Figures 2 and 3, the Kinect-based people counting method comprises the following steps:

[0020] (1) Read in the current frame of the depth image through the OpenNI driver, traverse each pixel in the depth image, set a threshold T for segmenting the depth image, segment the depth image according to this threshold to obtain a threshold segmentation map, and process the segmentation map to eliminate noise, obtaining the tracking-object image. The specific steps are as follows:

[0021] (1-1) As described in step (1) above, read in the depth image through ...


Abstract

The invention discloses a people-flow counting method on the basis of Kinect. The method comprises the following steps: reading in the current frame of a depth image and setting a threshold value to segment the depth image, obtaining a threshold segmentation image; removing noise from the threshold segmentation image to obtain a tracked-object image; projecting the pixels to obtain a tracked-object projection image; acquiring the contour sequence of the tracked-object connected regions from the projection image; acquiring the head tracked-object connected regions from the tracked-object image to form the current frame's track-point set; judging which tracked object each track point belongs to and updating or newly establishing the track-point set of each tracked object; judging whether the track-point set of each tracked object meets the tracking-counting conditions and updating the entrance and exit counters; and removing the track-point sets of tracked objects that have left the scene. The method is suitable for people-flow counting at various passageways, has high counting accuracy and low equipment complexity, is unaffected by illumination changes, shadows, perspective effects, and occlusion in the scene, and is suitable for people-flow counting in different environments.
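The track-point bookkeeping and counter logic described in the abstract can be sketched roughly as below. This is a hypothetical reconstruction under stated assumptions, not the patented algorithm: the nearest-neighbour matching, the `MATCH_RADIUS` and `COUNT_LINE_Y` constants, and the miss-count rule for a tracked object leaving the scene are all illustrative choices.

```python
import math

MATCH_RADIUS = 60    # px: max head displacement between frames (assumed)
COUNT_LINE_Y = 240   # px: virtual entrance/exit line in the image (assumed)

class Track:
    """Track-point set of one tracked object."""
    def __init__(self, point):
        self.points = [point]
        self.missed = 0          # consecutive frames without a match

def dist_to(track, point):
    x, y = track.points[-1]
    return math.hypot(point[0] - x, point[1] - y)

def update_tracks(tracks, frame_points, counters):
    """Assign the current frame's head track points to existing track sets,
    create new sets for unmatched points, and count finished tracks."""
    unmatched = list(frame_points)
    for tr in tracks:
        if not unmatched:
            tr.missed += 1
            continue
        best = min(unmatched, key=lambda p: dist_to(tr, p))
        if dist_to(tr, best) <= MATCH_RADIUS:
            tr.points.append(best)
            tr.missed = 0
            unmatched.remove(best)
        else:
            tr.missed += 1
    tracks += [Track(p) for p in unmatched]   # newly established track sets
    # A tracked object that has left the scene is counted if its track
    # crossed the line in a consistent direction, then its set is removed.
    for tr in [t for t in tracks if t.missed > 5]:
        y0, y1 = tr.points[0][1], tr.points[-1][1]
        if y0 < COUNT_LINE_Y <= y1:
            counters["in"] += 1
        elif y1 < COUNT_LINE_Y <= y0:
            counters["out"] += 1
        tracks.remove(tr)

counters = {"in": 0, "out": 0}
tracks = []
# One person walking downward across the counting line over several frames:
for pt in [(320, 200), (322, 230), (321, 260), (320, 290)]:
    update_tracks(tracks, [pt], counters)
for _ in range(6):                            # person leaves the scene
    update_tracks(tracks, [], counters)
print(counters)   # {'in': 1, 'out': 0}
```

The direction test (first versus last track point relative to the line) is one simple way to realise the "tracking counting conditions" the abstract mentions without spelling out.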

Description

Technical field

[0001] The invention relates to a method for tracking and counting video objects, in particular to a method for counting people flow based on Kinect stereo vision. The method uses a Microsoft Kinect 3D stereoscopic camera as the video input for dynamic capture of depth images, achieving real-time head tracking and counting, and is especially suitable for counting people at the entrances and exits of public places.

Background technique

[0002] Machine-vision-based counting of people at a passageway entrance works on the scene image captured by a camera: the scene image is processed to detect the people in it, and the number of people passing through the entrance is counted. For example, by installing a camera at the entrance of an exhibition hall, the number of people inside can be estimated in real time, along with the degree of congestion in ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/00; G06K9/00
Inventors: 朱秋煜, 所文俊, 王锦柏, 陈波, 袁赛, 王国威, 徐建忠
Owner SHANGHAI UNIV