
Multi-person tracking method and system based on deep learning

A deep-learning technology for multi-person video, applied in the field of multi-target tracking. It addresses problems such as the inability to track fully occluded people, poor implementation flexibility, and loss of observation information, with the effects of ensuring an immersive experience, high accuracy, and a reduced amount of calculation.

Active Publication Date: 2021-09-07
SHANDONG UNIV

AI Technical Summary

Problems solved by technology

[0008] However, the inventors found that when a user is completely occluded for a long time under a single camera and a single viewing angle, the observation information is lost. The above-mentioned method therefore cannot track an occluded person well under complete, long-term occlusion. Likewise, the detection method for occluded targets based on multi-sensory data cues must be implemented with the help of mobile-phone gyroscope sensor data and Kinect data, so its implementation flexibility is poor.



Examples


Embodiment 1

[0046] The purpose of this embodiment is to provide a method for tracking multiple people based on deep learning.

[0047] A multi-person tracking method based on deep learning, including:

[0048] Real-time collection of multi-person video data to be tracked;

[0049] Obtain the number of people from the current video frame, and judge whether occlusion occurs in the current frame based on the occlusion occurrence switch and the change in the number of people between adjacent frames;

[0050] If the number of people in the current video frame equals that in the previous video frame, query the state of the occlusion occurrence switch. If it is in the unoccluded state, calculate each person's position directly with the Kinect built-in algorithm; if it is in the occluded state, recognize the shadow of the occluded person with the pre-trained shadow feature model, and solve the occluded person's position from that shadow. If the number of people in the current video frame and the previous video frame is not equal, compa...
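The branching logic of paragraph [0050] can be sketched as follows. This is a minimal illustration, not the patent's implementation: the Kinect built-in position solver and the shadow feature model are stubbed out, and all function and variable names here are hypothetical.

```python
# Sketch of the occlusion-handling branch in paragraph [0050].
# kinect_position() and position_from_shadow() are stubs standing in for
# the Kinect built-in algorithm and the pre-trained shadow feature model.

def kinect_position(person_id):
    """Stub: position from the Kinect built-in skeleton solver."""
    return (0.0, 0.0)

def position_from_shadow(person_id):
    """Stub: position solved from the person's recognized shadow."""
    return (0.0, 0.0)

def track_frame(prev_count, curr_count, occlusion_flag, person_ids):
    """Locate each person in the current frame.

    prev_count / curr_count: people detected in the previous / current frame.
    occlusion_flag: the 'occlusion occurrence switch' from the method.
    Returns (positions, new_occlusion_flag).
    """
    if curr_count == prev_count:
        if not occlusion_flag:
            # Unoccluded state: use the Kinect built-in algorithm directly.
            positions = {p: kinect_position(p) for p in person_ids}
        else:
            # Occlusion ongoing: recover occluded people from their shadows.
            positions = {p: position_from_shadow(p) for p in person_ids}
        return positions, occlusion_flag
    if curr_count < prev_count:
        # Fewer people detected than before: someone became occluded,
        # so turn the occlusion switch on and fall back to the shadow model.
        return {p: position_from_shadow(p) for p in person_ids}, True
    # More people detected: an occluded person reappeared; switch off.
    return {p: kinect_position(p) for p in person_ids}, False
```

The count comparison decides only whether the occlusion switch toggles; the switch itself decides which solver runs when the count is stable.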

Embodiment 2

[0128] The purpose of this embodiment is to provide a multi-person tracking system based on deep learning.

[0129] A multi-person tracking system based on deep learning, including:

[0130] A data acquisition unit, which is used for real-time collection of multi-person video data to be tracked;

[0131] An occlusion judging unit, which is used to obtain the number of people from the current video frame and judge whether occlusion occurs in the current frame based on the occlusion occurrence switch and the change in the number of people between adjacent frames;

[0132] A target tracking unit, which is used to query the state of the occlusion occurrence switch if the number of people in the current video frame equals that of the previous video frame. If it is in the unoccluded state, it calculates each person's position directly with the Kinect built-in algorithm; if it is in the occluded state, it uses the pre-trained shadow feature model to recognize the shadow of...
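The three units of Embodiment 2 can be outlined as a minimal class structure. The excerpt names the units but not their interfaces, so every class, method, and label below is a hypothetical sketch of how they might compose, with the acquisition source and both position solvers stubbed.

```python
# Hypothetical outline of the three units in Embodiment 2.
# Solvers are represented by string labels rather than real Kinect calls.

class DataAcquisitionUnit:
    """Collects multi-person video frames in real time (stubbed as a list)."""
    def __init__(self, frames):
        self._frames = iter(frames)

    def next_frame(self):
        return next(self._frames, None)

class OcclusionJudgingUnit:
    """Judges occlusion from the person-count change and the occlusion switch."""
    def __init__(self):
        self.prev_count = None
        self.occluded = False

    def judge(self, count):
        if self.prev_count is not None and count < self.prev_count:
            self.occluded = True   # someone disappeared: occlusion began
        elif self.prev_count is not None and count > self.prev_count:
            self.occluded = False  # someone reappeared: occlusion ended
        self.prev_count = count
        return self.occluded

class TargetTrackingUnit:
    """Chooses the position solver according to the occlusion state."""
    def solve(self, frame, occluded):
        solver = "shadow-model" if occluded else "kinect-builtin"
        return {pid: solver for pid in frame["people"]}

# Wiring the units together, as the system claim composes them:
acq = DataAcquisitionUnit([{"people": ["a", "b"]}, {"people": ["a"]}])
judge = OcclusionJudgingUnit()
track = TargetTrackingUnit()
results = []
while (frame := acq.next_frame()) is not None:
    occluded = judge.judge(len(frame["people"]))
    results.append(track.solve(frame, occluded))
```

Running this on the two stub frames, the first frame (two people, no count change) is solved with the Kinect label, while the second frame (one person vanished) flips the switch and is solved with the shadow-model label.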



Abstract

The invention provides a multi-person tracking method and system based on deep learning. The method employs a pre-trained shadow feature model to recognize and track the shadow of an occluded user, and solves the occluded user's position from that shadow. At the same time, under a single Kinect device, the positions of both unoccluded and occluded users are calculated from the captured user information, realizing real-time tracking of the positions of multiple users.

Description

Technical field

[0001] The present disclosure belongs to the technical field of multi-target tracking, and in particular relates to a method and system for multi-person tracking based on deep learning.

Background technique

[0002] The statements in this section merely provide background information related to the present disclosure and do not necessarily constitute prior art.

[0003] Object tracking is an important problem in computer vision, and occlusion is a common situation in multi-object tracking. A target may be self-occluded, occluded by stationary objects in the background, or occluded by other moving targets, and the degree of occlusion also varies. How to deal effectively with occlusion, especially severe occlusion, has always been a difficult problem in multi-target tracking. Long-term, total occlusion in particular is one of the most challenging forms of occlusion.

[0004] At present, target tracking methods can be divided into optical-based correlat...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00, G06K9/34, G06K9/46, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/04, G06N3/08, G06F18/24
Inventor: 盖伟, 许春晓, 杨承磊, 鲍西雨, 栾洪秋
Owner: SHANDONG UNIV