Real-time multi-target human body 2D attitude detection system and method

A technology for real-time human pose detection, applied in the fields of computer vision and deep learning, which addresses the problems of inaccurate estimation, unknown person locations and extents, and unsatisfactory detection accuracy.

Inactive Publication Date: 2018-04-06
NORTHEASTERN UNIV

AI Technical Summary

Problems solved by technology

The detection accuracy of existing methods in multi-target scenes is not ideal. First, each picture may contain an unknown number of people, and the locations and extents of these people are unknown.
Second, interaction between people may cause spatial interference: because people touch each other and their joints overlap, methods that first detect each human body may miss detections or suffer occlusion in such cases, leaving incomplete information about the body and producing imprecise estimates.
Third, the time complexity grows with the number of people in the picture, which poses a serious challenge for the real-time performance of the system.




Embodiment Construction

[0044] Specific embodiments of the present invention are described in further detail below in conjunction with the accompanying drawings. The following examples illustrate the present invention but are not intended to limit its scope.

[0045] Figure 1 shows a structural block diagram of the multi-target human body 2D pose real-time detection system of the present invention. The detection system includes: an image acquisition module 1, a real-time processing module 2, and a visual display module 3. The image acquisition module 1 acquires image data; the real-time processing module 2 feeds the image data into a neural network for learning and prediction, and generates human pose information from the resulting heat map of joint positions and the heat map of the direction vector field between joints...
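The first decoding step implied above, turning a joint-position heat map into discrete joint candidates, can be sketched as follows. This is a minimal illustration, not the patent's decoder: `find_joint_peaks` is a hypothetical helper, and the local-maximum rule with a score threshold is an assumption about how candidates are extracted.

```python
import numpy as np

def find_joint_peaks(heatmap, threshold=0.1):
    """Return (row, col, score) candidates for one joint type.

    Assumption: a pixel is a joint candidate if its score exceeds
    `threshold` and is strictly greater than its 4-connected
    neighbours (a simple non-maximum suppression).
    """
    h, w = heatmap.shape
    peaks = []
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            v = heatmap[r, c]
            if (v > threshold
                    and v > heatmap[r - 1, c] and v > heatmap[r + 1, c]
                    and v > heatmap[r, c - 1] and v > heatmap[r, c + 1]):
                peaks.append((r, c, v))
    return peaks
```

In a multi-person scene each joint heat map yields several peaks (one per visible person), which is why a second association step over the direction vector field is needed to decide which candidates belong to the same body.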



Abstract

The invention relates to a real-time multi-target human body 2D pose detection system and method. The system comprises an image acquisition module for acquiring image data; a real-time processing module for feeding the image data into a neural network for learning and prediction, and for generating human pose information from a heat map of joint positions and a heat map of the direction vector field between joints; and a visual display module for presenting the predicted pose information to users by connecting joints with lines. Using deep learning, the system encodes the joint positions and the positions and directions of the bones formed by connected joints, achieving accurate 2D human pose estimation from a single image. Even in complex crowded scenes, the poses of multiple people can be estimated accurately, enabling users to further analyze and mine the poses and to predict subsequent behavior.
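The "direction of bones" encoding above suggests scoring each candidate limb by how well the direction vector field aligns with the segment between two joint candidates. The following is a hedged sketch under that assumption; `paf_limb_score`, the sampling count, and the field layout (separate x/y component arrays) are all illustrative choices, not the patent's specification.

```python
import numpy as np

def paf_limb_score(paf_x, paf_y, p1, p2, n_samples=10):
    """Average alignment of the vector field with the segment p1->p2.

    paf_x, paf_y: HxW arrays holding the x (column) and y (row)
    components of the direction field for one limb type.
    p1, p2: (row, col) coordinates of two joint candidates.
    Returns a score in [-1, 1] if the field holds unit vectors.
    """
    p1 = np.asarray(p1, dtype=float)
    p2 = np.asarray(p2, dtype=float)
    d = p2 - p1
    norm = np.linalg.norm(d)
    if norm < 1e-6:
        return 0.0
    u = d / norm  # unit vector along the candidate limb (row, col)
    scores = []
    for t in np.linspace(0.0, 1.0, n_samples):
        r, c = np.round(p1 + t * d).astype(int)
        # dot product of the field vector with the limb direction
        scores.append(paf_x[r, c] * u[1] + paf_y[r, c] * u[0])
    return float(np.mean(scores))
```

A high score means the field consistently points from p1 toward p2, so the two candidates likely belong to the same person; matching all candidate pairs by this score is what resolves joint ownership in crowded scenes.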

Description

Technical Field

[0001] The invention relates to the fields of computer vision and deep learning, and in particular to a real-time detection system and detection method for multi-target human body 2D poses.

Background

[0002] With the development of computer vision and deep learning technology, human pose estimation remains an increasingly active research field in computer vision, with broad application prospects such as human-computer interaction, intelligent monitoring, athlete training assistance, and video coding. In recent years, driven by these applications, behavior analysis has become a research hotspot in computer vision, robotics, deep learning, machine learning, data mining, and related fields. The purpose of human behavior analysis is to describe, identify, and understand human actions and the interactions between people and between people and their environment. It has a wide range of application backgrounds in intelligent video surveillance, ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00 G06K9/62 G06N3/04 G06N3/08
CPC: G06N3/084 G06V40/103 G06N3/045 G06F18/25 G06F18/214
Inventor 卢绍文王金鑫王克栋郭章程盟盟李鹏琦赵磊刘晓丽丁进良柴天佑
Owner NORTHEASTERN UNIV