Human body action recognition method in monitoring based on deep learning and posture estimation

A technology for human action recognition and posture estimation, applied in the fields of character and pattern recognition, computing, and computer components, which addresses the problem that the original monitoring methods are time-consuming, labor-intensive, and inefficient, and achieves the effect of improving efficiency and saving costs.

Active Publication Date: 2019-09-10
UNIV OF ELECTRONICS SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present invention is to address the above-mentioned deficiencies in the prior art by providing a human body action recognition method in monitoring based on deep learning and posture estimation, so as to solve the problem that the original monitoring method is time-consuming, laborious, and inefficient.




Embodiment Construction

[0042] The specific embodiments of the present invention are described below so that those skilled in the art can understand the present invention, but it should be clear that the present invention is not limited to the scope of the specific embodiments. For those of ordinary skill in the art, various changes that remain within the spirit and scope of the present invention as defined and determined by the appended claims are obvious, and all inventions and creations that make use of the concept of the present invention fall within the scope of protection.

[0043] According to an embodiment of the present application, and with reference to figure 1, the human body action recognition method in monitoring based on deep learning and posture estimation of this scheme includes:

[0044] S1. Construct a multi-stream action recognition model based on relative-part joint feature representation;

[0045] S2. Preprocess the human body skeleton action data and convert it into the relative-part joint feature ...
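The exact "relative part" joint feature definition and the multi-stream network architecture are not spelled out in the text quoted here, so the following Python/PyTorch sketch is only one plausible illustration of steps S1 and S2: the PARTS grouping, the GRU-based streams, the hidden size, and the class count are assumptions made for the example, not the claimed design.

```python
# Hedged sketch only: part grouping, feature definition and network layout
# are illustrative assumptions, not the patent's disclosed architecture.
import torch
import torch.nn as nn

# Hypothetical grouping of 17 COCO-style joints into body parts.
PARTS = {
    "torso": [0, 5, 6, 11, 12],
    "left_arm": [5, 7, 9],
    "right_arm": [6, 8, 10],
    "left_leg": [11, 13, 15],
    "right_leg": [12, 14, 16],
}

def relative_part_features(skeleton):
    """skeleton: (T, J, 2) tensor of joint coordinates per frame.
    Returns coordinates expressed relative to the centroid of the part each
    joint belongs to (one plausible reading of 'relative-part joint features')."""
    feats = skeleton.clone()
    for idxs in PARTS.values():
        center = skeleton[:, idxs, :].mean(dim=1, keepdim=True)  # (T, 1, 2)
        feats[:, idxs, :] = skeleton[:, idxs, :] - center
    return feats

class StreamNet(nn.Module):
    """One stream: a small GRU over flattened per-frame joint features."""
    def __init__(self, num_joints=17, hidden=128):
        super().__init__()
        self.gru = nn.GRU(num_joints * 2, hidden, batch_first=True)

    def forward(self, x):              # x: (B, T, J, 2)
        b, t, j, c = x.shape
        out, _ = self.gru(x.reshape(b, t, j * c))
        return out[:, -1]              # last hidden state, (B, hidden)

class MultiStreamRecognizer(nn.Module):
    """Two illustrative streams (absolute joints + relative-part joints),
    fused by concatenation before a linear classifier."""
    def __init__(self, num_classes=10, hidden=128):
        super().__init__()
        self.abs_stream = StreamNet(hidden=hidden)
        self.rel_stream = StreamNet(hidden=hidden)
        self.classifier = nn.Linear(2 * hidden, num_classes)

    def forward(self, skeleton):       # skeleton: (B, T, J, 2)
        rel = torch.stack([relative_part_features(s) for s in skeleton])
        fused = torch.cat([self.abs_stream(skeleton), self.rel_stream(rel)], dim=1)
        return self.classifier(fused)

# Example: a batch of 4 clips, 30 frames, 17 joints, (x, y) coordinates.
logits = MultiStreamRecognizer()(torch.randn(4, 30, 17, 2))
print(logits.shape)  # torch.Size([4, 10])
```

During training (step S3 of the abstract), such a model would be evaluated over multiple rounds of iteration and the converged model with the highest recognition rate kept as the optimal model.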



Abstract

The invention discloses a human body action recognition method in monitoring based on deep learning and posture estimation. The method comprises the following steps: constructing a multi-stream action recognition model based on relative-part joint feature representation; preprocessing human body skeleton action data and converting it into the relative-part joint feature representation; inputting the converted feature representation into the multi-stream recognition model for training and evaluation, and selecting, after multiple rounds of iteration, the converged model with the highest recognition rate as the optimal model; obtaining a monitoring segment from a real-time surveillance video scene, obtaining the skeleton action sequence of the human body in the segment with a posture estimation algorithm, and preprocessing the skeleton action sequence; performing feature representation conversion on the preprocessed skeleton action sequence; using the optimal model to identify the human body actions in the skeleton action sequence after preprocessing and feature representation conversion, and obtaining an action classification result; and comparing the identified classification result with preset dangerous action categories and returning the comparison result to the monitoring staff.
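As a further hedged illustration of the monitoring-time flow in this abstract (segment, posture estimation, preprocessing, classification, comparison with dangerous categories), the sketch below reuses the MultiStreamRecognizer from the earlier sketch. estimate_skeletons, preprocess, ACTION_NAMES, and DANGEROUS_ACTIONS are hypothetical placeholders, since the patent does not name a specific posture estimation algorithm or the preset dangerous-action categories.

```python
# Hedged sketch of the monitoring-time flow described in the abstract above.
# estimate_skeletons, preprocess, ACTION_NAMES and DANGEROUS_ACTIONS are
# hypothetical placeholders, not APIs or categories taken from the patent.
import torch

ACTION_NAMES = ["walking", "running", "falling", "fighting", "standing"]
DANGEROUS_ACTIONS = {"falling", "fighting"}  # assumed preset categories

def estimate_skeletons(video_segment):
    """Stand-in for a posture estimation algorithm applied to one monitoring
    segment; here it just returns random joints of shape (T, 17, 2)."""
    return torch.randn(len(video_segment), 17, 2)

def preprocess(skeleton):
    """Illustrative preprocessing: center each frame on its mean joint and
    scale to a unit range (the patent's exact preprocessing is not given)."""
    centered = skeleton - skeleton.mean(dim=1, keepdim=True)
    return centered / (centered.abs().max() + 1e-6)

def monitor_segment(video_segment, model):
    """Recognize the action in one surveillance segment with a trained model
    and report whether it falls into a preset dangerous category."""
    skeleton = preprocess(estimate_skeletons(video_segment))
    logits = model(skeleton.unsqueeze(0))          # add a batch dimension
    action = ACTION_NAMES[int(logits.argmax(dim=1))]
    return action, action in DANGEROUS_ACTIONS
```

For example, monitor_segment(frames, MultiStreamRecognizer(num_classes=5)) with the model sketched earlier would return the predicted action label together with a dangerous-action flag, which a real system would then report back to the monitoring staff.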

Description

Technical field

[0001] The invention belongs to the technical field of computer image processing and action recognition, and in particular relates to a human body action recognition method in monitoring based on deep learning and posture estimation.

Background technique

[0002] With the advancement of the "Safe City" plan and the falling cost of surveillance cameras, cameras now cover streets and alleys, monitoring public scenes in real time and providing protection for public safety, but they also generate a large amount of surveillance video data. Most surveillance cameras in China are still monitored by humans in real time: multiple monitoring feeds are gathered in the same monitoring room, and staff watch the live images to judge whether dangerous behavior has occurred. This primitive monitoring method is time-consuming, laborious, and inefficient.

[0003] It is a more efficient monitoring method to use computers to aut...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06K9/00G06K9/62G06N3/04
CPCG06V40/23G06V40/20G06V20/52G06N3/044G06N3/045G06F18/24
Inventor 秦臻张扬丁熠秦志光
Owner UNIV OF ELECTRONICS SCI & TECH OF CHINA