
Human body behavior recognition method based on deep learning

A deep learning recognition method, applied in the field of deep learning, that addresses problems such as insufficient training data, high computing and storage overhead, and difficulty in dynamically updating models, achieving good recognition results, faster recognition, and improved accuracy.

Pending Publication Date: 2022-03-22
NANJING HOWSO TECH
Cites: 0 · Cited by: 11


Problems solved by technology

[0009] The purpose of the present invention is to solve the problems of difficult dynamic model updating, insufficient training data, and high computing and storage overhead in traditional device identification methods, and to provide a human behavior recognition method based on deep learning.


Examples


Embodiment

[0071] Embodiment: the deep-learning-based human behavior recognition method comprises the following steps:

[0072] S1 Data collection: collect data and form a data set. The data collected in step S1 covers five actions: running, walking, sitting, standing, and falling, and the data set consists of several video clips. The data set comes partly from NTU-RGB+D, partly from on-site collection, and partly from network collection, for a total of 200 video clips, as shown in figure 2. The data set made up of video clips (video streams) is divided into a training set and a test set; the training set is used to train the model, and the test set is used for testing.
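As a minimal sketch of how such a clip-level split into training and test sets could be done (the directory layout, file extension, and 80/20 ratio below are illustrative assumptions, not details from the patent):

```python
import random
from pathlib import Path

# Hypothetical directory holding the ~200 collected video clips
# (NTU-RGB+D excerpts, on-site recordings, and clips gathered online).
CLIP_DIR = Path("data/clips")           # assumed layout, not from the patent
TRAIN_RATIO = 0.8                       # assumed split ratio

clips = sorted(CLIP_DIR.glob("*.mp4"))  # assumed container format
random.seed(42)
random.shuffle(clips)

split = int(len(clips) * TRAIN_RATIO)
train_set, test_set = clips[:split], clips[split:]

print(f"{len(train_set)} training clips, {len(test_set)} test clips")
```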

[0073] S2 Data set processing: input the data set, perform person detection and tracking on the data in the data set, extract the skeleton information of each sample through human body pose estimation, and perform pose estimation to obtain the pose estimation result, using numbers for the joints ...
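Below is a rough sketch of the per-frame processing order described in S2: detect people, track them across frames, then extract each person's skeleton. The helpers detect_people, update_tracks, and estimate_skeleton are hypothetical placeholders standing in for a YOLO V4 detector, a DeepSort tracker, and an OpenPose-style pose estimator; the patent text does not specify their actual interfaces.

```python
import cv2  # OpenCV is used here only to read frames from a video clip


def detect_people(frame):
    """Hypothetical stand-in for a YOLO V4 person detector.
    Should return a list of bounding boxes (x, y, w, h)."""
    return []  # placeholder


def update_tracks(boxes, frame):
    """Hypothetical stand-in for a DeepSort tracker.
    Should return (track_id, box) pairs so each person keeps a stable ID."""
    return list(enumerate(boxes))  # placeholder: no real re-identification


def estimate_skeleton(frame, box):
    """Hypothetical stand-in for an OpenPose-style pose estimator.
    Should return the numbered joint coordinates of the person in 'box'."""
    return []  # placeholder


def process_clip(path):
    """Turn one video clip into a per-person sequence of skeletons (S2)."""
    sequences = {}  # track_id -> list of per-frame skeletons
    cap = cv2.VideoCapture(path)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        boxes = detect_people(frame)                       # person detection
        for track_id, box in update_tracks(boxes, frame):  # person tracking
            skeleton = estimate_skeleton(frame, box)       # pose estimation
            sequences.setdefault(track_id, []).append(skeleton)
    cap.release()
    return sequences
```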



Abstract

The invention relates to a human behavior recognition method based on deep learning, and the method comprises the following steps. S1, data collection: collect data and form a data set. S2, data set processing: input the data set, perform person detection and tracking on the data in the data set, extract skeleton information from each sample through human body posture estimation, and perform posture estimation to obtain a posture estimation result. S3, dangerous behavior analysis: train and construct an ST-GCN recognition model using the data set, input the posture estimation result into the ST-GCN recognition model for dangerous behavior analysis and recognition, obtain a recognition result, and output the recognition result. According to the method, human body detection is carried out using the YOLO V4 target detection algorithm, the human body is tracked with the DeepSort tracking algorithm, skeleton joint points are extracted using OpenPose, and finally human behavior recognition is carried out on the skeleton sequence with an ST-GCN recognition model.
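As an illustration of the kind of spatial graph convolution an ST-GCN recognition model applies to the extracted skeleton sequence, here is a minimal PyTorch sketch; the tensor layout (batch, channels, frames, joints), the 18-joint skeleton, the identity adjacency matrix, and the five-class head are assumptions for illustration rather than details taken from the patent.

```python
import torch
import torch.nn as nn


class SpatialGraphConv(nn.Module):
    """One spatial graph-convolution step over a skeleton sequence.
    Input x has shape (N, C_in, T, V) = (batch, channels, frames, joints)."""

    def __init__(self, in_channels, out_channels, adjacency):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size=1)
        # Joint adjacency matrix A of shape (V, V); registered as a buffer
        # so it moves with the module but is not trained here.
        self.register_buffer("A", adjacency)

    def forward(self, x):
        x = self.conv(x)                                    # mix channels per joint
        return torch.einsum("nctv,vw->nctw", x, self.A)     # aggregate over neighbouring joints


# Example with assumed sizes: 18 joints (OpenPose-style), 64 frames,
# 2D joint coordinates as input channels, and 5 action classes
# (running, walking, sitting, standing, falling).
V, T, C = 18, 64, 2
A = torch.eye(V)                                  # placeholder adjacency, identity for illustration
layer = SpatialGraphConv(C, 64, A)
x = torch.randn(1, C, T, V)                       # one skeleton sequence
features = layer(x)                               # (1, 64, T, V)
logits = nn.Linear(64, 5)(features.mean(dim=(2, 3)))  # pool over frames/joints, then classify
print(logits.shape)                               # torch.Size([1, 5])
```

In a full ST-GCN, several such spatial units are interleaved with temporal convolutions over the frame axis, and the adjacency matrix encodes the actual connectivity of the skeleton joints rather than an identity placeholder.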

Description

technical field

[0001] The invention relates to the technical field of deep learning, and in particular to a method for recognizing human behavior based on deep learning.

Background technique

[0002] Events arising from social security, school safety, road traffic, and natural or man-made disasters have caused huge losses of life and property, so detecting, reporting, and handling such events has become particularly important. Relying on manpower alone for these tasks can no longer meet people's needs; technical means are required to assist in event detection, alarming, and related work, reducing people's workload while improving the efficiency of incident handling. At the same time, government departments attach great importance to such incidents and require the use of "human defense, physical defense and technical defense" to handle or even prevent them efficiently.

[0003] With the development of...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06V40/20G06V20/40G06V10/40G06V10/75G06V10/774G06K9/62G06T7/246G06T7/277G06N3/04G06N3/08
CPCG06T7/248G06T7/277G06N3/08G06T2207/10016G06T2207/20081G06T2207/20084G06T2207/30196G06N3/045G06F18/22G06F18/214
Inventor 王计斌陈大龙
Owner NANJING HOWSO TECH