Human body action recognition method based on a TP-STG framework

A human action recognition method based on TP-STG technology, applied in the field of character and pattern recognition, instruments, and computer components. It addresses the problems of poor human action recognition performance and large errors, prevents early divergence and instability, and reduces prediction loss.

Active Publication Date: 2019-03-19
CHINA UNIV OF PETROLEUM (EAST CHINA)
Cites: 4 · Cited by: 50

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to provide a human action recognition method based on the TP-STG framework, which solves the problems of poor human action recognition performance and large errors in complex scenes in the prior art.




Embodiment Construction

[0074] The technical solutions of the present invention will be further described below in conjunction with the drawings and embodiments.

[0075] A human action recognition method based on the TP-STG framework is provided. Figure 1 shows the structural flow diagram of the human action recognition method based on the TP-STG framework of the present invention. The method includes:

[0076] (S100) Using video information as input, performing feature extraction, adding prior knowledge to the SVM classifier, and proposing posterior discriminant criteria to remove non-human objects;

[0077] (S200) Segmenting the person target through the target positioning and detection algorithm, outputting it in the form of a target frame and coordinate information, and using feature selection to provide input data for human body key point detection;

[0078] (S300) Using the improved posture recognition algorithm to perform body part location and correlation degree analysis to detect all key point information of the human body and form a key point sequence;
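These steps, together with the spatio-temporal graph convolution and Softmax classification described in the abstract, form a single detection, pose estimation, and graph-classification pipeline. The following is a minimal Python sketch of that flow; the detector, pose estimator, and graph classifier are passed in as hypothetical callables, and the confidence threshold standing in for the SVM prior knowledge is an illustrative assumption, not a value taken from the patent.

```python
# Minimal sketch of the TP-STG flow: (S100) prior-informed filtering,
# (S200) target detection and segmentation, (S300) key point extraction,
# followed by spatio-temporal graph classification as described in the abstract.
# The detector, pose_estimator, and st_gcn_classifier callables and the
# prior_threshold value are illustrative assumptions, not taken from the patent.
from dataclasses import dataclass
from typing import Callable, List, Tuple

import numpy as np


@dataclass
class Detection:
    box: Tuple[float, float, float, float]  # target frame (x1, y1, x2, y2)
    score: float                            # SVM / posterior confidence


def remove_non_person_targets(detections: List[Detection],
                              prior_threshold: float = 0.5) -> List[Detection]:
    """(S100) Keep only detections that pass the posterior discriminant criterion;
    the threshold stands in for the prior knowledge added to the SVM classifier."""
    return [d for d in detections if d.score >= prior_threshold]


def recognize_actions(video_frames: List[np.ndarray],
                      detector: Callable,
                      pose_estimator: Callable,
                      st_gcn_classifier: Callable) -> List[str]:
    """Run the full pipeline on a video clip and return predicted action labels."""
    keypoint_sequence = []
    for frame in video_frames:
        persons = remove_non_person_targets(detector(frame))               # (S100)-(S200)
        frame_keypoints = [pose_estimator(frame, p.box) for p in persons]  # (S300)
        keypoint_sequence.append(frame_keypoints)
    # Final stage (see abstract): multi-layer spatio-temporal graph
    # convolution over the key point sequence plus Softmax classification.
    return st_gcn_classifier(keypoint_sequence)
```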



Abstract

The invention discloses a human body action recognition method based on a TP-STG framework, which comprises the following steps: taking video information as input, adding prior knowledge into an SVM classifier, and providing a posterior discrimination criterion to remove non-person targets; segmenting the person target through a target positioning and detection algorithm, outputting it in the form of a target frame and coordinate information, and providing input data for human body key point detection; utilizing an improved posture recognition algorithm to carry out body part positioning and correlation degree analysis so as to extract all human body key point information and form a key point sequence; and constructing a spatio-temporal graph on the key point sequence through an action recognition algorithm, applying multi-layer spatio-temporal graph convolution operations, and carrying out action classification through a Softmax classifier, thereby achieving human body action recognition in complex scenes. The method is the first to incorporate the actual scene of an offshore platform, and the proposed TP-STG framework is the first attempt to recognize worker activities on an offshore drilling platform by combining target detection, posture recognition, and spatio-temporal graph convolution.
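The classification stage of the abstract can be illustrated with a small PyTorch sketch of one spatio-temporal graph convolution block followed by Softmax classification. This is a generic ST-GCN-style layer, not the patent's exact network: the skeleton adjacency matrix, layer widths, temporal kernel size, and class count are all illustrative assumptions.

```python
# Sketch of a spatio-temporal graph convolution block and Softmax classifier
# operating on a key point sequence shaped (batch, channels, frames, joints).
# Adjacency matrix, layer widths, temporal kernel, and class count are
# illustrative assumptions, not values specified by the patent.
import torch
import torch.nn as nn
import torch.nn.functional as F


class STGCNBlock(nn.Module):
    def __init__(self, in_channels, out_channels, adjacency):
        super().__init__()
        # Normalized skeleton adjacency A (V x V) over the key point graph
        self.register_buffer("A", adjacency)
        self.spatial = nn.Conv2d(in_channels, out_channels, kernel_size=1)
        # Temporal convolution over the frame axis (kernel 9 is a common choice)
        self.temporal = nn.Conv2d(out_channels, out_channels,
                                  kernel_size=(9, 1), padding=(4, 0))

    def forward(self, x):                 # x: (N, C, T, V)
        x = torch.einsum("nctv,vw->nctw", self.spatial(x), self.A)  # graph conv
        return F.relu(self.temporal(x))


class ActionClassifier(nn.Module):
    def __init__(self, adjacency, num_classes=10, in_channels=3):
        super().__init__()
        self.block1 = STGCNBlock(in_channels, 64, adjacency)
        self.block2 = STGCNBlock(64, 128, adjacency)
        self.fc = nn.Linear(128, num_classes)

    def forward(self, x):                 # x: (N, C, T, V) key point sequence
        x = self.block2(self.block1(x))
        x = x.mean(dim=[2, 3])            # global pooling over time and joints
        return F.softmax(self.fc(x), dim=1)   # action class probabilities
```

In use, the key point sequence produced by the detection and pose estimation stages would be stacked into a (batch, channels, frames, joints) tensor and passed to ActionClassifier to obtain per-class action probabilities.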

Description

Technical field

[0001] The invention belongs to the field of computer vision and image processing, and relates to a human action recognition method based on a TP-STG framework.

Background technique

[0002] With the popularization and wide application of surveillance cameras, massive video data has brought enormous pressure to manual recognition. When video data is analyzed and judged manually, the manpower, experience, and analytical capability of surveillance personnel become a bottleneck that restricts the overall effectiveness of intelligent behavior discrimination. In recent years, with the continuous advancement of research, human action recognition has made some progress. The three most common traditional methods are template matching, three-dimensional analysis, and time-series analysis, but they are computationally expensive, susceptible to noise interference, lacking in robustness, and give insufficient consideration to the integrity of actions...


Application Information

IPC(8): G06K9/00, G06K9/62
CPC: G06V40/20, G06F18/2411, Y02T10/40
Inventor: 宫法明, 马玉辉, 宫文娟
Owner: CHINA UNIV OF PETROLEUM (EAST CHINA)