Behavior identification method based on local joint point track space-time volume in skeleton sequence

A recognition method involving joint point technology, applied in character and pattern recognition, computer components, and instruments. It addresses the problems of trajectory redundancy, the difficulty of modeling temporal information, and the difficulty of applying depth-based methods to real scenes, and achieves the effect of low cost and high precision.

Active Publication Date: 2019-12-10
HUAQIAO UNIVERSITY


Problems solved by technology

[0010] (1) Skeleton-based behavior recognition methods that rely on depth information are difficult to apply to real scenes, because depth sensors are expensive and their accuracy drops in complex outdoor environments.
[0011] (2) Skeleton recognition methods that compute features from global joint trajectories have difficulty modeling temporal information.
[0012] (3) The iDT method requires dense sampling and tracking of interest points over the human body region, and the large number of sampled points makes the trajectories redundant.
[0016] (2) Because the feature dimension depends on the video length, and videos differ in length, the joint point trajectory length and its feature dimension differ from video to video.



Examples


Embodiment 1

[0118] The present invention uses RGB video data and 2D human skeleton data for behavior recognition. The proposed method follows the classic local-feature behavior recognition pipeline: detection of spatiotemporal interest points, feature extraction, construction of a bag-of-words model, and classification. Specifically, it is divided into four steps: extraction of local joint point trajectory space-time volumes (LJTV), feature extraction, feature encoding, and behavior classification. The overall flow is shown schematically in Figure 2; each step is described in detail below:
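As a minimal, non-authoritative sketch (not the patent's reference implementation), the four steps could be wired together as follows. Here `extract_ljtv` is sketched under Step 1 below, while the feature extractor, per-joint codebooks, encoder, and trained SVM classifier are hypothetical callables standing in for Steps 2 to 4.

```python
import numpy as np

def recognize_behavior(rgb_frames, joints, extract_features, codebooks, encode, classifier):
    """Illustrative four-step flow: LJTV extraction, feature extraction,
    per-joint encoding, and SVM classification."""
    # Step 1: one local joint point trajectory space-time volume (LJTV) per joint
    volumes = extract_ljtv(rgb_frames, joints)
    # Step 2: extract a set of descriptors from each joint's volume
    descriptors_per_joint = [extract_features(v) for v in volumes]
    # Step 3: encode each joint's descriptors against its own codebook, then concatenate
    feature_vector = np.concatenate(
        [encode(d, cb) for d, cb in zip(descriptors_per_joint, codebooks)])
    # Step 4: behavior classification with a trained SVM
    return classifier.predict(feature_vector.reshape(1, -1))[0]
```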

[0119] Step 1, extract the space-time volume of local joint point trajectories:

[0120] A human skeleton contains 15 to 25 joint points, and different datasets provide different numbers of joints, but the algorithm of the present invention is not restricted by the number of joint points.

[0121] The present invention takes a human skeleton with 20 joint points as an example; the structure ...
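As a rough illustration of what extracting such a volume might look like (the 32-pixel patch size and the border clamping are illustrative assumptions, not parameters disclosed by the patent), one could crop a fixed-size patch around each joint in every frame and stack the patches along time:

```python
import numpy as np

def extract_ljtv(rgb_frames, joints, patch_size=32):
    """Crop a square patch around each joint in every frame and stack the
    patches over time, giving one space-time volume per joint.

    rgb_frames: list of T frames, each an (H, W, 3) array
    joints:     (T, n_joints, 2) array of (x, y) joint coordinates
    returns:    list of n_joints volumes of shape (T, patch_size, patch_size, 3)
    """
    T, n_joints, _ = joints.shape
    H, W, _ = rgb_frames[0].shape
    half = patch_size // 2
    volumes = []
    for j in range(n_joints):
        patches = []
        for t in range(T):
            x, y = joints[t, j].astype(int)
            # Clamp the patch so it stays fully inside the frame.
            x0 = int(np.clip(x - half, 0, W - patch_size))
            y0 = int(np.clip(y - half, 0, H - patch_size))
            patches.append(rgb_frames[t][y0:y0 + patch_size, x0:x0 + patch_size])
        volumes.append(np.stack(patches))   # (T, patch_size, patch_size, 3)
    return volumes
```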



Abstract

The invention belongs to the technical field of artificial intelligence, and discloses a behavior recognition method based on the space-time volume of local joint point trajectories in a skeleton sequence. The method comprises the steps of: extracting local joint point trajectory space-time volumes from input RGB video data and skeleton joint point data; extracting image features with a model pre-trained on an RGB video dataset; constructing and encoding a codebook for each different feature of each joint point in the training set, and concatenating the features of the n joint points to form a feature vector; and performing behavior classification and recognition with an SVM classifier. The method fuses manual features and deep learning features, extracts local features using a deep learning method, and the fusion of multiple features achieves a stable and accurate recognition rate. Because the features are extracted from the 2D human skeleton estimated by a pose estimation algorithm together with the RGB video sequence, the cost is low and the precision is high, which is of great significance for application in real scenes.
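A minimal sketch of the codebook construction, per-joint encoding, and SVM classification mentioned in the abstract is given below; the use of k-means, 64 visual words, bag-of-words histograms, and a linear SVM kernel are assumptions made for illustration rather than details taken from the patent.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def build_codebooks(train_descriptors_per_joint, n_words=64):
    """Fit one k-means codebook per joint.
    train_descriptors_per_joint[j]: (N_j, D) array of descriptors pooled from
    all training videos for joint j."""
    return [KMeans(n_clusters=n_words, n_init=10, random_state=0).fit(d)
            for d in train_descriptors_per_joint]

def encode_video(descriptors_per_joint, codebooks):
    """Bag-of-words histogram per joint, concatenated into one feature vector."""
    histograms = []
    for descriptors, codebook in zip(descriptors_per_joint, codebooks):
        words = codebook.predict(descriptors)                  # nearest visual word per descriptor
        hist = np.bincount(words, minlength=codebook.n_clusters).astype(float)
        histograms.append(hist / max(hist.sum(), 1.0))         # L1-normalised histogram
    return np.concatenate(histograms)

# Training and classification (y_train holds the behavior labels):
#   X_train = np.stack([encode_video(v, codebooks) for v in train_descriptors])
#   clf = SVC(kernel='linear').fit(X_train, y_train)
#   prediction = clf.predict(encode_video(test_descriptors, codebooks).reshape(1, -1))
```

Concatenating one normalised histogram per joint keeps the final feature dimension fixed regardless of video length, which speaks to the per-video dimensionality issue raised in paragraph [0016].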

Description

Technical field

[0001] The invention belongs to the technical field of artificial intelligence, and in particular relates to a behavior recognition method based on the space-time volume of local joint point trajectories in a skeleton sequence; specifically, a behavior recognition method based on the space-time volume of local joint point trajectories in RGB and 2D skeleton sequences.

Background technique

[0002] At present, the existing technologies commonly used in the industry are as follows:

[0003] With the development of artificial intelligence technology and increasing investment from government and industry, the artificial intelligence industry is booming and has become a hot spot in scientific research. The popularization of artificial intelligence applications has an increasingly prominent impact on society, with positive effects in people's livelihood fields such as smart transportation, smart homes, and smart medical care. As the core force of the n...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K 9/00, G06K 9/62, G06K 9/46
CPC: G06V 40/20, G06V 10/50, G06F 18/2451, G06F 18/241, G06F 18/253, G06F 18/214
Inventors: 张洪博, 张翼翔, 杜吉祥, 雷庆
Owner: HUAQIAO UNIVERSITY