
Multi-stream fusion-based skeleton graph human body behavior identification method and system

A behavior recognition method and system, applied in the field of behavior recognition, that addresses problems such as the limited motion information carried by joint points, reduced behavior recognition accuracy, and difficulty in handling a wide variety of behaviors, with the effects of improving long-distance spatial perception, predicting action categories more accurately, and enhancing motion features.

Pending Publication Date: 2022-07-05
XI AN JIAOTONG UNIV

AI Technical Summary

Problems solved by technology

If the model only attends to adjacent joints, joints that are far apart in the skeleton graph, such as the two hands, are hard to relate; the algorithm then struggles to capture long-distance dependencies, which degrades the accuracy of recognizing some behaviors.
[0006] The second point is that the graph convolution structure of each layer in this method is essentially the same, the stacked network is deep, and the features of every layer and channel are treated equally, so the network lacks the flexibility to focus adaptively on informative features. With a fixed network it is difficult to attend to the different cues that distinguish different actions, and difficult to obtain optimal modeling for many types of behavior.
[0007] The third point is that this method uses only the joint-point information of the human body in modeling and experiments. Although spatial and temporal connections are considered when constructing the spatio-temporal skeleton graph, the motion information expressed by joint points alone is still limited, so information may be missing.

Method used



Examples


Embodiment

[0100] A method for human behavior recognition based on multi-stream fusion, comprising the following steps:

[0101] S1. Extract four different data streams from the video skeleton data: the joint stream, the bone stream, the joint motion stream and the bone motion stream. The specific workflow is as follows:

[0102] (1.1) Use one public skeleton-point data set and one public RGB data set;

[0103] (1.2) For the RGB data set, use the public OpenPose toolbox to extract the 18 joint points of the human body in each frame, represent them as coordinates, and pack them into a data format that matches the network input;

[0104] (1.3) For the skeleton data obtained in (1.1) and (1.2), apply four different preprocessing methods to split the data into four different data streams, as shown in Figure 6: the joint stream, the bone stream, the joint motion stream and the bone motion stream. A minimal sketch of this derivation is given below.
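As an illustration of step (1.3), the sketch below derives the bone and motion streams from joint coordinates stored in the common (C, T, V, M) tensor layout (coordinate channels, frames, joints, persons). The tensor layout, the NumPy implementation and the bone-pair list are assumptions made for illustration only; the patent text does not disclose these specifics here.

import numpy as np

# Illustrative (hypothetical) child-parent links for an 18-joint,
# OpenPose-style skeleton: each (child, parent) pair defines one bone vector.
BONE_PAIRS = [
    (1, 0), (2, 1), (3, 2), (4, 3),       # trunk and right arm (example ordering)
    (5, 1), (6, 5), (7, 6),               # left arm
    (8, 2), (9, 8), (10, 9),              # right leg
    (11, 5), (12, 11), (13, 12),          # left leg
    (14, 0), (15, 0), (16, 14), (17, 15), # head
]

def four_streams(joints):
    """Derive the four data streams from joint coordinates.

    joints: array of shape (C, T, V, M) holding C coordinate channels,
    T frames, V joints and M persons (an assumed, commonly used layout).
    Returns a dict with the joint, bone, joint-motion and bone-motion streams.
    """
    bones = np.zeros_like(joints)
    for child, parent in BONE_PAIRS:
        # A bone is the vector from the parent joint to the child joint.
        bones[:, :, child, :] = joints[:, :, child, :] - joints[:, :, parent, :]

    joint_motion = np.zeros_like(joints)
    bone_motion = np.zeros_like(bones)
    # Motion streams are frame-to-frame differences of the static streams.
    joint_motion[:, :-1] = joints[:, 1:] - joints[:, :-1]
    bone_motion[:, :-1] = bones[:, 1:] - bones[:, :-1]

    return {
        "joint": joints,
        "bone": bones,
        "joint_motion": joint_motion,
        "bone_motion": bone_motion,
    }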

[0105] S2. Process the acquired data streams and divide them into a training set and a ...



Abstract

The invention discloses a skeleton-graph human body behavior identification method and system based on multi-stream fusion. The method extracts four different data streams from video skeleton data and trains a network model on each stream, obtaining four different trained models. With human skeleton-point data as input, the four data streams are processed for model training so that the network becomes more sensitive to the channel information that expresses different motions, which improves the efficiency of human behavior recognition. Motion features are enhanced, and the behavior recognition model is trained by stacking a multi-layer spatio-temporal graph convolutional network structure. Finally, the results of the four trained models are fused so that their outputs reinforce one another, allowing behavior action categories to be predicted more accurately.
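The final fusion step described above can be illustrated with a short sketch: assuming each of the four trained models emits softmax class scores for the same samples, a weighted sum of the per-stream scores is taken before the arg-max. The NumPy form, the function name and the equal default weights are assumptions; the abstract does not specify fusion coefficients.

import numpy as np

def fuse_four_stream_scores(scores, weights=None):
    """Late fusion of per-stream classification scores.

    scores: dict mapping a stream name ("joint", "bone", "joint_motion",
    "bone_motion") to an array of shape (num_samples, num_classes) holding
    that stream's softmax output.
    weights: optional dict of per-stream fusion coefficients (illustrative;
    equal weights are used by default).
    Returns the predicted action category for each sample.
    """
    if weights is None:
        weights = {name: 1.0 for name in scores}
    # Weighted sum of the four score matrices, then arg-max over classes.
    fused = sum(weights[name] * scores[name] for name in scores)
    return fused.argmax(axis=1)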

Description

Technical field

[0001] The invention belongs to the technical field of behavior recognition, and in particular relates to a method and system for recognizing human behavior from a skeleton graph based on multi-stream fusion.

Background technique

[0002] Humans exhibit a wide variety of behaviors and actions in daily life, and these contain rich information. With the advent of the big-data era, massive numbers of pictures and videos have become the main carrier of information dissemination, and understanding human behavior has become an important problem in computer vision. Behavior recognition technology can be applied to human-computer interaction, intelligent monitoring, anomaly detection and other fields, and has strong application value and research significance.

[0003] Compared with RGB data, a skeleton-point data sequence gives a clearer representation of appearance and expresses the structure of the human body more intuitively. At the same time, the movement of huma...

Claims


Application Information

IPC(8): G06V40/20, G06N3/04, G06N3/08
CPC: G06N3/08, G06N3/045
Inventors: 田智强, 王晨宇, 岳如靖, 杜少毅
Owner: XI AN JIAOTONG UNIV