
A two-person interaction behavior identification method based on prior knowledge

A recognition-method technology based on prior knowledge, applied in the field of two-person interactive behavior recognition. It addresses problems such as insufficient recognition accuracy for two-person and even single-person behavior, low recognition accuracy for two-person interactive behavior, and the inability to effectively extract key features, and achieves the effects of greatly reducing redundant information, improving recognition, and keeping power consumption low.

Active Publication Date: 2019-03-08
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

[0004] The LSTM method converts skeleton data into a one-dimensional vector for processing, which destroys the original spatial structure, so the accuracy of two-person and even single-person behavior recognition is insufficient.
[0005] Although the CNN method has powerful feature-extraction ability and has improved the accuracy of behavior recognition, it requires a fixed-size convolution kernel to traverse the data, so it cannot effectively extract key features and carries a high computational cost; when handling two-person tasks it still does not meet the accuracy requirements.
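To make the fixed-kernel traversal concrete, here is a minimal sketch (an illustrative assumption, not code from the patent) that treats a skeleton sequence as a frames-by-joints pseudo-image and applies an ordinary 2D convolution; the kernel mixes whichever joints happen to be adjacent along the tensor axis, not the joints that are physically connected.

```python
# Illustrative sketch of CNN-style skeleton processing; shapes and sizes are examples only.
import torch
import torch.nn as nn

T, V, C = 30, 15, 3                 # frames, joints, coordinate channels (example values)
x = torch.randn(1, C, T, V)         # one skeleton sequence laid out as a C x T x V "image"

conv = nn.Conv2d(in_channels=C, out_channels=64, kernel_size=3, padding=1)
y = conv(x)                         # a fixed 3x3 kernel slides over time and joint index
print(y.shape)                      # torch.Size([1, 64, 30, 15]); neighboring joint indices
                                    # need not be connected joints in the actual skeleton
```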
[0006] The GCN method extends CNN to general graph structures, making data processing and feature extraction freer and more flexible. Skeleton data consists of skeleton points and their connection relationships, which form a graph structure. When GCN processes a skeleton sequence, the joints are generally connected according to the human body's natural skeleton relationships; the computational complexity is low and the accuracy of single-person behavior recognition is very good. However, the two-person interaction task needs to focus on the most important parts of the two people and does not require so much single-person information, so this approach is not well suited to two-person interaction recognition and yields low accuracy on two-person interactive behavior.
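For comparison, the following sketch (again illustrative, not the patent's code) shows the basic spatial graph-convolution step that skeleton GCNs build on: a normalized adjacency matrix restricts aggregation to connected joints, so no fixed-size kernel has to traverse the data. The joint count, edge list, and channel sizes are assumptions.

```python
# Minimal spatial graph convolution over a skeleton graph; all sizes are illustrative.
import numpy as np

V = 15                                               # joints in one skeleton (example)
edges = [(0, 1), (1, 2), (2, 3), (1, 4), (4, 5)]     # a few bone connections (example)

# Normalized adjacency A_hat = D^{-1/2} (A + I) D^{-1/2}
A = np.zeros((V, V))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
A_tilde = A + np.eye(V)
d_inv_sqrt = np.diag(1.0 / np.sqrt(A_tilde.sum(axis=1)))
A_hat = d_inv_sqrt @ A_tilde @ d_inv_sqrt

# One graph-convolution step: aggregate each joint's connected neighbors,
# then map the C_in input channels to C_out output channels with a shared weight.
C_in, C_out = 3, 64
X = np.random.randn(V, C_in)                         # per-joint features, e.g. (x, y, z)
W = np.random.randn(C_in, C_out) * 0.01
X_next = A_hat @ X @ W                               # shape (V, C_out)
```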



Examples


Embodiment Construction

[0035] The present invention will be described in detail below in conjunction with the accompanying drawings and examples.

[0036] Referring to figure 1, the implementation steps of the present invention are as follows:

[0037] Step 1: prepare the structure files and related files of the behavior recognition network ST-GCN.

[0038] 1a) Download the relevant files of the behavior recognition network ST-GCN from the github website, including: the structure files graph.py, tgcn.py and st_gcn.py; the parameter setting files train.yaml and test.yaml; the dataset and label generation code file ntu_gendata.py; the training code files processor.py, recognition.py and io.py; the visualization parameter setting file demo.yaml; and the visualization code file demo.py.

[0039] 1b) Download the two-person interaction dataset from the SBU website. The SBU dataset contains 282 interactive action samples covering 8 types of two-person interactions, with categories such as "handshake", "handing objects",...
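As a sketch of how such a sequence could be brought into the (C, T, V, M) channel/frame/joint/person layout that ST-GCN-style code commonly expects, the function below assumes each line of an SBU skeleton file holds a frame index followed by two persons × 15 joints × 3 normalized coordinates, comma-separated; the format assumption and file path should be checked against the downloaded data.

```python
# Hedged sketch of loading one SBU skeleton sequence; the line format is an assumption.
import numpy as np

def load_sbu_sequence(path, num_joints=15, num_persons=2):
    """Read one skeleton file and return an array of shape (C, T, V, M)."""
    frames = []
    with open(path) as f:
        for line in f:
            values = [float(v) for v in line.strip().split(",") if v]
            coords = values[1:]                                  # drop the frame index
            frames.append(np.array(coords).reshape(num_persons, num_joints, 3))
    data = np.stack(frames)                                      # (T, M, V, C)
    return data.transpose(3, 0, 2, 1)                            # (C, T, V, M)

# Hypothetical usage; the actual file layout of the SBU download may differ.
# x = load_sbu_sequence("SBU/s01s02/01/001/skeleton_pos.txt")
# print(x.shape)    # e.g. (3, T, 15, 2)
```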



Abstract

The invention discloses a two-person interaction behavior identification method based on prior knowledge, which mainly solves the problem that the prior art cannot accurately identify two-person interaction behavior. The implementation scheme comprises: 1) preparing the structure files and related files of the basic behavior recognition network ST-GCN; 2) establishing a prior-knowledge connection relationship for each type of interactive action, and modifying the network structure file and the train parameter file according to this connection relationship; 3) using the modified files to train the two-person interactive behavior recognition network to obtain a trained model; 4) using the trained model to recognize existing data, data extracted by Kinect, or data collected by openpose. The method improves the recognition accuracy of two-person interaction behavior, has strong adaptability and good real-time performance, and can be used for video monitoring and video analysis.
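To make step 2 of the scheme concrete, the sketch below shows one possible way to express a prior-knowledge connection relationship as extra cross-person edges in a two-person skeleton adjacency, e.g. linking the two hand joints for a handshake-like interaction. The joint indices and edge lists are hypothetical and do not reproduce the repository's graph.py or the patent's exact connections.

```python
# Hypothetical two-person skeleton graph with prior-knowledge cross-person edges.
# Joint indices and edges are illustrative assumptions only.
import numpy as np

NUM_JOINTS = 15                                           # joints per person (SBU-style count)
intra_person = [(0, 1), (1, 2), (2, 3), (1, 4),
                (4, 5), (5, 6), (1, 7), (7, 8), (8, 9)]   # partial natural skeleton (example)

RIGHT_HAND = 6                                            # assumed index of a hand joint
prior_edges = [(RIGHT_HAND, NUM_JOINTS + RIGHT_HAND)]     # connect the two people's hands

def build_adjacency(num_joints=NUM_JOINTS):
    """Stack two single-person skeletons and add the prior-knowledge edges."""
    V = 2 * num_joints                                    # nodes 0-14: person 1, 15-29: person 2
    A = np.zeros((V, V))
    for i, j in intra_person:                             # natural bones of both persons
        for off in (0, num_joints):
            A[i + off, j + off] = A[j + off, i + off] = 1.0
    for i, j in prior_edges:                              # cross-person prior-knowledge links
        A[i, j] = A[j, i] = 1.0
    return A

A = build_adjacency()
print(A.shape)    # (30, 30)
```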

Description

technical field

[0001] The invention belongs to the technical field of pattern recognition, mainly relates to the recognition of two-person interactive behavior, and can be used for the classification and detection of two-person actions in videos.

Background technique

[0002] At present, behavior recognition methods based on deep learning have developed rapidly, but they have certain limitations for two-person interaction. Unlike a human observer, who focuses directly on the most important parts of an action, the general method analyzes each person individually and then performs recognition, which is complex and includes a lot of redundant information. This makes it necessary to introduce human prior knowledge to guide two-person interactive action recognition.

[0003] Behavior recognition is a very important issue in the field of video analysis and detection, and has huge application prospects. Human skeleton data has great advantages in behavior recognition, and it can not be i...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06V40/20, G06V20/40, G06N3/045, G06F18/214, G06F18/24
Inventor: 谢雪梅, 陈建宇, 石光明, 李佳楠, 金楷
Owner: XIDIAN UNIV