The invention discloses a human body behavior identification method and system based on attention perception and a tree skeleton point structure

A skeleton point and attention technology, applied in the field of robot vision and human-computer interaction, which solves problems such as low algorithm efficiency, unsuitability for practical application, and inconvenient maintenance and improvement, and achieves the effects of reducing interference and improving accuracy and efficiency

Active Publication Date: 2019-04-12
深圳市感动智能科技有限公司 +1

AI Technical Summary

Problems solved by technology

However, networks based on the LSTM structure have high complexity. Although adding an attention mechanism improves detection accuracy, the algorithm efficiency remains low, which makes subsequent maintenance and improvement inconvenient and is unsuitable for deployment in real-world scenarios.



Examples


Embodiment Construction

[0029] In order to make the above objects, features and advantages of the present invention clearer and easier to understand, the present invention is further described below with reference to specific embodiments and the accompanying drawings.

[0030] As shown in figure 1, the data reconstruction flow based on the three-way tree traversal rule of the present invention includes the following steps:

[0031] Step 1, input the sequence of human skeleton points in the training set.

[0032] In graph theory, a tree is a connected, acyclic undirected graph. Each frame of a sample sequence contains N skeleton points, which are regarded as the nodes of the tree. The set V of these nodes is defined as:

[0033] V = {v_i | i = 1, 2, ..., N}

[0034] Step 2, traverse the skeleton point set V using the depth-first traversal method.

[0035] From the skeleton point set V obtained in step 1, the depth-first traversal method is used to traverse the set and store the resulting spatial relationship as α, and the inverse depth...
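Although the paragraph above is truncated, the core of step 2 is turning the unordered node set V into an ordered joint sequence by depth-first traversal. The following is a minimal sketch of that idea only, assuming a hypothetical five-joint toy skeleton; depth_first_order and toy_skeleton are illustrative names, not identifiers from the patent, and the patent's actual three-way traversal rule is not reproduced here.

```python
# A minimal sketch (not the patent's exact rule): depth-first traversal of a
# skeleton tree, turning an unordered joint set into an ordered sequence.

def depth_first_order(adjacency, root=0):
    """Return joint indices in depth-first order starting from `root`."""
    order, stack, visited = [], [root], set()
    while stack:
        node = stack.pop()
        if node in visited:
            continue
        visited.add(node)
        order.append(node)
        # Push children in reverse so lower-indexed joints are visited first.
        stack.extend(sorted(adjacency.get(node, []), reverse=True))
    return order

# Hypothetical toy skeleton: 0 = hip, 1 = spine, 2 = head, 3 = left leg, 4 = right leg.
toy_skeleton = {0: [1, 3, 4], 1: [2]}

alpha = depth_first_order(toy_skeleton, root=0)
print(alpha)  # [0, 1, 2, 3, 4] -- the spatial order stored as α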



Abstract

The invention discloses a human body behavior identification method and system based on attention perception and a tree skeleton point structure. The method comprises the following steps: 1) inputting skeleton point information of all behavior samples in a training set; 2) making the number of frames of each sample consistent by adding zero-padding frames; 3) reconstructing the disordered skeleton points by using a three-way tree traversal rule; 4) performing normalization processing on the Laplacian matrix of the feature map obtained after reconstruction; 5) constructing an attention perception network and a main body network; 6) connecting the attention perception network with the main body network in a hierarchical manner; 7) inputting the reconstructed feature map into the main body network and the attention perception network respectively to train the behavior recognition network model; and 8) carrying out behavior recognition by utilizing the trained behavior recognition network model.
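As a rough illustration of steps 2) and 4) above, the sketch below zero-pads each sample to a common frame count and computes a normalized Laplacian for a joint adjacency matrix. The symmetric normalization I - D^{-1/2} A D^{-1/2} is an assumption (the abstract does not specify which normalization is used), and the 3-joint chain, zero_pad_frames, and normalized_laplacian are hypothetical examples, not the patent's actual skeleton or code.

```python
import numpy as np

def zero_pad_frames(sample, target_frames):
    """Pad a (frames, joints, coords) array with all-zero frames (step 2)."""
    frames, joints, coords = sample.shape
    padded = np.zeros((target_frames, joints, coords), dtype=sample.dtype)
    padded[:frames] = sample
    return padded

def normalized_laplacian(adjacency):
    """Symmetric normalized Laplacian I - D^{-1/2} A D^{-1/2} (one reading of step 4)."""
    degree = adjacency.sum(axis=1)
    d_inv_sqrt = np.zeros_like(degree)
    nonzero = degree > 0
    d_inv_sqrt[nonzero] = degree[nonzero] ** -0.5
    d_mat = np.diag(d_inv_sqrt)
    return np.eye(adjacency.shape[0]) - d_mat @ adjacency @ d_mat

# Toy usage with a hypothetical 3-joint chain, not the patent's skeleton.
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
sample = np.random.rand(40, 3, 3)       # 40 frames, 3 joints, (x, y, z)
padded = zero_pad_frames(sample, 60)    # every sample padded to 60 frames
lap = normalized_laplacian(adj)         # graph input for the two networks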

Description

technical field [0001] The invention belongs to the field of robot vision technology and human-computer interaction, and specifically relates to a human behavior recognition method and system based on attention perception and a tree-shaped skeleton point structure. By reconstructing the disordered skeleton points into a tree shape, behavior patterns can be better described and expressed; an attention network is used to describe the importance of different skeleton points and to provide suitable prior conditions for the classification of human behavior, which can further reduce the classification processing time of human behavior recognition and improve the accuracy of subsequent behavior recognition. Background technique [0002] Behavior recognition belongs to the field of behavior analysis. For a given video sequence containing a certain type of motion, the video sequence is tagged according to the type of motion, which can be applied to human-computer interaction, intell...

Claims


Application Information

Patent Type & Authority: Applications (China)
IPC (8): G06K9/00, G06K9/62
CPC: G06V40/20, G06F18/214, Y02D10/00
Inventor: 丁润伟, 刘畅
Owner: 深圳市感动智能科技有限公司