A Human Action Recognition Method Based on Lie Group Features and Convolutional Neural Networks

A convolutional neural network and human action recognition technology, applied in character and pattern recognition, instruments, computing, and related fields, which achieves high recognition accuracy, strong robustness, and an accurate and effective description of human actions.

Active Publication Date: 2022-07-01
北京陟锋科技有限公司
Cites: 7 · Cited by: 0

AI Technical Summary

Problems solved by technology

[0008] In view of this, the object of the present invention is to provide a human action recognition method based on Lie group features and a convolutional neural network. The method largely overcomes the interference from changes in the external environment and in human body shape that affects traditional techniques, and remedies the defect that action recognition methods based on traditional Euclidean space cannot model or express the spatial complexity and geometric relationships of human actions. At the same time, the method better handles the similarity between actions and the high variability between classes. In terms of computational cost and recognition performance, using a convolutional neural network to process the features not only learns and classifies them well, but also greatly reduces the computational cost, and the recognition accuracy is high.



Examples


Embodiment approach

[0051] Figure 1 shows the overall framework of the human action recognition method based on Lie group features and convolutional neural networks according to the present invention. As shown in Figure 1, the main work of the recognition method is to obtain the skeleton information of a human motion sequence through the Kinect somatosensory device produced by Microsoft, and to simulate human movement with rigid limb transformations (such as rotations and translations in three-dimensional space). A Lie group skeleton representation of the relative three-dimensional geometric relationships models the human action as a series of curves on the Lie group; then, using the correspondence between Lie groups and Lie algebras (see Figure 3), the logarithmic map is applied to map the curves from the Lie group space to the Lie algebra space. Finally, the Lie group features and the convolutional neural network...
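The patent text does not disclose implementation details, but the logarithmic-map step described above can be sketched as follows. This is an illustrative sketch only, assuming per-frame relative rotation matrices in SO(3) between connected limb segments estimated from Kinect joints; the function and variable names are hypothetical and not taken from the patent.

    # Illustrative sketch (not the patent's exact implementation): mapping a
    # per-frame rotation matrix R in SO(3), e.g. the relative rotation between
    # two connected limb segments, to its Lie algebra so(3) via the
    # logarithmic map, expressed as a 3-D rotation vector (axis * angle).
    import numpy as np

    def so3_log(R: np.ndarray, eps: float = 1e-8) -> np.ndarray:
        """Logarithmic map SO(3) -> so(3), returned as a rotation vector."""
        # Rotation angle from the trace, clipped for numerical safety.
        cos_theta = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
        theta = np.arccos(cos_theta)
        if theta < eps:                      # near-identity rotation
            return np.zeros(3)
        # Unit rotation axis from the skew-symmetric part of R.
        # (Rotations with theta close to pi would need special handling.)
        axis = np.array([R[2, 1] - R[1, 2],
                         R[0, 2] - R[2, 0],
                         R[1, 0] - R[0, 1]]) / (2.0 * np.sin(theta))
        return theta * axis

    # A motion sequence then becomes a curve in the Lie algebra: one vector
    # per frame and per limb pair, which can be stacked into a feature map.
    frames = [np.eye(3)]                              # placeholder rotations
    curve = np.stack([so3_log(R) for R in frames])    # shape: (num_frames, 3)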


Abstract

The invention relates to a human action recognition method based on Lie group features and a convolutional neural network, and belongs to the field of computer pattern recognition. The method includes: S1, data acquisition: the Microsoft Kinect somatosensory device is used to extract human skeleton information and acquire the motion information of the experimenter; S2, Lie group feature extraction: a Lie group skeleton representation based on rigid limb transformations is adopted to simulate the relative three-dimensional geometric relationships of the limbs of the human body, human actions are modeled as a series of curves on the Lie group, and, using the correspondence between Lie groups and Lie algebras, the logarithmic map is applied to map the curves from the Lie group space to the Lie algebra space; S3, feature classification: the Lie group features are integrated with a convolutional neural network, the Lie group features are used to train the network, and the network learns and classifies the Lie group features, thereby realizing human action recognition. The present invention can achieve a good recognition effect.
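Step S3 is described only at this level of detail. As a non-authoritative sketch, the Lie algebra curves from S2 can be stacked into a frames-by-features map and fed to a small convolutional classifier. The input shape (64 frames, 57-dimensional Lie algebra features), layer sizes, and class count below are assumptions for illustration, written in PyTorch, which is not a framework named in the patent.

    # Illustrative sketch of step S3: a small CNN over stacked Lie algebra
    # features. The architecture and all dimensions are assumptions.
    import torch
    import torch.nn as nn

    class LieGroupCNN(nn.Module):
        def __init__(self, num_frames=64, feat_dim=57, num_classes=20):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
            )
            self.classifier = nn.Sequential(
                nn.Flatten(),
                nn.Linear(32 * (num_frames // 4) * (feat_dim // 4), num_classes),
            )

        def forward(self, x):            # x: (batch, 1, num_frames, feat_dim)
            return self.classifier(self.features(x))

    model = LieGroupCNN()
    dummy = torch.randn(8, 1, 64, 57)    # 8 sequences of Lie algebra features
    logits = model(dummy)                # (8, 20) class scores

Because the fully connected layer assumes a fixed input size, the skeleton sequences would in practice be resampled or padded to a fixed number of frames before being stacked into the feature map.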

Description

Technical field

[0001] The invention belongs to the field of computer pattern recognition and relates to a human action recognition method based on Lie group features and a convolutional neural network.

Background technique

[0002] With the rapid development of science and technology, more natural human-computer interaction has become an increasingly urgent need. People increasingly expect computers to think like the human brain, to understand external input signals and everyday human behaviors, and to communicate with people easily and naturally.

[0003] Human action recognition is a practical technology that takes digital images or video streams as its object and obtains human motion information through image processing and automatic recognition. Owing to the variability of human movements, camera motion, changes in light intensity, differences in body types, and differences in environmental conditions, the research on huma...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06V40/20; G06V10/84
CPC: G06V40/20; G06F18/295
Inventors: 蔡林沁, 丁和恩, 陆相羽, 隆涛, 陈思维
Owner: 北京陟锋科技有限公司