
A human action recognition method based on multi-sensor data fusion

A human action recognition and data fusion technology, applied in character and pattern recognition, instruments, and computing, that addresses problems such as decision errors made by a single classifier.

Active Publication Date: 2021-06-25
DALIAN UNIV OF TECH
Cites: 3, Cited by: 0

AI Technical Summary

Problems solved by technology

[0004] The present invention mainly addresses technical problems of the prior art, namely that a single classification algorithm cannot be applied to the recognition of all human actions and that a single classifier makes certain decision errors. It proposes a human action recognition method based on multi-sensor data fusion that effectively overcomes the drawbacks of a single classifier in the recognition process; the recognition results obtained with the layered fusion model proposed by the present invention are markedly better than those of traditional recognition methods.
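Although the patent's layered fusion model is not detailed in this excerpt, the core idea it invokes, combining the decisions of several per-sensor classifiers so that no single classifier's error dominates, can be illustrated with a minimal sketch. Majority voting below is an illustrative stand-in, not the patented fusion scheme; the function name and array shapes are assumptions.

```python
# Minimal sketch of decision-level fusion across per-sensor classifiers.
# Majority voting is used here for illustration only; the patent's actual
# layered fusion model is not described in this excerpt.
import numpy as np

def fuse_predictions(per_sensor_preds: np.ndarray) -> np.ndarray:
    """per_sensor_preds: (n_sensors, n_samples) integer class labels.
    Returns one fused label per sample by majority vote."""
    n_sensors, n_samples = per_sensor_preds.shape
    fused = np.empty(n_samples, dtype=per_sensor_preds.dtype)
    for i in range(n_samples):
        labels, counts = np.unique(per_sensor_preds[:, i], return_counts=True)
        fused[i] = labels[np.argmax(counts)]  # ties resolve to smallest label
    return fused

# Example: on sample 1, one sensor's classifier errs; the vote corrects it.
preds = np.array([[0, 2, 1],
                  [0, 1, 1],
                  [0, 1, 1]])
print(fuse_predictions(preds))  # -> [0 1 1]
```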




Embodiment Construction

[0062] In order to make the technical problems solved by the present invention, the technical solutions adopted, and the technical effects achieved clearer, the present invention is described in further detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here serve only to explain the present invention, not to limit it. In addition, it should be noted that, for convenience of description, the drawings show only the parts related to the present invention rather than all of the content.

[0063] Figure 1 is a flowchart of the implementation of the human action recognition method based on multi-sensor data fusion of the present invention. As shown in Figure 1, the specific process of the method provided by the embodiment of the present invention is as follows:

[0064] Step 100, using N inertial sensor ...
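The excerpt truncates here, but according to the abstract the steps that follow segment each sensor node's data stream with a sliding window and then extract features from each segment. A minimal sketch of sliding-window segmentation is given below; the window length of 128 samples and the 50% overlap are assumptions for illustration, as the excerpt does not specify these parameters.

```python
# Minimal sketch of the sliding-window segmentation step described in the
# abstract. Window length and 50% overlap are assumed for illustration;
# the patent excerpt does not specify these parameters.
import numpy as np

def sliding_windows(stream: np.ndarray, win_len: int = 128,
                    overlap: float = 0.5) -> np.ndarray:
    """stream: (n_samples, n_channels) readings from one inertial sensor node.
    Returns an array of shape (n_windows, win_len, n_channels)."""
    step = max(1, int(win_len * (1.0 - overlap)))
    windows = [stream[s:s + win_len]
               for s in range(0, len(stream) - win_len + 1, step)]
    return np.stack(windows)

# Example: 1000 samples of 6-axis (accelerometer + gyroscope) data from one node.
node_data = np.random.randn(1000, 6)
segments = sliding_windows(node_data)
print(segments.shape)  # (14, 128, 6) with the assumed parameters
```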



Abstract

The present invention relates to the field of human motion recognition and provides a human motion recognition method based on multi-sensor data fusion, which includes: using N inertial sensor nodes fixed on different parts of the human body to collect human motion data; segmenting the data collected by each sensor node into windows using sliding-window segmentation, obtaining multiple action data segments per node; performing feature extraction on the action data segments of each sensor node to obtain the corresponding feature vectors; applying the RLDA algorithm to reduce the dimensionality of each sensor node's feature vectors; using the dimensionality-reduced feature vectors of each sensor node as training data for parameter training and modeling, obtaining the corresponding layered fusion model; and using the obtained layered fusion model to recognize human actions. The invention effectively overcomes the drawbacks of a single classifier in the recognition process and improves the recognition accuracy of human body actions.
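The abstract names the RLDA algorithm for feature dimensionality reduction but does not expand the acronym. Assuming RLDA stands for regularized linear discriminant analysis, a shrinkage-regularized LDA such as scikit-learn's gives the flavor of this step; the data sizes below are illustrative only.

```python
# Minimal sketch of the RLDA dimensionality-reduction step. The excerpt does
# not expand "RLDA"; regularized (shrinkage) linear discriminant analysis is
# assumed here, using scikit-learn's LDA with automatic shrinkage.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n_windows, n_features, n_classes = 300, 60, 5   # illustrative sizes only
X = rng.normal(size=(n_windows, n_features))    # per-window feature vectors
y = rng.integers(0, n_classes, size=n_windows)  # action labels

# The 'eigen' solver supports shrinkage regularization; LDA projects onto at
# most n_classes - 1 discriminant directions.
rlda = LinearDiscriminantAnalysis(solver="eigen", shrinkage="auto")
X_reduced = rlda.fit_transform(X, y)
print(X_reduced.shape)  # (300, 4)
```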

Description

Technical Field

[0001] The invention relates to the field of human motion recognition, and in particular to a human motion recognition method based on multi-sensor data fusion.

Background

[0002] Human action recognition technology is a human-computer interaction method that has emerged in recent decades, and it has gradually become a hot research topic for scholars at home and abroad. Human body movements mainly refer to the ways in which the human body moves and to people's responses to the environment or to objects. The human body describes or expresses complex behavior through complex movements of the limbs; it can be said that most human actions are reflected through the movement of the body's limbs, so studying and exploring limb motion becomes a very effective way to analyze human movement. Human action recognition based on inertial sensors is an emerging research field within pattern recognition. Its ...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06K9/00, G06K9/34, G06K9/46, G06K9/62
CPC: G06V40/20, G06V10/267, G06V10/40, G06F18/213, G06F18/25
Inventors: 王哲龙, 郭明, 王英睿, 赵红宇, 仇森
Owner: DALIAN UNIV OF TECH