
A human motion recognition method based on multi-sensor data fusion

A human action recognition and data fusion technology, applied in character and pattern recognition, instruments, computer parts, etc. It addresses problems such as single-classifier decision errors, and achieves the effects of improved robustness, reduced disturbance, and better algorithm performance.

Active Publication Date: 2018-12-25
DALIAN UNIV OF TECH
Cites: 3 · Cited by: 16

AI Technical Summary

Problems solved by technology

[0004] The present invention mainly addresses the technical problems that, in the prior art, no single classification algorithm is suitable for recognizing all human actions and that a single classifier makes certain decision errors. To this end, it proposes a human action recognition method based on multi-sensor data fusion, which effectively overcomes the drawbacks of a single classifier in the recognition process; the recognition results obtained with the proposed layered fusion model are clearly better than those of traditional recognition methods.
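The single-classifier drawback that the layered fusion model is meant to overcome can be illustrated with a minimal decision-level fusion sketch. This is plain weighted majority voting, not the patent's exact fusion rule; the function name and weights are illustrative assumptions:

```python
import numpy as np

def fuse_decisions(decisions, weights=None):
    """Fuse per-classifier class decisions by weighted majority vote.

    `decisions` holds one predicted label per base classifier;
    `weights` optionally encodes each classifier's reliability.
    """
    decisions = np.asarray(decisions)
    weights = np.ones(len(decisions)) if weights is None else np.asarray(weights, float)
    classes = np.unique(decisions)
    # Sum the vote weight received by each candidate class.
    scores = {c: weights[decisions == c].sum() for c in classes}
    return max(scores, key=scores.get)

# Two of three base classifiers say "walk", so the fused decision is
# "walk" even though one classifier erred -- the point of fusing.
print(fuse_decisions(["walk", "run", "walk"]))  # -> walk
```

A fused decision is robust to any one classifier's error as long as the remaining (weighted) majority is correct, which is the intuition behind replacing a single classifier with a fusion model.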

Method used


Image

  • A human motion recognition method based on multi-sensor data fusion

Examples

Experimental program
Comparison scheme
Effect test

Embodiment Construction

[0062] To make the technical problems solved by the present invention, the technical solutions adopted, and the technical effects achieved clearer, the present invention is described in further detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention, not to limit it. In addition, it should be noted that, for convenience of description, the drawings show only the parts related to the present invention rather than the entire content.

[0063] Figure 1 is a flowchart of the human action recognition method based on multi-sensor data fusion of the present invention. As shown in Figure 1, the method provided by the embodiment of the present invention proceeds as follows:

[0064] Step 100, using N inertial sensor ...
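After Step 100 collects data from the N inertial sensor nodes, each node's stream is cut into overlapping segments (the sliding-window segmentation named in the abstract). A minimal sketch follows; the 6-channel layout, 128-sample window, and 50% overlap are assumed typical values, not parameters taken from the patent:

```python
import numpy as np

def sliding_windows(signal, win_len, step):
    """Segment a (T, C) multi-channel signal into overlapping windows.

    Returns an array of shape (num_windows, win_len, C), where each
    window starts `step` samples after the previous one.
    """
    starts = range(0, signal.shape[0] - win_len + 1, step)
    return np.stack([signal[s:s + win_len] for s in starts])

# 1000 samples of 6-channel inertial data (e.g. a 3-axis accelerometer
# plus a 3-axis gyroscope, assumed); 128-sample windows, 50% overlap.
data = np.random.randn(1000, 6)
segments = sliding_windows(data, win_len=128, step=64)
print(segments.shape)  # (14, 128, 6)
```

Each of the resulting segments then feeds the feature-extraction step, yielding one feature vector per window per sensor node.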



Abstract

The invention relates to the field of human body motion recognition, and provides a human body motion recognition method based on multi-sensor data fusion. The method comprises the following steps: human body motion data are collected using N inertial sensor nodes fixed at different parts of the human body; sliding-window segmentation is applied to the human motion data collected from each sensor node to obtain multiple motion data segments per node; feature extraction is carried out on each sensor node's motion data segments to obtain the corresponding feature vectors; an RLDA algorithm is used to reduce the dimension of each sensor node's feature vectors; the reduced-dimension feature vectors of each sensor node are used as training data for parameter training and modeling, yielding the corresponding hierarchical fusion model; and the hierarchical fusion model is then used to perform human motion recognition. The invention effectively overcomes the drawbacks of a single classifier in the recognition process and effectively improves human motion recognition accuracy.
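The dimension-reduction step of the pipeline above can be sketched as follows, assuming RLDA here means regularized linear discriminant analysis, i.e. LDA whose within-class scatter is shrunk toward the identity; the patent's exact formulation may differ, and the function name, regularization constant, and toy data are illustrative:

```python
import numpy as np

def rlda_fit(X, y, n_components, reg=1e-3):
    """Fit a regularized-LDA projection matrix.

    The within-class scatter Sw is shrunk toward the identity
    (Sw + reg * I) so it stays invertible even when the feature
    dimension is large relative to the number of samples.
    """
    d = X.shape[1]
    mean_all = X.mean(axis=0)
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean_all)[:, None]
        Sb += len(Xc) * (diff @ diff.T)
    # Leading eigenvectors of (Sw + reg*I)^-1 Sb span the projection.
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(Sw + reg * np.eye(d), Sb))
    order = np.argsort(eigvals.real)[::-1]
    return eigvecs.real[:, order[:n_components]]

# Toy two-class data: 10-D feature vectors reduced to 1-D.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(2.0, 1.0, (50, 10)), rng.normal(0.0, 1.0, (50, 10))])
y = np.array([0] * 50 + [1] * 50)
W = rlda_fit(X, y, n_components=1)
Z = X @ W  # reduced-dimension feature vectors
print(Z.shape)  # (100, 1)
```

The reduced vectors Z are what each sensor node would contribute as training data to the hierarchical fusion model.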

Description

Technical field

[0001] The invention relates to the field of human motion recognition, in particular to a human motion recognition method based on multi-sensor data fusion.

Background technique

[0002] Human action recognition technology is a human-computer interaction method that has emerged in recent decades and has gradually become a hot research topic for scholars at home and abroad. Human body movements mainly refer to the way the human body moves, as well as people's responses to the environment or to objects. The human body describes or expresses complex actions through movements of the limbs; it can be said that most human actions are reflected through limb movement. Studying and analyzing limb movement is therefore a very effective way to analyze human motion. Human action recognition based on inertial sensors is an emerging research field within pattern recognition. Its ...

Claims


Application Information

IPC(8): G06K9/00, G06K9/34, G06K9/46, G06K9/62
CPC: G06V40/20, G06V10/267, G06V10/40, G06F18/213, G06F18/25
Inventor: 王哲龙, 郭明, 王英睿, 赵红宇, 仇森
Owner DALIAN UNIV OF TECH