
A Motion Recognition Method Based on Feature Selection

A motion recognition and feature selection technology, applied in the field of motion recognition, which solves the problems of low motion recognition accuracy and low motion recognition efficiency, and achieves the effects of reducing feature extraction time, reducing feature dimensions and improving efficiency.

Active Publication Date: 2022-04-05
HARBIN INST OF TECH

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to solve the problems of low motion recognition accuracy and low motion recognition efficiency in existing motion recognition methods.



Examples


Specific Embodiment 1

[0018] Specific embodiment 1: As shown in Figure 1, the motion recognition method based on feature selection described in this embodiment comprises the following steps:

[0019] Step 1: Data are collected separately for each of the M_0 actions of the human body. When the data of each action are collected, a three-axis accelerometer, a three-axis gyroscope and a three-axis magnetometer are used simultaneously. The M_0 actions contain N_0 action types;

[0020] The collected three-axis accelerometer data, three-axis gyroscope data, three-axis magnetometer data and three-axis attitude angle data (the three-axis attitude angle data are obtained from the three-axis gyroscope) are taken as the raw data;

[0021] The raw data are preprocessed to obtain preprocessed data, and action interception is performed on the preprocessed data to obtain action segment data, where each action corresponds to one piece of action segment data;

[0022] Action segme...
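The excerpt above does not spell out the preprocessing or action-interception algorithms, so the following Python sketch only illustrates one plausible pipeline for Step 1: a moving-average filter as the preprocessing step and a simple energy-threshold rule for cutting the stream into action segments. The filter, the threshold rule, the channel layout (12 columns: accelerometer, gyroscope, magnetometer and attitude angle, three axes each) and all parameter values are assumptions for illustration, not details taken from the patent.

```python
import numpy as np

def preprocess(raw, window=5):
    """Smooth each of the 12 raw channels (acc, gyro, mag, attitude angle,
    three axes each) with a moving-average filter.
    The filter choice is an assumption; the patent only says 'preprocess'."""
    kernel = np.ones(window) / window
    return np.apply_along_axis(
        lambda col: np.convolve(col, kernel, mode="same"), 0, raw)

def intercept_actions(data, energy_thresh=0.5, min_len=50):
    """Cut the preprocessed stream into action segments.
    Here a segment is any run of samples whose accelerometer energy
    exceeds a threshold (a hypothetical interception criterion)."""
    acc = data[:, 0:3]                     # assume first 3 columns are the accelerometer
    energy = np.sum(acc ** 2, axis=1)
    active = energy > energy_thresh
    segments, start = [], None
    for t, flag in enumerate(active):
        if flag and start is None:
            start = t
        elif not flag and start is not None:
            if t - start >= min_len:
                segments.append(data[start:t])
            start = None
    if start is not None and len(data) - start >= min_len:
        segments.append(data[start:])
    return segments                        # one array per action segment

# Usage (hypothetical file and layout):
# raw = np.loadtxt("imu_recording.csv", delimiter=",")
# segments = intercept_actions(preprocess(raw))
```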

Specific Embodiment 2

[0031] Specific embodiment 2: The difference between this embodiment and specific embodiment 1 is that the statistical features include the mean feature, standard deviation feature, maximum feature, minimum feature, median absolute deviation feature, interquartile range feature and correlation feature;

[0032] The signal time-frequency features include the energy mean feature, signal magnitude area feature, signal entropy feature, maximum-magnitude amplitude feature, maximum-magnitude frequency feature, second-largest-magnitude amplitude feature, second-largest-magnitude frequency feature, mean normalized frequency feature, amplitude distribution kurtosis feature and amplitude distribution skewness feature;

[0033] The complex modeling features include autoregressive coefficient features and Mel frequency cepstrum features.
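As an illustration of the feature families listed in this embodiment, the sketch below computes a subset of the statistical and signal time-frequency features per channel of an action segment. The exact definitions used in the patent are not given in this excerpt, so the formulas below are common textbook choices and the sampling rate `fs` is an assumed parameter.

```python
import numpy as np
from scipy import stats

def statistical_features(seg):
    """seg: (T, C) action segment. Returns some of the statistical features
    named above: mean, std, max, min, median absolute deviation, IQR."""
    feats = [
        seg.mean(axis=0),
        seg.std(axis=0),
        seg.max(axis=0),
        seg.min(axis=0),
        np.median(np.abs(seg - np.median(seg, axis=0)), axis=0),          # MAD
        np.percentile(seg, 75, axis=0) - np.percentile(seg, 25, axis=0),  # IQR
    ]
    return np.concatenate(feats)

def time_frequency_features(seg, fs=100.0):
    """A subset of the time-frequency features: energy mean, signal magnitude
    area, maximum-magnitude amplitude/frequency, and the kurtosis and skewness
    of the amplitude spectrum. fs (sampling rate in Hz) is an assumption."""
    spec = np.abs(np.fft.rfft(seg, axis=0))
    freqs = np.fft.rfftfreq(seg.shape[0], d=1.0 / fs)
    energy_mean = (seg ** 2).mean(axis=0)
    sma = np.abs(seg).sum(axis=0) / seg.shape[0]   # signal magnitude area
    peak_idx = spec.argmax(axis=0)
    peak_amp = spec.max(axis=0)                    # maximum-magnitude amplitude
    peak_freq = freqs[peak_idx]                    # maximum-magnitude frequency
    kurt = stats.kurtosis(spec, axis=0)
    skew = stats.skew(spec, axis=0)
    return np.concatenate([energy_mean, sma, peak_amp, peak_freq, kurt, skew])
```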

Specific Embodiment 3

[0034] Specific embodiment 3: The difference between this embodiment and specific embodiment 2 is that the features in step 2 are screened by using the division count index and the information gain index. The specific process is as follows:

[0035] Division count index

[0036] The base model of the extreme gradient boosting tree model is the classification and regression tree. The direction in which data proceed at a branch node is determined by the relationship between the node's feature value and its threshold, so the number of times a feature is selected to split a node directly reflects that feature's role in the model's decisions. One way to evaluate the importance of a feature is therefore to take, as its score, the total number of times the feature variable is used for splitting across all decision trees in the model;

[0037] Note that the extreme gradient boosting tree model integrates a total of Z decision trees, and the feature of a certain dimension x_i in the decisi...
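The division count and information gain indices described above correspond to the "weight" and "gain" importance types exposed by the XGBoost library, so one way to obtain both scores for a trained extreme gradient boosting tree model is sketched below. The training data, class count and booster parameters are placeholders for illustration, not values from the patent.

```python
import numpy as np
import xgboost as xgb

# Placeholder training data: rows = action segments, columns = extracted features.
X = np.random.rand(200, 30)
y = np.random.randint(0, 5, size=200)          # 5 hypothetical action classes

booster = xgb.train(
    {"objective": "multi:softmax", "num_class": 5, "max_depth": 4},
    xgb.DMatrix(X, label=y),
    num_boost_round=50,
)

# 'weight' = number of times each feature is used to split a node across
# all Z trees, i.e. the division count index of this embodiment.
split_counts = booster.get_score(importance_type="weight")

# 'gain' = average information gain contributed by the feature's splits,
# corresponding to the information gain index.
info_gain = booster.get_score(importance_type="gain")

# Rank features by division count; how the two indices are combined into a
# final feature-selection decision follows the scheme described in the patent.
ranking = sorted(split_counts.items(), key=lambda kv: kv[1], reverse=True)
print(ranking[:10])
```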



Abstract

The invention relates to a motion recognition method based on feature selection, which belongs to the technical field of motion recognition. The invention solves the problems of low motion recognition accuracy and low motion recognition efficiency in existing motion recognition methods. In order to avoid overlap in the distinguishing power of some features, the present invention constructs a set of scientific feature evaluation indices, selects the best feature combination scheme, and, combined with the extreme gradient boosting tree algorithm, achieves a motion recognition accuracy of 97.99%. Compared with the existing method, under the simplification scheme of the present invention the number of feature types is reduced by 8, the feature extraction time is reduced by 2.62%, and the feature dimension is reduced by 20.78%, which effectively improves the efficiency of motion recognition. The invention can be applied to the technical field of motion recognition.

Description

Technical field

[0001] The invention belongs to the technical field of motion recognition, and in particular relates to a motion recognition method based on feature selection.

Background technique

[0002] Motion capture systems based on inertial measurement units (IMUs) are the most promising in terms of commercial development: they can be used almost anywhere and truly realize motion data acquisition independent of scene constraints. A number of full-body human motion capture systems have been used in the computer graphics and animation industries. Commercial motion capture systems are mainly limited by the size and weight of a single module. In this regard, MEMS gyroscopes and accelerometers have advantages that conventional inertial sensors cannot match, such as small size, high integration, suitability for mass production, low cost and ease of use. Using an IMU measurement unit based on MEMS components to study the implementation of the motio...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): A61B5/11; A61B5/00
CPC: A61B5/11; A61B5/7267; A61B2562/0219
Inventors: 王奇, 伊国兴, 缪若琳, 孙一为, 魏振楠
Owner: HARBIN INST OF TECH