
Human motion state discrimination method based on densely connected convolutional neural network

A convolutional neural network and dense connection technology, applied in the field of human motion state discrimination, which addresses problems such as gait data that is unfavorable for subsequent machine learning, and achieves the effects of reducing the number of supporting measurement devices and the number of required features while keeping the complexity of the network model low.

Active Publication Date: 2019-10-15
BEIHANG UNIV

AI Technical Summary

Problems solved by technology

Previous gait analysis based on video or image sequences (Ben X Y, Xu S, Wang K J, et al. Review on Pedestrian Gait Feature Expression and Recognition. Pattern Recognition and Artificial Intelligence, 2012, 25(1): 71-81.) is vulnerable to factors such as changes in scene lighting and occlusion of the moving target, which is not conducive to subsequent machine learning.




Embodiment Construction

[0023] The present invention will be further described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0024] According to an embodiment of the present invention, a motion intention recognition method based on the dynamic time series of a moving subject is proposed. Acceleration and angular velocity information from the left leg, right leg and waist of the subject is collected simultaneously with plantar pressure information, and the combined signals are used to divide the movement phases and distinguish the movement state. On this basis, the motion state discrimination method is realized by constructing a densely connected convolutional neural network.
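For orientation only, the following is a minimal PyTorch sketch of what such a densely connected convolutional block might look like for 1-D gait sequences. The channel counts, growth rate, kernel size and layer count are illustrative assumptions, not values taken from the patent.

```python
# Minimal sketch of the dense connection idea for 1-D gait sequences,
# assuming PyTorch; all sizes below are illustrative assumptions.
import torch
import torch.nn as nn


class DenseLayer1d(nn.Module):
    """One conv layer whose output is concatenated with its input."""

    def __init__(self, in_channels: int, growth: int, kernel_size: int = 3):
        super().__init__()
        self.conv = nn.Sequential(
            nn.BatchNorm1d(in_channels),
            nn.ReLU(inplace=True),
            nn.Conv1d(in_channels, growth, kernel_size, padding=kernel_size // 2),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Dense connection: the next layer sees both the incoming feature
        # map and the new feature map produced by this convolution.
        return torch.cat([x, self.conv(x)], dim=1)


class DenseBlock1d(nn.Module):
    """Stack of dense layers; channel count grows by `growth` per layer."""

    def __init__(self, in_channels: int, growth: int, n_layers: int):
        super().__init__()
        layers, ch = [], in_channels
        for _ in range(n_layers):
            layers.append(DenseLayer1d(ch, growth))
            ch += growth
        self.block = nn.Sequential(*layers)
        self.out_channels = ch

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.block(x)


if __name__ == "__main__":
    # Hypothetical input: 21 channels (3 sensor nodes x 6 IMU axes plus
    # 3 plantar-pressure channels), window of 128 samples, batch of 8.
    x = torch.randn(8, 21, 128)
    block = DenseBlock1d(in_channels=21, growth=12, n_layers=4)
    print(block(x).shape)  # torch.Size([8, 69, 128]) -> 21 + 4 * 12 channels
```

A classification head (e.g. pooling followed by a linear layer over the motion-state classes) would sit on top of such a block; the patent text gated behind the listing does not spell out those details here.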

[0025] The flow of the motion state discrimination method based on the gait-information convolutional neural network provided by the present invention is described below. Its steps include:

[0026] 1. Gait data collection: The method of combining the plant...
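The rest of this step is truncated in the listing. Purely as an illustration of how the multi-dimensional fusion of IMU and plantar-pressure channels could be assembled into one input sample, here is a hedged NumPy sketch; the window length, channel layout and number of pressure channels are assumptions.

```python
# Hypothetical sketch of fusing the three sensor nodes into one training
# window; array shapes and channel layout are assumptions, not patent values.
import numpy as np

WINDOW = 128  # samples per window (assumed)

# Per node: 3-axis acceleration + 3-axis angular velocity, shape (6, T).
left_shank_imu = np.random.randn(6, WINDOW)   # LS node
right_shank_imu = np.random.randn(6, WINDOW)  # RS node
waist_imu = np.random.randn(6, WINDOW)        # L5 (waist) node

# Plantar pressure channels (number of pressure cells is an assumption).
plantar_pressure = np.random.randn(3, WINDOW)

# Multi-dimensional fusion: stack all channels into one (C, T) sample that
# the densely connected CNN can consume without hand-crafted features.
sample = np.concatenate(
    [left_shank_imu, right_shank_imu, waist_imu, plantar_pressure], axis=0
)
print(sample.shape)  # (21, 128)
```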



Abstract

The invention provides a gait data acquisition mode based on multi-dimensional information fusion and a motion state discrimination method based on a densely connected convolutional neural network. In the multi-dimensional information fusion gait data acquisition process, acceleration, angular velocity and plantar pressure information of the left shank (LS), right shank (RS) and waist (L5) of an experimental subject during straight going, left turning, right turning and stair climbing is measured at the same time; the operability is high and the complexity is low. Because the densely connected convolutional neural network works on the original gait sequence, neither feature extraction nor prior knowledge is needed; combined with feature selection, the number of required supporting measurement devices and the number of required features are reduced, and the complexity of the network model is lowered. The input of the next layer is formed by concatenating the input feature map and the output feature map of the intermediate convolution layer, which markedly improves the accuracy of the network. The accuracy of single-person multi-task motion state judgment (straight going, left turning and right turning) on the self-collected data set reaches 99.1%, an improvement over the 91.79% achieved by an SVM.
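The abstract contrasts the dense CNN with an SVM baseline (91.79%). For orientation, a baseline of that kind might be set up as in the following scikit-learn sketch; the data, features, split and hyperparameters are placeholders, not the patent's experimental setup.

```python
# Hedged sketch of an SVM baseline on flattened gait windows, assuming
# scikit-learn; the dataset below is synthetic placeholder data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder dataset: 400 gait windows of 21 channels x 128 samples,
# labels 0..3 for straight going, left turn, right turn, stair climbing.
X = rng.normal(size=(400, 21, 128)).reshape(400, -1)  # flatten each window
y = rng.integers(0, 4, size=400)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

clf = SVC(kernel="rbf", C=1.0).fit(X_train, y_train)
print("baseline accuracy:", clf.score(X_test, y_test))
```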

Description

Technical Field

[0001] The invention provides a human body motion state discrimination method based on a densely connected convolutional neural network. It offers a multi-dimensional information fusion approach to gait information collection and a new analysis method for motion state discrimination oriented to gait information, and belongs to the field of human gait recognition and pattern recognition.

Background Technique

[0002] Human gait recognition technology aims to analyze the gait acceleration data of a moving subject and reach a qualitative judgment of the subject's gait. Previous gait analysis based on video or image sequences (Ben X Y, Xu S, Wang K J, et al. Review on Pedestrian Gait Feature Expression and Recognition. Pattern Recognition and Artificial Intelligence, 2012, 25(1): 71-81.) is vulnera...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K 9/00; G06K 9/62
CPC: G06V 40/25; G06F 18/241
Inventors: 张斌 (Zhang Bin), 刘宇 (Liu Yu), 李阳 (Li Yang)
Owner: BEIHANG UNIV