
Human body motion state classification method based on improved convolutional neural network

A convolutional neural network and human motion technology, applied to neural learning methods, biological neural network models, neural architectures, etc. It can solve problems such as network over-fitting and non-optimal network design, with the effects of improving robustness, reducing parameter volume, and improving training speed and classification accuracy.

Pending Publication Date: 2019-07-23
SHANGHAI FIRE RES INST OF MEM

AI Technical Summary

Problems solved by technology

However, a conventional convolutional neural network does not optimize the network design for the characteristics of human motion states, so it requires larger training data sets and more network layers, which easily leads to over-fitting of the network.

Method used



Examples


Embodiment Construction

[0031] In order to make the object, technical solution and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with the embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit it.

[0032] The application principle of the present invention will be described in detail below in conjunction with the accompanying drawings.

[0033] As shown in Figure 1, the human body motion state classification method based on the improved convolutional neural network provided by the embodiment of the present invention includes the following steps:

[0034] S101: According to measured biomechanical experimental data, an empirical model of human walking and running is proposed; and based on the XYZ convention, the Euler rotation matrix is used to establish and perfect the human motion animation model of the sta...
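Step S101 relies on an Euler rotation matrix under the XYZ convention to animate body segments. The visible text does not disclose the concrete joint angles or segment lengths, so the following is only a minimal Python sketch of that rotation machinery, with a hypothetical lower-leg segment and flexion angle standing in for the real animation model:

```python
import numpy as np

def rot_x(a):
    """Rotation matrix about the X axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0],
                     [0, c, -s],
                     [0, s,  c]])

def rot_y(b):
    """Rotation matrix about the Y axis by angle b (radians)."""
    c, s = np.cos(b), np.sin(b)
    return np.array([[ c, 0, s],
                     [ 0, 1, 0],
                     [-s, 0, c]])

def rot_z(g):
    """Rotation matrix about the Z axis by angle g (radians)."""
    c, s = np.cos(g), np.sin(g)
    return np.array([[c, -s, 0],
                     [s,  c, 0],
                     [0,  0, 1]])

def euler_xyz(alpha, beta, gamma):
    """Composite Euler rotation: extrinsic rotations about the fixed
    X, then Y, then Z axes (one common reading of the XYZ convention)."""
    return rot_z(gamma) @ rot_y(beta) @ rot_x(alpha)

# Example: swing a hypothetical lower-leg segment (knee-to-ankle vector, in
# metres) by a time-varying flexion angle over one gait cycle, as a simple
# animation model might do.
segment = np.array([0.0, 0.0, -0.45])
t = np.linspace(0.0, 1.0, 50)
flexion = np.deg2rad(30.0) * np.sin(2 * np.pi * t)
positions = np.stack([euler_xyz(a, 0.0, 0.0) @ segment for a in flexion])
print(positions.shape)  # (50, 3) segment end positions over the cycle
```

The composition order shown is only one interpretation of the XYZ convention; the patent's own animation model may define the axes, order, or joint hierarchy differently.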



Abstract

The invention first simplifies a human body target motion into a compound motion of typical motion forms, and proposes an empirical model of human walking and running according to measured biomechanical experimental data. On the basis of the XYZ convention, an Euler rotation matrix is used for establishing and perfecting human body motion animation models under the state of marching on the spot and the walking and running states. Then, on the basis of a continuous-wave signal, an integral radar echo model of a walking or running human target and radar echo models of each position of the target are constructed, and a Matlab simulation tool is used to analyze the radar echo characteristics under different human motion states and to generate the corresponding time-frequency graphs through the short-time Fourier transform, so as to lay a theoretical foundation for extracting the characteristics of the moving human target and its walking and running states. The time-frequency graphs of the human motion are divided into a training set, a verification set and a test set and input into the improved convolutional neural network for training; the network parameters are adjusted to achieve model convergence, so that the network can correctly classify the human body motion states.
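The abstract describes splitting the time-frequency graphs into training, verification and test sets and training an improved convolutional neural network until it converges. The architecture of that improved network is not given in the visible text, so the sketch below only illustrates the overall workflow with a deliberately compact PyTorch model and randomly generated placeholder spectrograms; the layer sizes, class labels and hyperparameters are assumptions, not the patent's design:

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset, random_split

# Placeholder data: spectrogram "images" (1 channel, 64x64) with 3 assumed
# motion classes (e.g. marching on the spot, walking, running). In practice
# these would be the STFT time-frequency graphs of the radar echoes.
x = torch.randn(600, 1, 64, 64)
y = torch.randint(0, 3, (600,))
train_set, val_set, test_set = random_split(TensorDataset(x, y), [400, 100, 100])

class CompactCNN(nn.Module):
    """Small CNN with few parameters; a stand-in for the patent's
    'improved' network, whose exact architecture is not disclosed here."""
    def __init__(self, num_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.BatchNorm2d(8),
            nn.ReLU(), nn.MaxPool2d(2),            # 64 -> 32
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.BatchNorm2d(16),
            nn.ReLU(), nn.MaxPool2d(2),            # 32 -> 16
            nn.AdaptiveAvgPool2d(1),               # global average pooling
        )
        self.classifier = nn.Linear(16, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = CompactCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)
val_loader = DataLoader(val_set, batch_size=32)

for epoch in range(5):
    model.train()
    for xb, yb in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(xb), yb)
        loss.backward()
        optimizer.step()
    model.eval()
    correct = 0
    with torch.no_grad():
        for xb, yb in val_loader:
            correct += (model(xb).argmax(dim=1) == yb).sum().item()
    print(f"epoch {epoch}: validation accuracy {correct / len(val_set):.2f}")
```

A small, globally pooled network of this kind keeps the parameter count low, which is in the spirit of the stated goals of reducing parameter volume and improving training speed, but the actual improved architecture would have to be taken from the full patent description.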

Description

Technical field

[0001] The invention belongs to the technical field of human body motion state classification based on radar detection, and in particular relates to a human body motion state classification method based on an improved convolutional neural network.

Background technique

[0002] Human target detection has important applications in anti-terrorism, post-disaster search and rescue, and other occasions, and radar-based human motion state classification technology has become a research hot spot in recent years. For a human body in motion, the life-detection radar exploits the micro-Doppler modulation imposed on the radar transmission signal by the swing of the human trunk and limbs, extracts the micro-Doppler characteristics of the echo signal, and thereby realizes the detection of the human body and the identification of its movement state. The modulation of the echo signal by the motion is a non-stationary process, and the limb swing amplitude is much larger than the vibration amplitude...
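The background passage describes a continuous-wave life-detection radar whose echo is micro-Doppler modulated by the swing of the trunk and limbs, with the short-time Fourier transform turning the echo into a time-frequency graph. As a rough illustration of that signal model (the carrier frequency, ranges, swing amplitude and swing rate below are invented for the example, not taken from the patent), a two-scatterer echo and its STFT can be simulated in Python:

```python
import numpy as np
from scipy.signal import stft

# Hypothetical parameters: a 2.4 GHz continuous-wave radar observing a torso
# moving at constant speed plus one limb swinging at roughly the gait rate.
fc = 2.4e9                      # carrier frequency (Hz)
wavelength = 3e8 / fc           # wavelength (m)
fs = 1000.0                     # slow-time sampling rate (Hz)
t = np.arange(0, 4, 1 / fs)     # 4 s observation window

r_torso = 5.0 - 1.5 * t                                 # torso range: walking toward radar
r_limb = r_torso + 0.3 * np.sin(2 * np.pi * 2.0 * t)    # limb swings at ~2 Hz

# Baseband echo: each scatterer contributes a phase of -4*pi*r(t)/lambda;
# the limb term produces micro-Doppler sidebands around the torso Doppler line.
echo = (np.exp(-1j * 4 * np.pi * r_torso / wavelength)
        + 0.5 * np.exp(-1j * 4 * np.pi * r_limb / wavelength))

# Short-time Fourier transform -> time-frequency graph used as CNN input.
f, tau, Z = stft(echo, fs=fs, nperseg=128, noverlap=96, return_onesided=False)
spectrogram = np.abs(Z)
print(spectrogram.shape)        # (frequency bins, time frames)
```

In the resulting spectrogram the limb term oscillates around the torso Doppler line; it is these micro-Doppler signatures that the classification network is meant to distinguish between motion states.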

Claims


Application Information

IPC(8): G01S7/41; G06N3/04; G06N3/08; G06K9/62
CPC: G01S7/415; G01S7/417; G06N3/08; G06N3/045; G06F18/24
Inventor: 杨昀, 李震
Owner SHANGHAI FIRE RES INST OF MEM