
Radar-simulation-image-based human body motion classification method of a convolution neural network

A method applying convolutional neural networks to human motion classification, in the field of radar target classification and deep learning, which can address problems such as partial occlusion, individual differences between human bodies, and multi-person recognition.

Active Publication Date: 2017-09-15
TIANJIN UNIV

AI Technical Summary

Problems solved by technology

However, conditions such as different lighting, viewing angles, and backgrounds can produce differences in the poses and characteristics of the same human action.
In addition, problems such as human body self-occlusion, partial occlusion, individual differences between human bodies, and multi-person recognition remain. These are bottlenecks that existing vision-based human motion classification schemes find difficult to break through.



Examples


Embodiment Construction

[0019] To make the technical solution of the present invention clearer, the specific embodiments of the present invention are further described below. The present invention is realized according to the following steps:

[0020] 1. Radar time-frequency image dataset construction

[0021] (1) Radar image simulation based on MOCAP dataset

[0022] The motion capture (MOCAP) dataset was established by the Graphics Lab of CMU, using the Vicon motion capture system to capture real motion data. The system consists of 12 MX-40 infrared cameras, each with a frame rate of 120 Hz, which record 41 marker points on the subject's body; by integrating the images recorded by the different cameras, the movement trajectories of the subject's bones can be obtained. The dataset contains 2605 sets of experimental data. For this experiment, seven common actions were selected to generate radar images. These seven actions are: running, walking, ju...
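The radar image simulation step above turns the motion of body parts into a time-frequency (micro-Doppler) image. A minimal sketch of that idea, assuming a single oscillating point scatterer as a stand-in for a swinging limb and illustrative carrier, sampling, and window parameters (none of these values come from the patent):

```python
import numpy as np

# Hedged sketch: simulate a baseband radar return from one oscillating
# point scatterer and convert it to a time-frequency image with a manual
# short-time Fourier transform. All parameters are illustrative.
fs = 1000                      # pulse repetition frequency, Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)  # 2 s of slow time
fc = 77e9                      # carrier frequency, Hz (assumed)
c = 3e8                        # speed of light, m/s

# Radial range: torso at constant velocity plus a swinging limb.
r = 5.0 + 1.0 * t + 0.1 * np.sin(2 * np.pi * 2.0 * t)
phase = 4 * np.pi * fc * r / c        # two-way phase history
sig = np.exp(1j * phase)              # baseband radar return

# Manual STFT: sliding Hann window, one FFT per frame.
win, hop = 128, 32
frames = [sig[i:i + win] * np.hanning(win)
          for i in range(0, len(sig) - win + 1, hop)]
spec = np.abs(np.fft.fftshift(np.fft.fft(frames, axis=1), axes=1)).T
tf_image = 20 * np.log10(spec / spec.max() + 1e-12)  # dB-scaled image

print(tf_image.shape)  # (Doppler bins, time frames)
```

Each column of `tf_image` is a Doppler spectrum at one instant, so the limb's oscillation traces a sinusoidal micro-Doppler signature; images of this kind are what the network in step three classifies.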



Abstract

The invention relates to a radar-simulation-image-based human body motion classification method using a convolutional neural network. The method comprises: step one, establishing a time-frequency image dataset containing a variety of human motions; step two, augmenting the radar time-frequency image data; step three, establishing a convolutional neural network model; specifically, taking the handwriting recognition network LeNet, with its three convolutional layers, two pooling layers, and two fully connected layers, as a basis, replacing the original Sigmoid activation function with the rectified linear unit (ReLU) as the activation function of the convolutional network, adding a pooling layer and removing a fully connected layer to form a convolutional neural network structure comprising three convolutional layers, three pooling layers, and one fully connected layer, and adjusting the inter-layer and intra-layer structure of the network and the training parameters to achieve a good classification effect; and training the convolutional neural network model.
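The layer arithmetic implied by the abstract (three convolutional layers, each followed by a pooling layer, then one fully connected layer) can be sketched as a shape walk-through. The input size, kernel sizes, and channel counts below are illustrative assumptions; the patent fixes only the layer counts, the ReLU activation, and the seven action classes:

```python
# Hedged sketch: propagate spatial sizes through an assumed
# conv -> ReLU -> 2x2 max-pool pipeline, repeated three times,
# into one fully connected layer. Kernel/channel choices are
# illustrative, not the patent's actual configuration.

def conv_out(size, kernel, stride=1, pad=0):
    """Spatial size after a convolution (or pooling) window."""
    return (size + 2 * pad - kernel) // stride + 1

size, channels = 64, 1          # assumed 64x64 grayscale spectrogram
for k_conv, n_filters in [(5, 16), (5, 32), (3, 64)]:  # assumed kernels
    size = conv_out(size, k_conv)        # convolution (ReLU follows)
    size = conv_out(size, 2, stride=2)   # 2x2 max pooling, stride 2
    channels = n_filters

fc_inputs = channels * size * size       # flattened into the FC layer
n_classes = 7                            # seven actions in the dataset
print(size, channels, fc_inputs, n_classes)
```

The single fully connected layer then maps the flattened feature vector directly to the seven class scores, which is the main structural change from LeNet's two fully connected layers.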

Description

technical field

[0001] The invention belongs to the field of radar target classification and deep learning, and relates to the problem of using radar to classify human actions.

Background technique

[0002] When interacting with the outside world, in addition to communicating through voice, people often use body language, that is, convey information through actions. Human action classification has a wide range of application scenarios in many fields, such as intelligent monitoring, human-computer interaction, virtual reality, somatosensory games, and medical monitoring. Most current research on human action recognition focuses on vision-based recognition, the core of which is to process and analyze the original image or image-sequence data collected by a sensor through a computer, and thereby learn and understand human actions. However, conditions such as different lighting, viewing angles, and backgrounds will produce differences in poses and char...

Claims


Application Information

IPC(8): G06K 9/00; G06K 9/62; G06N 3/08
CPC: G06N 3/084; G06V 40/20; G06F 18/2193
Inventors: 侯春萍 (Hou Chunping), 郎玥 (Lang Yue), 杨阳 (Yang Yang), 黄丹阳 (Huang Danyang), 何元 (He Yuan)
Owner: TIANJIN UNIV