Deep convolutional adversarial neural network-based human body action radar image classification method

A radar image and neural network technology, applied in the fields of instruments, character and pattern recognition, and computer parts. It addresses the lack of a unified and effective framework, and of unified analysis and classification methods, in the field of human action and behavior recognition, and achieves improved classification accuracy.

Inactive Publication Date: 2018-10-16
TIANJIN UNIV

Problems solved by technology

These two problems leave the research field of human action and behavior recognition without a unified and effective framework, and without correspondingly unified and effective analysis and classification methods.



Embodiment Construction

[0022] To make the technical solution of the present invention clearer, specific embodiments of the present invention are further described below. The present invention is implemented according to the following steps:

[0023] 1. Radar time-frequency image dataset construction

[0024] The present invention uses the MOCAP dataset established by the Graphics Laboratory of Carnegie Mellon University. The dataset collects data based on the human ellipsoid motion model, which is derived from the Boulic human gait model, a global human gait model proposed by Boulic in 1990. This model describes the human target echo by dividing the human body into ten scattering parts: head, chest cavity, left upper arm, right upper arm, left forearm, right forearm, left thigh, right thigh, left calf, and right calf. Each limb movement follows its own motion-curve equation, and the echo of the human body is the sum of the echoes of all the different ...
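The superposition idea above can be sketched numerically: the composite radar echo is the sum of the returns of the individual scattering parts. The sketch below is illustrative only; the carrier frequency, amplitudes, and the sinusoidal range trajectories are hypothetical placeholders, not the Boulic motion-curve equations.

```python
import numpy as np

# Sketch of the superposition echo model: the radar return of a walking
# human is modeled as the sum of point-scatterer returns from ten body
# parts, each with its own range trajectory R_k(t).
fc = 24e9                      # carrier frequency (Hz), hypothetical radar
wavelength = 3e8 / fc
fs = 1000                      # slow-time sampling rate (Hz)
t = np.arange(0, 2.0, 1 / fs)  # 2 s observation window

def part_echo(amplitude, r0, swing_amp, swing_freq, radial_speed):
    """Return of one scattering part whose range swings sinusoidally
    about a steadily advancing mean (a crude stand-in for a gait curve)."""
    r = r0 + radial_speed * t + swing_amp * np.sin(2 * np.pi * swing_freq * t)
    return amplitude * np.exp(-1j * 4 * np.pi * r / wavelength)

# Ten parts: torso and head move steadily; limbs add micro-Doppler swings.
parts = [
    part_echo(1.0, 5.0, 0.00, 0.0, 1.0),   # chest cavity
    part_echo(0.6, 5.2, 0.01, 1.0, 1.0),   # head
    part_echo(0.3, 5.0, 0.15, 1.0, 1.0),   # left upper arm
    part_echo(0.3, 5.0, 0.15, 1.0, 1.0),   # right upper arm
    part_echo(0.2, 5.1, 0.25, 1.0, 1.0),   # left forearm
    part_echo(0.2, 5.1, 0.25, 1.0, 1.0),   # right forearm
    part_echo(0.4, 5.0, 0.20, 1.0, 1.0),   # left thigh
    part_echo(0.4, 5.0, 0.20, 1.0, 1.0),   # right thigh
    part_echo(0.3, 5.0, 0.35, 1.0, 1.0),   # left calf
    part_echo(0.3, 5.0, 0.35, 1.0, 1.0),   # right calf
]

echo = np.sum(parts, axis=0)   # total echo = sum over all scattering parts
print(echo.shape)              # one complex sample per slow-time instant
```

A short-time Fourier transform of such a composite echo yields the time-frequency (micro-Doppler) spectrogram that the classification method operates on.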



Abstract

The invention relates to a deep convolutional adversarial neural network-based human body action radar image classification method. The method comprises: constructing a dataset; realizing radar image data enhancement through a DCGAN, namely establishing the DCGAN, using the network to learn each radar spectrogram individually, generating new radar spectrograms from the characteristics the network has learned, expanding the training-set samples given a fixed amount of data, and tuning the network parameters so that as few failed images as possible are generated and the dataset is expanded to the maximum extent, thereby realizing the data enhancement; and extracting the upper, middle, and lower envelopes of each radar image as eigenvectors, taking these three eigenvectors as inputs to a support vector machine classifier, and classifying the radar image data with the support vector machine. The upper and lower envelopes represent the echo radial velocities of the human limbs, and the middle envelope represents the echo radial velocity of the human trunk.
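The envelope features described above can be sketched as follows. This is an illustrative reading, not the patent's exact procedure: it assumes the spectrogram is a power matrix with frequency bins on the rows, takes the upper/lower envelopes as the extreme frequencies whose power exceeds a threshold, and the middle envelope as the power-weighted centroid frequency; the threshold value is a hypothetical choice.

```python
import numpy as np

def envelope_features(spectrogram, freqs, threshold_db=-20.0):
    """Extract upper, middle, and lower envelopes from a radar spectrogram.

    spectrogram: (n_freq, n_time) power matrix, linear scale
    freqs:       (n_freq,) Doppler frequency of each row
    Returns three (n_time,) vectors: the highest and lowest frequencies
    whose power exceeds the threshold (limb radial velocities) and the
    power-weighted centroid frequency (trunk radial velocity).
    """
    power_db = 10 * np.log10(spectrogram / spectrogram.max() + 1e-12)
    mask = power_db > threshold_db
    n_time = spectrogram.shape[1]
    upper = np.zeros(n_time)
    lower = np.zeros(n_time)
    middle = np.zeros(n_time)
    for j in range(n_time):
        idx = np.nonzero(mask[:, j])[0]
        if idx.size == 0:
            continue  # no power above threshold in this time bin
        upper[j] = freqs[idx.max()]
        lower[j] = freqs[idx.min()]
        w = spectrogram[idx, j]
        middle[j] = np.sum(freqs[idx] * w) / np.sum(w)
    return upper, middle, lower

# Toy spectrogram: a single strong tone at +100 Hz in every time bin.
freqs = np.linspace(-500, 500, 101)       # 10 Hz per bin
spec = np.full((101, 8), 1e-6)
spec[60, :] = 1.0                         # freqs[60] = +100 Hz
u, m, l = envelope_features(spec, freqs)
print(u[0], m[0], l[0])                   # all three recover the tone
```

The three envelope vectors (or summary statistics of them) would then be concatenated and fed to the support vector machine classifier.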

Description

Technical Field

[0001] The invention belongs to the fields of human action recognition, radar target detection, data enhancement, deep convolutional generative adversarial networks (DCGAN, Deep Convolutional Generative Adversarial Networks), and machine learning, and relates to feature extraction from radar images and the use of DCGAN for data enhancement in the problem of human action classification.

Background Technique

[0002] Human action recognition [1] has been a research hotspot in the field of computer vision in recent years, and it is widely used in human-computer interaction, virtual reality, and video surveillance. However, the high complexity and variability of human motion mean that recognition efficiency and accuracy cannot yet fully meet the requirements of various industries. The difficulty of human action and behavior recognition lies mainly in spatial complexity and temporal variation. Spatial complexity includes action scenes with diffe...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/62
CPC: G06V40/23; G06F18/24323
Inventors: 侯春萍, 徐金辰, 杨阳, 郎玥
Owner: TIANJIN UNIV