Human body behavior recognition method based on multi-class spectrograms and composite convolutional neural network

A neural network and recognition method technology, applied in the field of human behavior recognition, that addresses the problems of incomplete feature extraction, low efficiency of manual feature extraction, and single-feature expression, and achieves the effect of overcoming low feature-extraction efficiency.


AI Technical Summary

Problems solved by technology

[0006] The purpose of the present invention is to provide a human behavior recognition method based on multi-class spectrograms and a composite convolutional neural network that overcomes the problems of current human behavior recognition methods: the single feature expression produced by a single time-frequency analysis, the incomplete feature extraction of a single network, and the low efficiency of manual feature extraction. The method can fully extract both the independent spatial features within a single spectrogram and the correlation features between different spectrograms, thereby improving discrimination accuracy.



Examples


Embodiment 1

[0055] Embodiment 1: see Figures 1 to 3. A human body behavior recognition method based on multi-class spectrograms and a composite convolutional neural network, comprising the following steps:

[0056] (1) Sample different human behaviors n times with a stepped-frequency continuous wave radar, recording the human behavior category of each sampling. Each sampling yields one human behavior sample, which is an N×M matrix, where N is the number of pulse periods and M is the number of step frequencies within a pulse period. The n samplings form a data set X,

[0057] X = {X_i ∈ R^(N×M) | i = 1, 2, ..., n}

[0058] where R denotes the complex number field, N×M is the matrix dimension, and X_i is the i-th sample in X; the human behavior categories include jumping, picking up an object, running, squatting, stepping, arm swinging, throwing and walking;

[0059] (2) Select S column vectors from each X_i, obtaining n×S column vectors in total;

[0060] (3) Construct a spectrogram sample data set, includ...
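A minimal sketch of steps (1) to (3), using simulated data in place of real radar echoes. The sample dimensions, the number of selected columns S, and the use of scipy's STFT for one of the three spectrogram types are illustrative assumptions, not the patent's implementation.

import numpy as np
from scipy.signal import stft

rng = np.random.default_rng(0)

BEHAVIORS = ["jumping", "picking up an object", "running", "squatting",
             "stepping", "arm swinging", "throwing", "walking"]
N, M = 256, 301      # pulse periods x step frequencies per period (assumed values)
n_samples = 16       # number of samplings n (kept small for the sketch)
S = 4                # column vectors selected per sample (assumed)

# (1) Data set X: n complex N x M matrices, each with a behavior category label.
X = rng.standard_normal((n_samples, N, M)) + 1j * rng.standard_normal((n_samples, N, M))
y = rng.integers(0, len(BEHAVIORS), size=n_samples)

# (2) Select S column vectors (slow-time series) from each sample.
cols = rng.choice(M, size=S, replace=False)
column_vectors = X[:, :, cols]               # shape (n_samples, N, S)

# (3) Build spectrogram samples; only the STFT branch is shown here. The
# Hanning-kernel reduced-interference and smoothed pseudo-Wigner spectrograms
# of the patent would be computed analogously and added to the data set.
def stft_spectrogram(sig, nperseg=64):
    _, _, Z = stft(sig, nperseg=nperseg, return_onesided=False)
    return np.abs(Z)

spectrograms = np.stack([
    np.stack([stft_spectrogram(column_vectors[i, :, j]) for j in range(S)])
    for i in range(n_samples)
])
print(spectrograms.shape)                    # (n_samples, S, freq_bins, time_frames)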

Embodiment 2

[0076] Embodiment 2: see Figures 1 to 7. To better illustrate the effect of the present invention, the following specific settings are used:

[0077] (1) Before sampling different human behaviors with the stepped-frequency continuous wave radar, the following experimental settings are determined:

[0078] Radar parameters: the radar frequency range is 1.6 GHz to 2.2 GHz, the power is 20 dBm, and the pulse period is 30 ms. Within one pulse period the initial frequency is f_0, the number of steps is 300, and the step frequency increment is 2 MHz; that is, the pulse period is Δt = 30 ms, the number of step frequencies within one pulse period is 301, and the frequency changes in turn from f_0, f_0 + 2 MHz, ..., to f_0 + 300×2 MHz, so that within the 30 ms of one pulse period the radar completes one frequency sweep from 1.6 GHz to 2.2 GHz. In addition, in this embodiment a stepped-frequency continuous wave radar with two transmit and four receive channels is adopted.
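As a quick arithmetic check of these parameters (purely illustrative), the frequency grid spanned within one pulse period can be written out as below.

import numpy as np

f0 = 1.6e9            # initial frequency f_0 in Hz
step = 2e6            # step frequency increment in Hz
n_freqs = 301         # step frequencies per pulse period
pulse_period = 30e-3  # pulse period in seconds

freqs = f0 + step * np.arange(n_freqs)   # f_0, f_0 + 2 MHz, ..., f_0 + 300 * 2 MHz
assert freqs[0] == 1.6e9 and freqs[-1] == 2.2e9
print(f"{n_freqs} frequencies covering {freqs[0] / 1e9:.1f}-{freqs[-1] / 1e9:.1f} GHz, "
      f"swept once every {pulse_period * 1e3:.0f} ms")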

[00...

Embodiment 3

[0090] Embodiment 3: In this embodiment, the time-frequency analysis of step (31) using the short-time Fourier transform, the Hanning-kernel reduced cross-term interference distribution, and the smoothed pseudo-Wigner distribution is specifically carried out as follows:

[0091] STFT_s(k, f) = ∫ s(u) w(u − k) e^(−j2πfu) du

[0092] In the formula, STFT_s(k, f) denotes the result of the short-time Fourier transform of the original signal; s(·) denotes the original signal; k denotes the time variable; e is the natural constant; f is the frequency; w(·) denotes the window function; and u denotes the convolution variable.
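A direct discrete rendering of this formula, as a sketch only; the window length, hop size and frequency grid are assumptions, not values specified by the patent.

import numpy as np

def stft_direct(s, win_len=64, hop=16, n_freqs=64):
    """Evaluate STFT_s(k, f) = sum_u s(u) w(u - k) exp(-j 2 pi f u) on a grid."""
    s = np.asarray(s, dtype=complex)
    w = np.hanning(win_len)                    # window function w(.)
    n = len(s)
    freqs = np.arange(n_freqs) / n_freqs       # normalized frequencies f
    ks = np.arange(0, n - win_len + 1, hop)    # time shifts k
    out = np.zeros((len(ks), n_freqs), dtype=complex)
    for row, k in enumerate(ks):
        u = np.arange(k, k + win_len)          # convolution (integration) variable u
        windowed = s[u] * w                    # s(u) * w(u - k)
        out[row] = windowed @ np.exp(-2j * np.pi * np.outer(u, freqs))
    return out

# Example: a complex exponential concentrates its energy near its normalized frequency.
spec = np.abs(stft_direct(np.exp(2j * np.pi * 0.1 * np.arange(512))))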

[0093]

[0094] where R_s is given by:

[0095]

[0096] RIDH_s(k, f) denotes the result of applying the Hanning-kernel reduced cross-term interference distribution transform of the present invention to the original signal;

[0097] s(·) denotes the original signal;

[0098] R_s(k, τ) denotes the result of applying a time window to the original signal;

[0099] h(τ) denotes the frequency window function;

[0100] f m...
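Because the expressions of paragraphs [0093] and [0095] are not reproduced above, the following is only a textbook-style sketch of a lag-windowed instantaneous autocorrelation and the bilinear time-frequency distribution obtained from it (a standard pseudo-Wigner-type construction). The function name, the choice h(τ) = Hanning, and all other details are assumptions, not the patent's exact formulas.

import numpy as np

def pseudo_wigner(s, max_lag=32):
    """Sketch: R_s(k, tau) = h(tau) * s(k + tau) * conj(s(k - tau)), then FFT over tau."""
    s = np.asarray(s, dtype=complex)
    n = len(s)
    lags = range(-max_lag, max_lag + 1)
    h = np.hanning(2 * max_lag + 1)                 # lag window h(tau)
    r = np.zeros((n, 2 * max_lag + 1), dtype=complex)
    for k in range(n):
        for i, tau in enumerate(lags):
            if 0 <= k + tau < n and 0 <= k - tau < n:
                r[k, i] = h[i] * s[k + tau] * np.conj(s[k - tau])
    # FFT over the lag axis gives a time-frequency distribution for each time k.
    return np.fft.fftshift(np.fft.fft(r, axis=1), axes=1)

spec = np.abs(pseudo_wigner(np.exp(2j * np.pi * 0.05 * np.arange(256))))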



Abstract

The invention discloses a human body behavior recognition method based on multi-class spectrograms and a composite convolutional neural network. The method comprises the steps of: sampling different human body behaviors with a stepped-frequency continuous wave radar, obtaining a matrix from each sampling; selecting a plurality of column vectors from each matrix and constructing a spectrogram sample data set using three different time-frequency analysis methods; and constructing and training a composite convolutional neural network to obtain a classification model. Because multiple types of spectrograms with different feature expressions are used as input data, the expression of human behavior features is diversified and comprehensive; when the composite convolutional neural network performs feature extraction, it can fully extract the independent spatial features within a single spectrogram and the correlation features among different spectrograms, so that feature extraction is sufficient and effective; the two kinds of features are fused and then fed to the classifier for classification, so that the classifier can make full use of the different types of features and the discrimination accuracy is improved.
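A minimal PyTorch sketch of this two-branch idea. The layer sizes, the assumption of three spectrogram types and eight behavior classes, and the fusion by simple concatenation are illustrative choices; the patent's actual network structure may differ.

import torch
import torch.nn as nn

class CompositeCNN(nn.Module):
    def __init__(self, n_spectrograms=3, n_classes=8):
        super().__init__()
        # Branch A: 2-D CNN applied to each spectrogram separately
        # (independent spatial features within a single spectrogram).
        self.single = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Branch B: 2-D CNN over the stacked spectrograms as channels
        # (correlation features between different spectrograms).
        self.joint = nn.Sequential(
            nn.Conv2d(n_spectrograms, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Fusion + classifier: features from both branches are concatenated.
        self.classifier = nn.Linear(32 * n_spectrograms + 64, n_classes)

    def forward(self, x):                        # x: (batch, n_spectrograms, H, W)
        independent = torch.cat(
            [self.single(x[:, i:i + 1]) for i in range(x.shape[1])], dim=1)
        correlated = self.joint(x)
        return self.classifier(torch.cat([independent, correlated], dim=1))

logits = CompositeCNN()(torch.randn(4, 3, 128, 128))   # -> shape (4, 8)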

Description

Technical field

[0001] The invention relates to a human behavior recognition method, in particular to a human behavior recognition method based on multi-class spectrograms and a composite convolutional neural network.

Background technique

[0002] At present, human behavior recognition can obtain data through video surveillance, infrared cameras, wearable devices and other means. Radar overcomes the shortcomings of many of these methods, having strong penetrating ability and good privacy protection, so it is widely used. Usually, time-frequency analysis is performed on the collected radar echo signals to obtain a time-frequency spectrogram with micro-Doppler characteristics, and identification and classification are then carried out by extracting human behavior features from the spectrogram.

[0003] Among them, commonly used time-frequency analysis methods include the Hilbert-Huang transform, short-time Fourier transform, wavelet transform, Wigner distribution, smooth pseudo-Wigner dis...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/08; G06N3/045; G06F18/253; G06F18/24
Inventors: 张博宙, 唐珑畛, 钱宇佳, 贾勇