Human Complex Behavior Recognition Method Based on Multi-Feature Fusion CNN-BLSTM

A multi-feature fusion recognition method, applied to neural learning methods, character and pattern recognition, instruments, etc. It addresses the problems of gradient dissipation, gradient explosion, and the inability to fully extract features from time-series data, improving recognition accuracy and overcoming long-term dependency problems.

Active Publication Date: 2022-04-05
ZHEJIANG UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0005] To overcome the problems of manual feature extraction in traditional machine learning methods and incomplete feature extraction by a single deep learning model, as well as the gradient dissipation and gradient explosion that occur during backpropagation in recurrent neural networks, the present invention proposes a human complex behavior recognition method based on a one-dimensional convolutional neural network and a bidirectional long short-term memory neural network (CNN-BLSTM). The method combines hand-crafted features with deep learning model features; it overcomes the inability of traditional machine learning and single deep learning models to fully extract features from time-series data, overcomes the long-term dependency problem of recurrent neural networks, and improves the accuracy of human complex behavior recognition.



Examples


Embodiment Construction

[0029] The present invention is further described below in conjunction with the drawings and embodiments.

[0030] Referring to Figures 1 to 4, a complex human behavior recognition method based on multi-feature fusion CNN-BLSTM comprises the following steps:

[0031] Step 1, segmenting continuous sensor data;

[0032] The continuous sensor data is segmented with a sliding window of size 200 and 50% data overlap. A window size of 200 means that, at a sensor sampling frequency of 50 Hz, 4 seconds of data are collected as one input sample. 50% overlap means the segmentation repeats every 100 samples: each window contains the current 200 samples, and its last 100 samples become the first 100 samples of the next input.
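The segmentation in step 1 can be sketched as follows. This is a minimal illustration, not the patented implementation: the stream here is random 3-axis data standing in for real accelerometer readings.

```python
import numpy as np

def segment(data, window_size=200, overlap=0.5):
    """Slice a continuous sensor stream into overlapping windows.

    window_size=200 corresponds to 4 s at a 50 Hz sampling rate;
    overlap=0.5 means the last 100 samples of one window are reused
    as the first 100 samples of the next, as described in step 1.
    """
    step = int(window_size * (1 - overlap))  # 100 samples between window starts
    windows = [data[start:start + window_size]
               for start in range(0, len(data) - window_size + 1, step)]
    return np.stack(windows)

# 10 s of fake 3-axis accelerometer data at 50 Hz -> 500 samples
stream = np.random.randn(500, 3)
w = segment(stream)
print(w.shape)  # (4, 200, 3): windows start at samples 0, 100, 200, 300
```

Note that each window shares its second half with the first half of the next window, which is exactly the 50% repeated segmentation the paragraph describes.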

[0033] Step 2, extracting and selecting features of the segmented sensor data;

[0034] Features are extracted from the segmented sensor data and standardized. A feature selection algorithm then screens these hand-crafted features and retains the dominant ones.
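A toy sketch of step 2 follows. The concrete feature set (per-axis mean, standard deviation, max, min, mean absolute value) and the variance-based screening are illustrative assumptions; the method only states that hand-crafted features are extracted, standardized, and screened so that dominant features are retained.

```python
import numpy as np

rng = np.random.default_rng(1)

def handcrafted_features(window):
    """Time-domain statistics per axis for one (200, 3) window.

    Assumed feature set: mean, std, max, min, mean absolute value
    per axis -> 5 statistics x 3 axes = 15 features per window.
    """
    return np.concatenate([
        window.mean(axis=0), window.std(axis=0),
        window.max(axis=0), window.min(axis=0),
        np.abs(window).mean(axis=0),
    ])

windows = rng.standard_normal((4, 200, 3))                # segmented data from step 1
X = np.stack([handcrafted_features(w) for w in windows])  # (4, 15)

# toy "dominant feature" screening: keep the k highest-variance columns
k = 8
keep = np.sort(np.argsort(X.var(axis=0))[-k:])
X_sel = X[:, keep]

# standardize the retained features (zero mean, unit variance per column)
X_sel = (X_sel - X_sel.mean(axis=0)) / (X_sel.std(axis=0) + 1e-8)
print(X_sel.shape)  # (4, 8)
```

In practice the selection criterion would be replaced by whatever feature selection algorithm the method actually uses; variance ranking is only a stand-in here.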



Abstract

A complex human behavior recognition method based on multi-feature fusion CNN-BLSTM includes the following steps: segment continuous sensor data with a sliding window; extract features from the segmented data and use a feature selection algorithm to screen these hand-crafted features, retaining the dominant ones; input the segmented behavior data into a deep learning model for training, first applying one-dimensional convolution and pooling through the convolutional neural network, then passing the result through a bidirectional long short-term memory (BLSTM) neural network whose state outputs are summarized by an average pooling layer; finally, fuse the pooled feature vector with the previously extracted dominant feature vector as the input to the fully connected layer, which produces the complex behavior recognition output. The invention can fully mine the characteristics of sensor data and improve the recognition accuracy of complex human behaviors.
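The fusion step described in the abstract can be sketched in numpy. All shapes and weights here are assumptions for illustration (randomly initialized, no training): the BLSTM is represented only by a random state matrix, average pooling summarizes it over time, and a softmax over random weights stands in for the trained fully connected layer.

```python
import numpy as np

rng = np.random.default_rng(0)

T, H = 50, 64                                 # assumed: time steps after conv/pooling, BLSTM hidden size
blstm_out = rng.standard_normal((T, 2 * H))   # stand-in for concatenated forward+backward states
handcrafted = rng.standard_normal(12)         # screened dominant features from step 2 (assumed size)

# average pooling over time extracts a salient state vector from the BLSTM outputs
pooled = blstm_out.mean(axis=0)               # (128,)

# fuse the deep feature vector with the hand-crafted dominant feature vector
fused = np.concatenate([pooled, handcrafted]) # (140,)

# fully connected layer + softmax over behavior classes (untrained random weights)
n_classes = 6
W = rng.standard_normal((n_classes, fused.size))
b = np.zeros(n_classes)
logits = W @ fused + b
probs = np.exp(logits - logits.max())
probs /= probs.sum()
print(fused.shape)  # (140,)
```

The key design point the abstract makes is visible in the `concatenate` call: the classifier sees both the learned temporal features and the screened hand-crafted statistics, rather than either alone.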

Description

technical field

[0001] The present invention relates to the fields of data analysis, feature extraction, deep learning, recurrent neural networks, and behavior recognition, and in particular to a method for human complex behavior recognition.

Background technique

[0002] Human complex behavior recognition has become a prominent research field. Its main purpose is continuous, real-time observation of the behavior of target subjects. In recent years, the rapid development of the Internet of Things industry and the rapid popularization of smartphones, bracelets, and watches have been accompanied by the rapid development and application of various sensors. Compared with video-based human behavior detection, sensor-based human behavior recognition is low in cost, convenient, and highly portable. Rich sensing devices have already been integrated into smart terminals, such as acceleration sensors,...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06V40/20, G06V10/764, G06V10/82, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/084, G06V40/20, G06N3/044, G06N3/045, G06F18/241
Inventors: 宦若虹, 葛罗棋, 吴炜
Owner: ZHEJIANG UNIV OF TECH