Human behavior recognition method based on attention mechanism and 3D convolutional neural network

A convolutional neural network and recognition method technology, applied in the field of human behavior recognition based on an attention mechanism and a 3D convolutional neural network. It addresses the problems that manually selected features often fail to capture the essential characteristics of actions, which strongly affects recognition results; the method improves network recognition accuracy and prevents overfitting.

Active Publication Date: 2018-11-16
NORTH CHINA ELECTRIC POWER UNIV (BAODING) +1
4 Cites · 81 Cited by

AI Technical Summary

Problems solved by technology

However, this approach requires manually engineered motion features, and hand-selected features often fail to express the essential characteristics of an action, which significantly degrades recognition results.

Method used



Examples


Embodiment 2

[0098] The specific scheme of embodiment 2 is as follows:

[0099] The frame difference channel in this embodiment is computed with the three-frame difference method; the calculation process is shown in Figure 7. By taking three adjacent frames as a group and differencing them pairwise, the regions that change both before and after the middle frame are detected more reliably. The frame difference describes how the human body moves between frames, and the frame difference matrix marks the regions of the whole frame cube that deserve attention.

[0100] 1. Attention matrix of this embodiment (three-frame difference method):

[0101] 1) Select three consecutive frames I_(t-1)(x, y), I_t(x, y), I_(t+1)(x, y) from the video frame sequence, and compute the differences D_(t-1,t)(x, y) and D_(t,t+1)(x, y) between each pair of adjacent frames:

[0102] D_(t-1,t)(x, y) = |I_t(x, y) - I_(t-1)(x, y)|,  D_(t,t+1)(x, y) = |I_(t+1)(x, y) - I_t(x, y)|

[0103] 2) For the obtained difference image, selec...
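The steps above can be sketched in Python with NumPy. Since the text is truncated after step 2), the binarization threshold and the intersection of the two difference masks are assumptions based on the standard three-frame difference procedure, not taken verbatim from the patent:

```python
import numpy as np

def three_frame_difference(prev_frame, curr_frame, next_frame, threshold=25):
    """Compute a binary attention matrix for the middle frame using
    the three-frame difference method.

    Frames are 2-D grayscale arrays of identical shape; `threshold`
    is an assumed binarization level (the patent text does not fix one).
    """
    # Absolute differences between the two pairs of adjacent frames
    d_prev = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    d_next = np.abs(next_frame.astype(np.int16) - curr_frame.astype(np.int16))

    # Binarize each difference image, then intersect them so that only
    # regions that change both before and after the middle frame survive
    attention = ((d_prev > threshold) & (d_next > threshold)).astype(np.uint8)
    return attention
```

The cast to `int16` before subtracting avoids the wrap-around that unsigned 8-bit subtraction would cause; the resulting binary matrix plays the role of the attention channel described above.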



Abstract

The invention discloses a human behavior recognition method based on an attention mechanism and a 3D convolutional neural network. A 3D convolutional neural network is constructed whose input layer comprises two channels: the original grayscale image and an attention matrix. A 3D CNN model for recognizing human behavior in video is built, an attention mechanism is introduced, and the distance between two frames is computed to form the attention matrix. The attention matrix and the original human behavior video sequence form the two channels fed into the constructed 3D CNN, where convolution extracts vital features from the visual focus area. The 3D CNN structure is also optimized: a Dropout layer is added to randomly freeze some of the network's connection weights, and the ReLU activation function is employed to increase network sparsity. These measures mitigate the jump in computational load and the vanishing gradients caused by increased dimensionality and depth, prevent overfitting on small data sets, improve recognition accuracy, and reduce time cost.
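A minimal PyTorch sketch of the dual-channel architecture described in the abstract: the input has two channels (grayscale frames plus the attention matrix), with ReLU activations and a Dropout layer as stated. All layer widths, kernel sizes, the class count, and the input resolution are illustrative assumptions, not values taken from the patent:

```python
import torch
import torch.nn as nn

class TwoChannel3DCNN(nn.Module):
    """Sketch of a 3D CNN whose input layer takes two channels:
    the original grayscale frame cube and the attention-matrix cube."""

    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(2, 16, kernel_size=3, padding=1),  # 2 input channels
            nn.ReLU(),                                   # sparse activations
            nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Dropout(0.5),  # randomly drop connections to curb overfitting
        )
        # For an assumed input of 16 frames at 32x32: 32 * 4 * 8 * 8 features
        self.classifier = nn.Linear(32 * 4 * 8 * 8, num_classes)

    def forward(self, x):
        # x: (batch, 2, frames, height, width)
        x = self.features(x)
        return self.classifier(x.flatten(1))
```

With an input of shape `(batch, 2, 16, 32, 32)`, the two pooling stages halve each spatio-temporal dimension twice, giving the flattened feature size used by the classifier.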

Description

Technical field

[0001] The invention relates to a human behavior recognition method, in particular to a human behavior recognition method based on an attention mechanism and a 3D convolutional neural network.

Background technique

[0002] Intelligent video analysis has long been a research field of important academic value, and human behavior recognition, an indispensable part of this field, has become a new research hotspot, with broad application prospects in intelligent video surveillance, advanced human-computer interaction, sports analysis, and content-based video retrieval. Most current mainstream human action recognition methods use hand-designed features to represent human motion in videos, such as contours, silhouettes, HOG, Harris, and SIFT, along with three-dimensional extensions of these features. Hand-designed features are a good way to exploit human intelligence and prior knowledge and apply this knowledg...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06N3/04, G06N3/08
CPC: G06N3/049, G06N3/08, G06V40/23, G06V20/41
Inventor: 袁和金, 牛为华, 张颖, 崔克彬
Owner NORTH CHINA ELECTRIC POWER UNIV (BAODING)