A two-stream network action recognition method based on spatio-temporal saliency action attention

A technology relating to attention and action recognition, applied to character and pattern recognition, biological neural network models, instruments, and the like. It can solve problems such as the difficulty of ensuring the activity and availability of effective information, improve action recognition efficiency, reduce storage pressure, and achieve a highly efficient implementation.

Active Publication Date: 2020-12-15
JIANGNAN UNIV

AI Technical Summary

Problems solved by technology

However, the traditional two-stream network still faces the following problems: (1) how to make full use of the temporal semantic information of consecutive frames while effectively controlling model complexity; (2) the network extracts features directly from each frame, making it difficult to guarantee the activity and availability of effective information in the network.

Method used



Examples


Embodiment Construction

[0031] Figure 2 shows the algorithm model diagram of the present invention. The algorithm takes RGB key frames and optical-flow key frames as input, and the model comprises six key parts: a salient action detection network, an attention network, a spatial network, a temporal network, classification, and fusion. The spatial network uses a bidirectional LSTM architecture, while the temporal network uses a C3D architecture. Finally, the two networks are combined by weighted-average fusion, with a default fusion weight of 0.5 for each stream.
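The weighted-average late fusion described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function and variable names are our own, and we assume each stream outputs raw class logits:

```python
import numpy as np

def softmax(logits):
    """Convert raw class logits to a probability distribution."""
    e = np.exp(logits - logits.max())
    return e / e.sum()

def fuse_two_stream(spatial_logits, temporal_logits,
                    w_spatial=0.5, w_temporal=0.5):
    """Weighted-average late fusion of the spatial and temporal streams.

    The default 0.5/0.5 weights match the equal weighting stated in the
    text; both weights are illustrative parameters.
    """
    p_spatial = softmax(spatial_logits)
    p_temporal = softmax(temporal_logits)
    return w_spatial * p_spatial + w_temporal * p_temporal

# Toy usage: three action classes, each stream votes differently.
spatial_logits = np.array([2.0, 0.5, 0.1])
temporal_logits = np.array([1.0, 1.5, 0.2])
fused = fuse_two_stream(spatial_logits, temporal_logits)
predicted_class = int(np.argmax(fused))
```

Because the fused scores remain a probability distribution, the final prediction is simply the argmax of the fused vector.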

[0032] To better illustrate the present invention, the following description takes the public action recognition data set UCF101 as an example.

[0033] The data processing method of the key-frame mechanism in Step 3 of the above technical solution is as follows:

[0034] Traditional action recognition methods usually sample frames at random or by segment. The present invention introduces a video summarization method to extract ...
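As a rough sketch of this key-frame idea (retaining frames so that the difference between selected frames is maximized, as the abstract describes), one greedy variant might look like the following. The function is illustrative only and is not the patent's exact video summarization algorithm:

```python
import numpy as np

def select_keyframes(frames, num_keyframes):
    """Greedily pick key frames that maximize inter-frame difference.

    frames: array of shape (T, ...) holding T video frames (or frame
    features). Starting from frame 0, each step keeps the unselected
    frame farthest (L2 distance) from the last selected key frame,
    so near-duplicate frames are skipped.
    """
    T = frames.shape[0]
    flat = frames.reshape(T, -1).astype(float)
    selected = [0]  # always keep the first frame
    while len(selected) < num_keyframes:
        last = flat[selected[-1]]
        dists = np.linalg.norm(flat - last, axis=1)
        dists[selected] = -1.0  # never re-pick an already selected frame
        selected.append(int(np.argmax(dists)))
    return sorted(selected)

# Toy usage: four 2-D "frames"; frame 2 differs sharply from the rest.
frames = np.array([[0.0, 0.0],
                   [0.1, 0.0],
                   [5.0, 5.0],
                   [0.2, 0.0]])
keyframes = select_keyframes(frames, 2)
```

In the toy example, the near-duplicate frames 1 and 3 are dropped and the visually distinct frame 2 is retained alongside frame 0.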



Abstract

A two-stream network action recognition method based on spatio-temporal saliency action attention belongs to the field of machine vision. The method employs a spatio-temporal two-stream network architecture called the spatio-temporal saliency action attention network, ST-SAMANet. To address the large memory consumption and excessive redundant information caused by feeding raw RGB frames and optical-flow frames directly into a traditional two-stream network, the present invention introduces a key-frame mechanism that maximizes the difference between retained frames and significantly reduces the memory consumption of the temporal network. In addition, frames contain substantial feature redundancy and background disturbance, which greatly degrade network performance. The present invention therefore introduces Mask R-CNN into the network to attend closely to the human bodies and objects in each action category and to extract features from the salient regions of each frame. Finally, a bidirectional LSTM and a C3D network encode space and time respectively to obtain complete spatio-temporal information, which improves the robustness of the action recognition model.
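The salient-region step can be pictured as masking out background pixels (or features) before the two streams see them. Below is a minimal sketch under the assumption that a binary foreground mask has already been produced by a detector such as Mask R-CNN; the detector itself is outside this snippet, and the function name is our own:

```python
import numpy as np

def apply_saliency_mask(frame, mask):
    """Zero out non-salient regions of a frame or feature map.

    frame: (H, W, C) array; mask: (H, W) binary array where 1 marks
    salient pixels (e.g. person/object regions from a Mask R-CNN
    detection). Broadcasting applies the mask across all channels.
    """
    return frame * mask[..., None]

# Toy usage: a 2x2 RGB frame whose main diagonal is marked salient.
frame = np.ones((2, 2, 3))
mask = np.array([[1, 0],
                 [0, 1]])
masked = apply_saliency_mask(frame, mask)
```

Only the masked (salient) locations keep their values; everything else is zeroed, so subsequent feature extraction focuses on the detected actors and objects.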

Description

Technical field

[0001] The invention belongs to the field of machine vision, and in particular relates to a two-stream network action recognition method based on spatio-temporal salient action attention.

Background technique

[0002] With the extensive research of machine vision in theory and practice, action recognition has gradually become an important branch of it. Owing to the diversity of objective environments and the subjective complexity of human behavior, many problems in human action recognition remain to be solved. At present, action recognition methods are mainly divided into two types: those based on static images and those based on videos. For a long time before video research became popular, most action recognition research was based on images, but the information an image provides is very limited, and it is not easy to capture effective information for recognition. In comparison, for action recognition on videos, researchers can extract sufficient 'acti...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K 9/00; G06K 9/62; G06N 3/04
CPC: G06V 20/40; G06N 3/045; G06F 18/23213
Inventor: 蒋敏 (Jiang Min), 潘娜 (Pan Na), 孔军 (Kong Jun)
Owner: JIANGNAN UNIV