Three-branch network behavior identification method based on multipath space-time feature enhanced fusion

A technology of spatio-temporal features and recognition methods, applied to neural learning methods, character and pattern recognition, biological neural network models, etc. It addresses problems such as complex backgrounds and clothing variation, with the effects of increasing the utilization of effective information, improving the fusion effect, and maximizing feature interaction.

Active Publication Date: 2020-09-25
JIANGNAN UNIV

AI Technical Summary

Problems solved by technology

Behavior recognition can be applied to scenes such as human-computer interaction, medical monitoring, and intelligent video surveillance. However, due to the influence of factors such as lighting conditions, object occlusion, complex backgrounds, and clothing variation, behavior recognition still has many problems to be solved.


Embodiment Construction

[0035] In order to better illustrate the present invention, the public behavior dataset UCF101 is taken as an example below. In this example, k = 3 is used to segment the entire video, that is, three temporal segment networks are used, and M = 3 layer features are selected in each network; k and M can be adjusted according to the actual situation in a specific implementation.
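For concreteness, the following is a minimal sketch of TSN-style segment sampling with k = 3, as described above; the function name, the random-offset-versus-centre rule, and the 90-frame example clip are illustrative assumptions rather than details taken from the patent.

```python
import numpy as np

def sample_segment_frames(num_frames, k=3, train=True, rng=None):
    """Split a clip of num_frames frames into k equal temporal segments and
    return one frame index per segment: a random offset inside each segment
    at training time, the segment centre at test time."""
    rng = rng or np.random.default_rng()
    seg_len = num_frames // k
    picks = []
    for s in range(k):
        offset = rng.integers(seg_len) if train else seg_len // 2
        picks.append(s * seg_len + offset)
    return picks

# A hypothetical 90-frame UCF101 clip with k = 3 yields one frame per segment.
print(sample_segment_frames(90, k=3, train=False))   # -> [15, 45, 75]
```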

[0036] Figure 2 is the overall model diagram after dividing the entire video into three segments in chronological order;

[0037] Figure 3 is the overall model diagram of the present invention for a single time segment;

[0038] Figure 3 is the algorithm model diagram of the present invention for a single time segment; combined with Figure 2, it represents the complete algorithm flow of the present invention. The algorithm uses RGB images and the corresponding consecutive optical-flow images as input, where the RGB frames obtained from the video are fed into the spatial-stream network, and...
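As a rough sketch of this two-stream input arrangement: the excerpt above does not name the backbone or the optical-flow stack length, so the ResNet-50 backbones from torchvision and the stack of 10 flow fields (20 channels) below are illustrative assumptions following common two-stream practice, not the patent's exact configuration.

```python
import torch
import torchvision

# Spatial stream: one RGB frame (3 channels) per segment.
# Temporal stream: a stack of L = 10 consecutive optical-flow fields,
# i.e. 2L = 20 channels (horizontal + vertical flow components).
spatial_net = torchvision.models.resnet50(weights=None)
temporal_net = torchvision.models.resnet50(weights=None)
# Widen the first convolution of the temporal stream to accept 20 channels.
temporal_net.conv1 = torch.nn.Conv2d(20, 64, kernel_size=7, stride=2,
                                     padding=3, bias=False)

rgb = torch.randn(1, 3, 224, 224)    # sampled RGB frame
flow = torch.randn(1, 20, 224, 224)  # stacked (u, v) flow images

spatial_out = spatial_net(rgb)       # per-segment spatial-stream output
temporal_out = temporal_net(flow)    # per-segment temporal-stream output
```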



Abstract

The invention discloses a three-branch network behavior identification method based on multipath space-time feature enhanced fusion. The method adopts a network framework based on the spatio-temporal two-stream network, called the multipath space-time feature enhanced fusion network. It addresses two problems: two-stream information is not fully utilized because the two-stream network only fuses top-layer spatio-temporal features, and feature fusion interaction is insufficient because the fusion stage is placed after the global pooling layer. The method uses a compact bilinear algorithm to reduce the dimension of the corresponding multi-layer spatio-temporal features from the two-stream network and then fuses them, which increases the interaction between the fused features and enhances the fusion effect while reducing the memory they require. In addition, a multi-scale channel-spatial attention module is introduced in the fusion stream to enhance effective features in the fused features and suppress invalid ones. Finally, long-term temporal information in the video is captured by following the idea of the temporal segment network (TSN), further improving the robustness of the behavior recognition model.
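To illustrate the compact bilinear fusion step mentioned above, the sketch below implements the generic count-sketch (Tensor Sketch) formulation of compact bilinear pooling on a pair of pooled feature vectors. The class name, the 1024-dimensional output, and the signed square-root normalisation are assumptions for illustration; the patent applies the fusion to multi-layer feature maps before global pooling, which is not reproduced here.

```python
import torch

class CompactBilinearFusion(torch.nn.Module):
    """Count-sketch (Tensor Sketch) approximation of bilinear pooling:
    each input vector is projected to a d-dimensional count sketch with
    fixed random hashes and signs, and the outer product is approximated
    by element-wise multiplication in the Fourier domain."""

    def __init__(self, c1, c2, d=1024):
        super().__init__()
        self.d = d
        self.register_buffer("h1", torch.randint(d, (c1,)))
        self.register_buffer("h2", torch.randint(d, (c2,)))
        self.register_buffer("s1", torch.randint(0, 2, (c1,)).float() * 2 - 1)
        self.register_buffer("s2", torch.randint(0, 2, (c2,)).float() * 2 - 1)

    def _count_sketch(self, x, h, s):
        # x: (N, C) -> (N, d); channel i is added to bucket h[i] with sign s[i].
        sketch = x.new_zeros(x.size(0), self.d)
        sketch.index_add_(1, h, x * s)
        return sketch

    def forward(self, spatial_feat, temporal_feat):
        # spatial_feat: (N, c1), temporal_feat: (N, c2)
        f1 = torch.fft.rfft(self._count_sketch(spatial_feat, self.h1, self.s1), dim=1)
        f2 = torch.fft.rfft(self._count_sketch(temporal_feat, self.h2, self.s2), dim=1)
        fused = torch.fft.irfft(f1 * f2, n=self.d, dim=1)
        # Signed square-root and L2 normalisation, as commonly used after CBP.
        fused = torch.sign(fused) * torch.sqrt(fused.abs() + 1e-8)
        return torch.nn.functional.normalize(fused, dim=1)

# Example: fusing 512-d spatial and 512-d temporal features into a 1024-d vector.
cbp = CompactBilinearFusion(512, 512, d=1024)
out = cbp(torch.randn(4, 512), torch.randn(4, 512))   # shape (4, 1024)
```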

Description

Technical Field

[0001] The invention belongs to the field of machine vision, and in particular relates to a three-branch network behavior recognition method based on multipath space-time feature enhanced fusion.

Background Technique

[0002] With the development of society, more and more knowledge from the field of machine vision is applied to real life, and behavior recognition is an important research direction in this field. Behavior recognition can be applied to scenes such as human-computer interaction, medical monitoring, and intelligent video surveillance. However, due to the influence of factors such as lighting conditions, object occlusion, complex backgrounds, and clothing variation, behavior recognition still has many problems to be solved. Currently existing behavior recognition methods are mainly (1) based on RGB video; (2) based on skeletal nodes; (3) based on RGB+D video. Since there are many ways to acquire RGB video data and the acquisition...

Claims


Application Information

IPC(8): G06K9/00, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06V40/20, G06N3/045, G06F18/256, G06F18/253
Inventor: 孔军, 邓浩阳, 蒋敏
Owner: JIANGNAN UNIV