Behavior recognition method based on space-time attention enhancement feature fusion network

A feature-fusion and attention technology, applied in character and pattern recognition, biological neural-network models, instruments, etc. It addresses the ineffective use of different branches and the overfitting of features, mitigating feature overfitting and improving classification ability.

Active Publication Date: 2020-09-25
JIANGNAN UNIV
Cites: 6 · Cited by: 11
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

However, the traditional two-stream network trained jointly on the RGB appearance stream and the optical-flow motion stream still faces the following problems: (1) simple fusion of the features or scores obtained f…

Method used




Embodiment Construction

[0031] Figure 2 shows the algorithm model of the present invention. The algorithm takes RGB frames and optical-flow frames as input and performs joint judgment through three branches: the RGB appearance stream, the optical-flow motion stream, and an attention-enhanced multi-layer feature-fusion stream. The feature-fusion stream combines RGB appearance features and optical-flow motion features through the multi-layer feature fusion block (MFBlock) and the attention fusion block (AFBlock). In addition, several attention modules are added to the three branch networks, namely the input channel-guided attention module (ICGA), the high-level channel group attention module (HCGA), and the temporal attention enhancement module (TEA), for network guidance and feature enhancement. Finally, the classification scores obtained from the three streams are fused by weighting.
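The final weighted score fusion across the three branches can be sketched as follows. This is a minimal illustration, not the patent's implementation; the weight values and score vectors are placeholders.

```python
import numpy as np

def fuse_branch_scores(rgb_scores, flow_scores, fusion_scores,
                       weights=(1.0, 1.0, 0.5)):
    """Weighted late fusion of per-class scores from the three branches.

    The branch roles follow the description above; the weight values are
    illustrative placeholders, not those specified by the invention.
    """
    w_rgb, w_flow, w_fuse = weights
    combined = (w_rgb * np.asarray(rgb_scores, dtype=float)
                + w_flow * np.asarray(flow_scores, dtype=float)
                + w_fuse * np.asarray(fusion_scores, dtype=float))
    return int(np.argmax(combined))  # index of the predicted action class
```

In practice the branch weights would be tuned on a validation split; here they simply show that each stream contributes a weighted vote to the final class decision.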

[0032] To better illustrate the present invention, the following description takes the public behavior dat…



Abstract

The invention discloses a behavior recognition method based on a spatio-temporal attention-enhanced feature fusion network, belonging to the field of machine vision. The method adopts a network architecture based on the two-stream (appearance-stream and motion-stream) network, called the spatio-temporal attention-enhanced feature fusion network. Whereas the traditional two-stream network applies only simple feature or score fusion to the different branches, an attention-enhanced multi-layer feature-fusion stream is constructed as a third branch to supplement the two-stream structure. Meanwhile, because the traditional deep network neglects the modeling of channel characteristics and cannot fully exploit the interrelations between channels, channel attention modules at different levels are introduced to establish these interrelations and enhance the expressive power of channel features. In addition, since temporal information plays an important role in segment fusion, temporal modeling of the frame sequence enhances the representativeness of important temporal features. Finally, the classification scores of the different branches are fused by weighting.
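The channel-attention idea in the abstract can be illustrated with a generic squeeze-and-excitation style gate. This is a common sketch of inter-channel modeling, not the patent's exact ICGA/HCGA designs, and the weight matrices `w1`/`w2` stand in for hypothetical learned parameters.

```python
import numpy as np

def channel_attention(feat, w1, w2):
    """Generic squeeze-and-excitation style channel attention (sketch).

    feat: (C, H, W) feature map
    w1:   (C // r, C) reduction weights (hypothetical learned parameters)
    w2:   (C, C // r) expansion weights (hypothetical learned parameters)
    """
    c = feat.shape[0]
    squeeze = feat.reshape(c, -1).mean(axis=1)      # global average pool -> (C,)
    hidden = np.maximum(w1 @ squeeze, 0.0)          # ReLU bottleneck
    gates = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))    # sigmoid gate per channel
    return feat * gates[:, None, None]              # reweight each channel map
```

The squeeze step summarizes each channel into a scalar, and the learned bottleneck produces a per-channel gate in (0, 1) that rescales the feature map, letting the network emphasize informative channels.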

Description

Technical field

[0001] The invention belongs to the field of machine vision, and in particular relates to a behavior recognition method based on a spatio-temporal attention-enhanced feature fusion network.

Background technique

[0002] With the extensive research of machine vision in theory and practice, behavior recognition based on RGB video has gradually become a challenging branch. At present, behavior recognition for RGB video mainly uses a two-stream network architecture, and the development trend is promising. In the two-stream architecture, effective features are obtained by training separate deep convolutional networks on the RGB appearance stream and the optical-flow motion stream. However, the traditional two-stream network trained jointly on these two streams still faces the following problems: (1) simple fusion of the features or scores obtained from different branches in the two-stream network cannot effecti…
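For context, the "simple fusion" that the background criticizes is typically a fixed weighted average of the two branch scores. The sketch below illustrates that conventional baseline, not the invention; `alpha` is an illustrative weight.

```python
import numpy as np

def two_stream_score_fusion(rgb_scores, flow_scores, alpha=0.5):
    """Conventional two-stream late fusion: a fixed convex combination
    of per-class scores from the appearance and motion streams.
    `alpha` is an illustrative weight, not a value from the patent.
    """
    rgb = np.asarray(rgb_scores, dtype=float)
    flow = np.asarray(flow_scores, dtype=float)
    return alpha * rgb + (1.0 - alpha) * flow
```

Because the combination is fixed and applied only at the score level, it cannot adapt to which branch is more reliable for a given video, which motivates the attention-enhanced fusion stream proposed here.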

Claims


Application Information

IPC(8): G06K9/00; G06K9/62; G06N3/04
CPC: G06V40/20; G06N3/045; G06F18/256; G06F18/253
Inventors: 蒋敏, 庄丹枫, 孔军
Owner JIANGNAN UNIV