
Double-flow network behavior recognition method based on space-time significant behavior attention

A technology of spatio-temporal attention and behavior recognition, applied to character and pattern recognition, biological neural network models, instruments, etc.; it addresses problems such as the difficulty of ensuring the activity and availability of effective information in the network, and achieves the effects of improved behavior recognition efficiency, reduced storage pressure, and high execution efficiency.

Active Publication Date: 2019-12-13
JIANGNAN UNIV

AI Technical Summary

Problems solved by technology

However, the traditional two-stream network still faces the following problems: (1) how to make full use of the temporal semantic information of consecutive frames while effectively controlling model complexity; (2) the network extracts features directly from each frame, making it difficult to guarantee the activity and availability of effective information in the network.




Embodiment Construction

[0031] Figure 2 shows the algorithm model diagram of the present invention. The algorithm takes RGB key frames and optical flow key frames as input, and the model comprises six key parts: a salient behavior detection network, an attention network, a spatial network, a temporal network, classification, and fusion. The spatial network uses a bidirectional LSTM architecture, while the temporal network uses a C3D architecture. Finally, the two networks are fused by weighted averaging, with default fusion weights of 0.5 for each stream.
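The weighted-average fusion step can be sketched as a short illustration. This is a minimal sketch assuming each stream outputs a per-class score vector; the equal default weights of 0.5 follow the description above, while the function name, score values, and three-class setup are hypothetical:

```python
import numpy as np

def fuse_two_streams(spatial_scores, temporal_scores, w_spatial=0.5, w_temporal=0.5):
    """Weighted-average late fusion of the per-class scores of the two streams."""
    spatial_scores = np.asarray(spatial_scores, dtype=np.float64)
    temporal_scores = np.asarray(temporal_scores, dtype=np.float64)
    fused = w_spatial * spatial_scores + w_temporal * temporal_scores
    # Return the fused score vector and the predicted class index.
    return fused, int(np.argmax(fused))

# usage: three hypothetical classes
fused, pred = fuse_two_streams([0.1, 0.7, 0.2], [0.2, 0.5, 0.3])
print(pred)  # 1
```

With equal weights this reduces to a simple mean of the two score vectors; unequal weights let one stream dominate when it is known to be more reliable for a given dataset.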

[0032] To better illustrate the present invention, the public behavior data set UCF101 is taken as an example below.

[0033] The data processing method of the key frame mechanism in step 3 of the above technical solution is as follows:

[0034] Traditional behavior recognition methods usually sample frames randomly or by segments. The present invention introduces a video summarization method to extract ...
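The exact summarization criterion is cut off above, so the following is only a sketch of one common key-frame selection variant consistent with the abstract's goal of obtaining "the maximum difference between frames": greedily pick the frame farthest (in Euclidean distance) from the frames already selected. The function name and the use of raw pixel vectors as frame features are assumptions:

```python
import numpy as np

def select_key_frames(frames, k):
    """Greedily select k frames that maximize inter-frame difference.

    frames: sequence of T frames (each an image or feature array); each frame is
    flattened to a vector before computing Euclidean distances.
    Returns the sorted indices of the selected key frames.
    """
    frames = np.asarray(frames, dtype=np.float64).reshape(len(frames), -1)
    selected = [0]  # seed with the first frame
    while len(selected) < k:
        # For each candidate, distance to its nearest already-selected frame.
        dists = np.min(
            [np.linalg.norm(frames - frames[s], axis=1) for s in selected], axis=0
        )
        dists[selected] = -1.0  # never re-pick a selected frame
        selected.append(int(np.argmax(dists)))
    return sorted(selected)

# usage: two near-duplicate pairs of 2-D "frames"; the two dissimilar ones are kept
print(select_key_frames([[0, 0], [0, 1], [10, 10], [10, 11]], k=2))  # [0, 3]
```

Selecting maximally different frames discards near-duplicate consecutive frames, which is what reduces the memory consumption of the temporal stream.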


Abstract

The invention discloses a double-flow network behavior recognition method based on space-time significant behavior attention, and belongs to the field of machine vision. The method adopts a network architecture based on a space-time double-flow network, called the space-time salient behavior attention network ST-SAMANet. To address the large memory consumption and excessive redundant information caused by directly inputting RGB frames and optical flow frames in a traditional double-flow network, a key frame mechanism is introduced to obtain the maximum difference between frames, remarkably reducing the memory consumption of the temporal network. Besides, frames contain considerable feature redundancy and background disturbance, which greatly affect network performance; Mask R-CNN is therefore introduced into the network so that the human body and objects in each behavior category receive high attention, and feature extraction is carried out on the salient region of each frame. Finally, the bidirectional LSTM and the C3D network are used to encode space and time to obtain complete spatio-temporal information, improving the robustness of the behavior recognition model.
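The salient-region step described above can be illustrated without reimplementing Mask R-CNN itself: given a binary instance mask from a detector, background pixels are suppressed so that feature extraction sees only the person/object region. This is a minimal sketch; the function name and the assumption that the detector's instance masks are merged into one H x W binary mask are hypothetical:

```python
import numpy as np

def apply_salient_mask(frame, mask):
    """Suppress background pixels, keeping only the detected salient region.

    frame: H x W x C image array.
    mask:  H x W binary array (1 = salient person/object, 0 = background),
           e.g. the union of instance masks produced by a Mask R-CNN detector.
    """
    frame = np.asarray(frame, dtype=np.float64)
    mask = np.asarray(mask, dtype=np.float64)
    # Broadcast the 2-D mask over the channel axis.
    return frame * mask[..., None]

# usage: a 2x2 RGB frame in which only the top-left pixel is salient
frame = np.ones((2, 2, 3))
mask = np.array([[1, 0], [0, 0]])
out = apply_salient_mask(frame, mask)
print(out.sum())  # 3.0
```

Masking before feature extraction removes background disturbance at the input level, rather than relying on the network to learn to ignore it.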

Description

Technical field

[0001] The invention belongs to the field of machine vision, and in particular relates to a dual-stream network behavior recognition method based on spatio-temporal salient behavior attention.

Background technique

[0002] With the extensive research of machine vision in theory and practice, behavior recognition has gradually become an important branch of it. Due to the diversity of objective environments and the subjective complexity of human behavior, many problems in human behavior recognition remain to be solved. At present, behavior recognition methods are mainly divided into two types: those based on static pictures and those based on videos. Before video research became popular, most behavior recognition research was based on images, but the information provided by images is very limited, and effective information for recognition is not easy to capture. In comparison, for behavior recognition on videos, researchers can extract sufficient 'acti...

Claims


Application Information

IPC(8): G06K9/00; G06K9/62; G06N3/04
CPC: G06V20/40; G06N3/045; G06F18/23213
Inventor: 蒋敏, 潘娜, 孔军
Owner: JIANGNAN UNIV