Behavior recognition method and system based on attention mechanism double-flow network

A behavior recognition method and dual-stream network technology, applied in the field of behavior recognition based on an attention-mechanism dual-stream network. It addresses the problems that video data is under-utilized, which hurts classification accuracy, and that conventional dual-stream networks do not weight image features; the effect is to suppress irrelevant information and improve the accuracy of behavior recognition.

Inactive Publication Date: 2020-07-28
SHANDONG UNIV

AI Technical Summary

Benefits of technology

This technology allows videos captured by cameras or other sensors to be analyzed effectively. By dividing a video into equal-length clips and sampling both RGB frames and stacked optical-flow grayscale images from each clip, it makes full use of the video data rather than relying on a single frame. An attention mechanism in the dual-stream network highlights the foreground region where the action occurs and suppresses irrelevant background information, so that local key features of the frame images are extracted more effectively and actions are recognized more accurately across long, real-world videos.

Problems solved by technology

The technical problem addressed by this patent concerns behavior recognition from video captured by cameras or similar devices. Existing dual-stream networks make low use of the available video data, which affects classification accuracy, and they do not consider the relative weight of different image features. As a result, regions of the scene that are irrelevant to the action, together with background noise, contribute as much to the prediction as the foreground where the action occurs, degrading recognition performance.



Examples


Embodiment 1

[0033] As shown in Figure 1, Embodiment 1 of the present disclosure provides a behavior recognition method based on an attention-mechanism dual-stream network, comprising the following steps:

[0034] Dividing the acquired whole video into multiple video clips of equal length, extracting the RGB image and optical-flow grayscale image of each frame of each video clip, and preprocessing them;
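A minimal sketch of this step is shown below, assuming OpenCV's Farneback dense optical flow and a 224×224 resize as the preprocessing; the listing does not specify the flow algorithm or the exact preprocessing, so those choices are illustrative only.

```python
# Hypothetical sketch: split a video into equal-length clips and extract, for every
# frame, the RGB image and an optical-flow grayscale image (x/y components), with a
# simple resize as preprocessing. Farneback flow and 224x224 are assumptions.
import cv2
import numpy as np

def extract_clips(video_path, num_clips=3, size=(224, 224)):
    cap = cv2.VideoCapture(video_path)
    frames = []
    ok, frame = cap.read()
    while ok:
        frames.append(cv2.resize(frame, size))
        ok, frame = cap.read()
    cap.release()

    # Dense optical flow between consecutive frames, stored as two grayscale maps.
    flows = []
    prev_gray = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY)
    for f in frames[1:]:
        gray = cv2.cvtColor(f, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        # Map the x and y flow components to [0, 255] grayscale images.
        fx = cv2.normalize(flow[..., 0], None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
        fy = cv2.normalize(flow[..., 1], None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
        flows.append(np.stack([fx, fy], axis=-1))
        prev_gray = gray

    # Divide the frames (and their flow images) into num_clips equal-length segments.
    rgb_clips = np.array_split(np.array(frames[1:]), num_clips)
    flow_clips = np.array_split(np.array(flows), num_clips)
    return rgb_clips, flow_clips
```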

[0035] Randomly sampling the preprocessed image to obtain at least one RGB image and at least one optical flow grayscale image of each video clip;
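One possible implementation of this sampling step, continuing the sketch above, is given below; the stack of five consecutive flow images is an assumed value, since the text only requires at least one RGB image and at least one optical-flow grayscale image per clip.

```python
import random
import numpy as np

def sample_clip(rgb_clip, flow_clip, flow_stack_len=5):
    """From one preprocessed clip, draw one RGB frame at random and a stack of
    consecutive optical-flow grayscale images starting at a random offset.
    Assumes the clip contains at least flow_stack_len flow images."""
    rgb = rgb_clip[random.randrange(len(rgb_clip))]
    start = random.randrange(max(1, len(flow_clip) - flow_stack_len + 1))
    # Concatenate the x/y flow maps of consecutive frames along the channel axis.
    flow_stack = np.concatenate(flow_clip[start:start + flow_stack_len], axis=-1)
    return rgb, flow_stack
```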

[0036] Using a dual-stream network model that introduces the attention mechanism, extracting the appearance features and temporal dynamic features of the sampled images, fusing the extracted features of each video clip according to whether they come from the temporal-domain or the spatial-domain network, and performing weighted fusion of the temporal-domain fusion result with the spatial-domain fusion result to obtain the recognition result of the whole video.
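The sketch below illustrates one way such an attention dual-stream model could look in PyTorch. The ResNet-18 backbone, the 1×1-convolution spatial attention, the averaging used to fuse per-clip scores within each stream, and the 1:1.5 spatial-to-temporal fusion weights are all assumptions made for illustration; the listing does not disclose the actual backbone, attention form, or fusion weights.

```python
import torch
import torch.nn as nn
import torchvision.models as models

class SpatialAttention(nn.Module):
    """Rescales each location of a feature map so the foreground region where
    the action occurs is emphasised and background responses are suppressed."""
    def __init__(self, channels):
        super().__init__()
        self.score = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, x):
        attn = torch.sigmoid(self.score(x))        # (B, 1, H, W) weights in [0, 1]
        return x * attn                            # re-weighted features

class StreamNet(nn.Module):
    def __init__(self, in_channels, num_classes):
        super().__init__()
        backbone = models.resnet18(weights=None)
        backbone.conv1 = nn.Conv2d(in_channels, 64, 7, stride=2, padding=3, bias=False)
        self.features = nn.Sequential(*list(backbone.children())[:-2])
        self.attention = SpatialAttention(512)
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Linear(512, num_classes)

    def forward(self, x):
        f = self.attention(self.features(x))
        return self.fc(self.pool(f).flatten(1))

class TwoStreamRecognizer(nn.Module):
    def __init__(self, num_classes, flow_stack_len=5, spatial_w=1.0, temporal_w=1.5):
        super().__init__()
        self.spatial = StreamNet(3, num_classes)                      # RGB stream
        self.temporal = StreamNet(2 * flow_stack_len, num_classes)    # stacked flow stream
        self.spatial_w, self.temporal_w = spatial_w, temporal_w

    def forward(self, rgb_clips, flow_clips):
        # rgb_clips: (B, num_clips, 3, H, W); flow_clips: (B, num_clips, 2*L, H, W)
        s = torch.stack([self.spatial(rgb_clips[:, i]) for i in range(rgb_clips.size(1))], 1).mean(1)
        t = torch.stack([self.temporal(flow_clips[:, i]) for i in range(flow_clips.size(1))], 1).mean(1)
        # Weighted fusion of the per-stream, clip-fused class scores.
        return self.spatial_w * s + self.temporal_w * t
```

The attention block here simply learns a per-location weight map on the convolutional features, which is one common way to highlight the action region while damping background activations; the fused scores of the two streams are then combined with fixed weights to produce the video-level prediction.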

Embodiment 2

[0064] Embodiment 2 of the present disclosure provides a behavior recognition system based on an attention mechanism dual-stream network, including:

[0065] The data acquisition module is configured to: divide the acquired whole video into a plurality of video clips of equal length, extract RGB images and optical flow grayscale images of each frame of each video clip, and perform preprocessing;

[0066] The image sampling module is configured to: randomly sample the preprocessed image to obtain at least one RGB image and at least one stacked optical flow grayscale image of each video segment;

[0067] The behavior recognition module is configured to: use the dual-stream network model that introduces the attention mechanism to extract the appearance features and temporal dynamic features of the sampled images, fuse the extracted features of each video segment according to whether they come from the temporal-domain or the spatial-domain network, and perform weighted fusion of the temporal-domain fusion result with the spatial-domain fusion result to obtain the recognition result of the whole video.
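A hypothetical sketch of how the three modules could be wired together at inference time is shown below, reusing the helpers sketched under Embodiment 1 (extract_clips, sample_clip, TwoStreamRecognizer); the class and method names are illustrative and not taken from the patent.

```python
import torch

class BehaviorRecognitionSystem:
    def __init__(self, model, num_clips=3):
        self.model = model.eval()     # a TwoStreamRecognizer instance (see above)
        self.num_clips = num_clips

    def recognize(self, video_path):
        # Data acquisition module: split the video and extract RGB / flow images.
        rgb_clips, flow_clips = extract_clips(video_path, self.num_clips)
        # Image sampling module: one RGB frame and one stacked flow image per clip.
        samples = [sample_clip(r, f) for r, f in zip(rgb_clips, flow_clips)]
        rgb = torch.stack([torch.from_numpy(s[0]).permute(2, 0, 1).float() / 255
                           for s in samples]).unsqueeze(0)
        flow = torch.stack([torch.from_numpy(s[1]).permute(2, 0, 1).float() / 255
                            for s in samples]).unsqueeze(0)
        # Behavior recognition module: attention dual-stream model + weighted fusion.
        with torch.no_grad():
            scores = self.model(rgb, flow)
        return scores.argmax(dim=1).item()
```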

Embodiment 3

[0070] Embodiment 3 of the present disclosure provides a medium on which a program is stored, and when the program is executed by a processor, the steps in the behavior recognition method based on the attention mechanism dual-stream network as described in Embodiment 1 of the present disclosure are implemented.



Abstract

The invention provides a behavior recognition method and system based on an attention mechanism double-flow network, and belongs to the technical field of behavior recognition. The method comprises the steps of: dividing an obtained whole video into a plurality of video segments of the same length, extracting an RGB image and an optical flow gray-scale image of each frame of each video segment, and preprocessing the RGB images and the optical flow gray-scale images; carrying out random sampling on the preprocessed images to obtain an RGB image and an optical flow grayscale image of each video clip; extracting appearance features and time dynamic features of the sampled images by using a double-flow network model introducing an attention mechanism, fusing the appearance features and the time dynamic features according to the types of the time domain network and the space domain network respectively, and performing weighted fusion on the fusion result of the time domain network and the fusion result of the space domain network to obtain an identification result of the whole video. According to the invention, the video data can be fully utilized, the local key features of the video frame images can be better extracted, the foreground area where the action occurs is highlighted, the influence of irrelevant information in the background environment is suppressed, and the behavior recognition accuracy is improved.


Application Information

Owner SHANDONG UNIV