
Behavior recognition method and device based on spatial-temporal feature fusion, equipment and medium

A technology of spatio-temporal features and recognition methods, applied in the field of computer vision. It addresses problems such as the large computational cost of 3D convolution, which slows inference and makes real-time behavior recognition difficult, and aims to reduce the amount of calculation while maintaining recognition performance, ensuring both recognition accuracy and real-time operation.

Pending Publication Date: 2021-05-25
SHENZHEN XINYI TECH CO LTD
Cites: 0 | Cited by: 2

AI Technical Summary

Problems solved by technology

However, the heavy computational cost of 3D convolution slows inference, making online real-time behavior recognition difficult to achieve in practical applications.
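To make the scale of this cost concrete, the following sketch compares the parameter and multiply-accumulate counts of one 3D convolution against one 2D convolution. The layer sizes (64 channels, 3x3 kernels, 8-frame clip, 56x56 feature maps) are hypothetical illustration values, not taken from the patent.

```python
# Rough parameter/MAC comparison between a 2D and a 3D convolution layer.
# All layer sizes below are hypothetical, chosen only for illustration.

def conv2d_cost(c_in, c_out, k, h, w):
    """Parameters and multiply-accumulates for one 2D conv (stride 1, 'same' padding)."""
    params = c_out * c_in * k * k
    macs = params * h * w
    return params, macs

def conv3d_cost(c_in, c_out, k_t, k, t, h, w):
    """Parameters and MACs for one 3D conv applied over a clip of t frames."""
    params = c_out * c_in * k_t * k * k
    macs = params * t * h * w
    return params, macs

p2, m2 = conv2d_cost(64, 64, 3, 56, 56)
p3, m3 = conv3d_cost(64, 64, 3, 3, 8, 56, 56)
print(p3 // p2)  # 3: the 3D kernel has k_t times the parameters
print(m3 // m2)  # 24: 3x kernel depth times 8 frames of extra computation
```

The gap grows linearly with both the temporal kernel size and the clip length, which is why replacing 3D convolutions with 2D ones is attractive for real-time inference.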



Embodiment Construction

[0058] In order to make the purpose, technical solution and advantages of the present application clearer, the present application will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present application, and are not intended to limit the present application.

[0059] Aiming at the problems existing in the prior art, the embodiment of the present invention provides a behavior recognition method based on spatio-temporal feature fusion. As shown in Figure 1, the method includes the following steps:

[0060] Acquiring video frames to be processed, and unifying the sizes of the video frames to be processed;

[0061] Extracting shallow features in the video frame to be processed;

[0062] Extracting deep features in the video frame to be processed according to the shallow features;

[0063] Extracting the spatio-temporal 2D feature layer in the video frame to be processed;
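The steps above can be sketched end to end as follows. Every "feature extractor" here is a hypothetical stand-in (simple averages and adjacent-frame differences), since the patent does not disclose its exact layers at this point; the sketch only shows how the five steps chain together.

```python
# Toy end-to-end sketch of the claimed pipeline using plain Python lists.
# All extractors are hypothetical placeholders, not the patent's network.

def unify_size(frames, h=4, w=4):
    """Step 1: crop every frame (a list of rows of pixel values) to h x w."""
    return [[row[:w] for row in frame[:h]] for frame in frames]

def shallow_features(frames):
    """Step 2 stand-in: per-row averages within each frame."""
    return [[sum(row) / len(row) for row in frame] for frame in frames]

def deep_features(shallow):
    """Step 3 stand-in: one scalar descriptor per frame."""
    return [sum(rows) / len(rows) for rows in shallow]

def temporal_fusion_2d(deep):
    """Step 4 stand-in: adjacent-frame differences instead of a 3D convolution."""
    return [b - a for a, b in zip(deep, deep[1:])]

def recognize(frames):
    """Step 5 stand-in: threshold the fused temporal signal into a category."""
    fused = temporal_fusion_2d(deep_features(shallow_features(unify_size(frames))))
    motion = sum(abs(d) for d in fused) / len(fused)
    return "moving" if motion > 0 else "static"
```

For example, a clip whose frames brighten over time yields non-zero adjacent-frame differences and is labeled "moving", while identical frames yield "static".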



Abstract

The invention discloses a behavior recognition method, device, equipment and medium based on spatio-temporal feature fusion. The method comprises the steps of: obtaining a to-be-processed video frame and unifying its size; extracting shallow features from the video frame; extracting deep features from the video frame according to the shallow features; extracting a spatio-temporal 2D feature layer from the video frame; and identifying the behavior category of a target object in the video frame according to the deep features and the spatio-temporal 2D feature layer. By replacing 3D convolution with 2D convolution, the method effectively reduces the network's computational cost while maintaining recognition performance, ensuring both recognition accuracy and real-time operation. The method and device can be widely applied in the technical field of computer vision.
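One published way to obtain temporal modelling from purely 2D convolutions is to shift a fraction of the channels along the time axis before each 2D convolution (the temporal-shift idea). The sketch below illustrates that general technique; it is not necessarily the fusion scheme this patent uses, and spatial dimensions are omitted for brevity.

```python
# Temporal-shift illustration: a generic 2D-conv-friendly temporal mixing
# operation, NOT necessarily the fusion used by this patent.

def temporal_shift(clip, shift_div=4):
    """clip: list of T frames, each a list of C channel values (spatial dims
    omitted). Shift the first C//shift_div channels one step forward in time
    and the next C//shift_div one step backward; zero-pad at the clip ends."""
    t, c = len(clip), len(clip[0])
    fold = c // shift_div
    out = [[0.0] * c for _ in range(t)]
    for i in range(t):
        for j in range(c):
            if j < fold:                    # forward shift: value from frame i-1
                if i - 1 >= 0:
                    out[i][j] = clip[i - 1][j]
            elif j < 2 * fold:              # backward shift: value from frame i+1
                if i + 1 < t:
                    out[i][j] = clip[i + 1][j]
            else:                           # remaining channels untouched
                out[i][j] = clip[i][j]
    return out
```

After the shift, an ordinary per-frame 2D convolution sees channels from neighboring frames, so temporal information mixes in at zero extra multiply-accumulate cost.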

Description

technical field

[0001] The invention relates to the technical field of computer vision, in particular to a behavior recognition method, device, equipment and medium based on spatio-temporal feature fusion.

Background technique

[0002] Behavior recognition is an important area of computer vision. Its main task is to automatically analyze the ongoing behavior of a target through video, and it plays an important role in video surveillance and monitoring, robot interaction, and similar applications.

[0003] With the continuous development of deep learning, the performance of video understanding and behavior analysis has greatly improved, and behavior recognition technology has advanced significantly. Current mainstream behavior recognition methods fall into three categories: two-stream methods, recognition methods based on human skeletons, and methods based on 3D convolutional networks. The two-stream method combines video-frame RGB images and optical flow ...

Claims


Application Information

IPC(8): G06K9/00, G06K9/46, G06K9/62, G06N3/04, G06N3/08, G06T7/40
CPC: G06N3/049, G06N3/084, G06T7/40, G06V20/41, G06V20/46, G06V10/462, G06V10/44, G06N3/044, G06N3/045, G06F18/253
Inventor: 梁添才, 蔡德利, 赵清利, 徐天适, 王乃洲
Owner SHENZHEN XINYI TECH CO LTD