
A temporal sequence action recognition method based on deep learning

A deep-learning-based action recognition technology in the fields of computer vision and pattern recognition. It addresses the problems of insufficient feature expression for long actions and inaccurate long-action boundary regression, and achieves the effect of improving the recognition rate.

Active Publication Date: 2022-04-05
BEIJING UNIV OF TECH

AI Technical Summary

Problems solved by technology

On the basis of accurate detection of action categories, and in order to solve the problems of insufficient expression of long-action features and inaccurate regression of long-action boundaries during boundary detection, a deep-learning-based temporal action recognition method is proposed that effectively improves the degree of overlap between predicted action segments and the actual action segments.
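The "degree of overlap" referred to above is conventionally measured as temporal IoU (tIoU) in temporal action detection. As a point of reference, a minimal sketch of this standard metric (the function and variable names are illustrative, not taken from the patent):

```python
def temporal_iou(pred, gt):
    """Compute tIoU between two segments given as (start, end) in seconds."""
    inter_start = max(pred[0], gt[0])
    inter_end = min(pred[1], gt[1])
    intersection = max(0.0, inter_end - inter_start)
    union = (pred[1] - pred[0]) + (gt[1] - gt[0]) - intersection
    return intersection / union if union > 0 else 0.0

# Example: a prediction that under-covers a long action scores poorly.
print(temporal_iou((12.0, 30.0), (10.0, 42.0)))  # 0.5625
```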




Embodiment Construction

[0039] The present invention will be described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0040] In order to improve the subjective quality of the video, the present invention considers the length limit of long actions during multi-scale construction and proposes a brand-new splicing mechanism for incomplete action segments, which effectively improves the accuracy of long-action boundaries; by further considering contextual information, action segments are identified more accurately. The invention discloses a deep-learning-based temporal action detection method whose flow is shown in Figure 1.
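The patent describes the splicing mechanism only at a high level. The sketch below assumes one plausible reading: proposals of the same predicted class whose boundaries overlap or fall within a small gap are merged into a single longer segment. The function name, gap threshold, and merge rule are assumptions for illustration, not the patent's published procedure.

```python
def splice_segments(proposals, max_gap=0.5):
    """Merge incomplete fragments of a long action.

    proposals: list of (start, end, label) tuples, times in seconds.
    Adjacent same-label proposals separated by at most max_gap seconds
    are spliced into one longer segment.
    """
    merged = []
    for start, end, label in sorted(proposals):
        if merged and merged[-1][2] == label and start - merged[-1][1] <= max_gap:
            prev_start, prev_end, _ = merged[-1]
            merged[-1] = (prev_start, max(prev_end, end), label)
        else:
            merged.append((start, end, label))
    return merged

# Two fragments of the same long action are spliced into one segment:
print(splice_segments([(10.0, 24.0, "long_jump"), (24.2, 41.5, "long_jump")]))
# -> [(10.0, 41.5, 'long_jump')]
```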

[0041] The specific steps are as follows:

[0042] The present invention selects the temporal action detection dataset THUMOS Challenge 2014 as the experimental database, which contains 20 classes of untrimmed videos with temporal action annotations; the present invention selects 200 of the validation videos (including 3007 b...



Abstract

The invention discloses a deep-learning-based temporal action recognition method comprising video feature extraction and the construction of a temporal boundary regression model. To address the insufficient expressiveness of long-action features during boundary detection, inter-frame and intra-frame information are extracted simultaneously through a dual-stream network to obtain a feature sequence for each video unit, and a multi-scale short-segment interception scheme combined with contextual information is proposed, effectively improving subsequent regression accuracy. The feature sequences are used to train the temporal boundary model, which reduces training time and improves computational efficiency. To address the inaccurate regression of long-action boundaries, the invention proposes an improved temporal boundary regression model, comprising an improved multi-task multi-layer perceptron and a brand-new splicing mechanism for long actions. While preserving the accuracy of action categories, the method effectively improves the accuracy of long-action temporal boundary regression, increases the overlap between predicted and actual action segments, and thereby improves the temporal action recognition rate.
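To make the abstract's two ingredients concrete, a minimal sketch under assumptions follows: per-unit features are built by concatenating an intra-frame (RGB) stream with an inter-frame (optical-flow) stream, and candidate segments are intercepted with multi-scale windows padded by context. The feature dimensions, window scales, stride, and context ratio below are placeholders, not the patent's actual networks or settings.

```python
import numpy as np

def unit_features(rgb_feats, flow_feats):
    """Concatenate per-unit RGB and flow features: (T, d_rgb) + (T, d_flow) -> (T, d)."""
    return np.concatenate([rgb_feats, flow_feats], axis=1)

def multiscale_windows(num_units, scales=(4, 8, 16), context_ratio=0.5):
    """Return (start, end, scale) unit-index windows at several scales,
    each padded on both sides with context units."""
    windows = []
    for scale in scales:
        pad = int(scale * context_ratio)                 # context on each side
        for center in range(0, num_units, scale // 2):   # 50% stride
            start = max(0, center - scale // 2 - pad)
            end = min(num_units, center + scale // 2 + pad)
            windows.append((start, end, scale))
    return windows

# Placeholder 1024-dim features per stream for a 100-unit video:
feats = unit_features(np.zeros((100, 1024)), np.zeros((100, 1024)))
print(feats.shape, len(multiscale_windows(100)))  # (100, 2048) 88
```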

Description

Technical Field

[0001] The invention belongs to the field of computer vision and pattern recognition, and relates to a deep-learning-based temporal action recognition method.

Background

[0002] With the rapid development of smartphones and the Internet, video data has grown explosively, so research in computer vision is gradually expanding toward video data. The basis of video processing is action recognition. Although traditional action recognition has achieved a high recognition rate, the original data must be cropped into short videos with a fixed frame count containing a single action label, and such cropping requirements are too strict. In practical applications, actions appear at random positions in long videos, so traditional action recognition algorithms cannot meet real application scenarios. Temporal action detection is a specific study of such uncropped original long vi...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06V20/40; G06T7/269
CPC: G06T7/269; G06V20/49; G06V20/46
Inventors: 蔡轶珩, 孔欣然, 王雪艳, 李媛媛
Owner: BEIJING UNIV OF TECH