
A real-time motion sequence localization method

A positioning method for action sequences, applied in neural learning methods, instruments, biological neural network models, etc. It addresses the problems of low accuracy and recognition efficiency in prior-art action recognition and the inability to achieve real-time detection, with effects including reduced calibration cost, ease of replication and generation, and strong accuracy.

Active Publication Date: 2021-12-07
JIANGXI UNIV OF TECH
Cites: 12 | Cited by: 0

AI Technical Summary

Problems solved by technology

[0006] The purpose of the present invention is to propose a "one-step" real-time temporal action localization method, to solve the problems of low accuracy and recognition efficiency in action recognition in the prior art and the inability to achieve real-time detection.




Embodiment Construction

[0032] In order to make the purpose, features and advantages of the present invention more obvious and understandable, the specific implementation manners of the present invention will be described in detail below in conjunction with the accompanying drawings. Several embodiments of the invention are shown in the drawings. However, the present invention can be embodied in many different forms and is not limited to the embodiments described herein. Rather, these embodiments are provided so that the disclosure of the present invention will be thorough and complete.

[0033] An embodiment of the present invention proposes a real-time action timing positioning method, including steps S1-S4.

[0034] S1: track the joint points frame by frame through the motion sequence, compute the motion elements, and arrange these motion elements into a dense joint motion matrix, a three-dimensional matrix ordered by time and by individual joint.
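Step S1 can be sketched as follows. This is a minimal illustration, not the patent's exact formulation: the choice of motion elements (per-frame displacement plus speed) and the channel layout are assumptions, since the record truncates the matrix's expression.

```python
import numpy as np

def dense_joint_motion_matrix(joints):
    """Build a dense joint motion matrix from tracked joint positions.

    joints: array of shape (T, J, 3), per-frame 3D positions of J joints.
    Returns a (T-1, J, 4) three-dimensional matrix ordered by time and
    joint index: frame-to-frame displacement (3 channels) plus speed (1),
    an illustrative choice of "motion elements".
    """
    joints = np.asarray(joints, dtype=float)
    disp = np.diff(joints, axis=0)                       # (T-1, J, 3)
    speed = np.linalg.norm(disp, axis=2, keepdims=True)  # (T-1, J, 1)
    return np.concatenate([disp, speed], axis=2)         # (T-1, J, 4)

# Example: 5 frames, 2 joints; joint 0 moves 1 unit/frame along x.
pos = np.zeros((5, 2, 3))
pos[:, 0, 0] = np.arange(5)
M = dense_joint_motion_matrix(pos)
print(M.shape)  # (4, 2, 4)
```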

[0035] The expression of the dense joint...
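The Abstract below describes resampling dense joint motion matrices of different lengths to a common temporal scale by bilinear interpolation, so the network can train on multiple time scales. A minimal sketch, approximating the patent's bilinear interpolation with per-channel linear interpolation along the time axis; `target_len` and the resampling grid are illustrative assumptions:

```python
import numpy as np

def unify_temporal_scale(motion, target_len):
    """Resample a (T, J, C) joint motion matrix to (target_len, J, C).

    Each joint/channel series is linearly interpolated over a
    normalized [0, 1] time grid (np.interp), a simplification of the
    bilinear scheme the patent names.
    """
    motion = np.asarray(motion, dtype=float)
    T, J, C = motion.shape
    src = np.linspace(0.0, 1.0, T)           # original frame positions
    dst = np.linspace(0.0, 1.0, target_len)  # unified frame positions
    out = np.empty((target_len, J, C))
    for j in range(J):
        for c in range(C):
            out[:, j, c] = np.interp(dst, src, motion[:, j, c])
    return out

short = np.arange(6, dtype=float).reshape(3, 1, 2)  # T=3 sequence
long_ = unify_temporal_scale(short, 5)              # stretched to T=5
print(long_.shape)  # (5, 1, 2)
```

Running the same function over every sample with a shared `target_len` yields the "temporally unified" training matrices the Abstract mentions.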



Abstract

The present invention provides a real-time temporal action localization method, including: tracking human joint points frame by frame through the depth motion sequence, calculating joint motion information, and forming the motion information into a dense joint motion matrix ordered by time and joint sequence; on the basis of this dense joint motion matrix, using a bilinear quadratic interpolation algorithm to generate multiple action matrices with a unified temporal dimension, so that the deep neural network can train on sample sets of multiple time scales; introducing a spatial pyramid pooling layer to replace the flatten layer in the classic convolutional neural network, obtaining an improved convolutional neural network that can accept input of any size; and using a long-term-priority time exploration strategy to locate actions in continuous behavior videos. The invention solves the problems of low accuracy and recognition efficiency in prior-art action recognition and the inability to achieve real-time detection.
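The spatial-pyramid-pooling idea in the Abstract can be sketched as below: a feature map of any spatial size is pooled into a fixed set of grid bins, yielding a fixed-length vector where a flatten layer would require a fixed input size. The pyramid levels and max-pooling here are conventional choices from the SPP literature, not details taken from the patent text.

```python
import numpy as np

def spatial_pyramid_pool(feature_map, levels=(4, 2, 1)):
    """Map a (C, H, W) feature map of ANY spatial size to a fixed-length vector.

    For each pyramid level n, the map is split into an n x n grid whose
    bin edges span the whole map, and each bin is max-pooled per channel.
    Output length is C * sum(n*n), independent of H and W.
    """
    C, H, W = feature_map.shape
    pooled = []
    for n in levels:
        hs = np.linspace(0, H, n + 1).astype(int)  # row bin edges
        ws = np.linspace(0, W, n + 1).astype(int)  # column bin edges
        for i in range(n):
            for j in range(n):
                # guard against zero-width bins on tiny inputs
                cell = feature_map[:, hs[i]:max(hs[i + 1], hs[i] + 1),
                                      ws[j]:max(ws[j + 1], ws[j] + 1)]
                pooled.append(cell.max(axis=(1, 2)))
    return np.concatenate(pooled)

# Two different input sizes yield the same output length:
v1 = spatial_pyramid_pool(np.random.rand(8, 13, 9))
v2 = spatial_pyramid_pool(np.random.rand(8, 6, 17))
print(v1.shape, v2.shape)  # both (168,) = 8 * (16 + 4 + 1)
```

This fixed-size output is what lets the improved network accept action matrices of arbitrary size.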

Description

Technical field

[0001] The present invention relates to the technical field of motion sequence localization, and in particular to a real-time motion sequence localization method.

Background technique

[0002] Human action recognition (HAR) has a wide range of industrial applications, such as video retrieval, video summarization, virtual reality, and human-computer interaction. In recent years, with the help of deep learning technology, the accuracy of HAR has improved. However, in practice, human action videos are usually long and complex, and contain multiple distinct actions. Because of this, temporal action localization (TAL) has attracted increasing attention from researchers. TAL aims to identify the start key frame (SKF) and end key frame (EKF) of each action in a long and complex video. TAL mainly answers two questions: when the action starts and ends, and what category the action belongs to. [0003] The traditional mainstream methods of TAL are all based on hand-cra...


Application Information

Patent Type & Authority: Patents (China)
IPC(8): G06K9/00, G06N3/04, G06N3/08
CPC: G06N3/08, G06N3/045
Inventor: 姚磊岳杨威
Owner: JIANGXI UNIV OF TECH