
Motion recognition method

An action recognition technology, applied in the field of action recognition, which addresses the problems that skeleton joint point positions can be erroneous and that depth camera accuracy is limited, achieving more discriminative recognition.

Active Publication Date: 2021-08-06
SUZHOU UNIV
7 Cites · 5 Cited by

AI Technical Summary

Problems solved by technology

Although skeleton-based action recognition has received increasing attention due to the availability of cheap depth cameras, these methods are limited by the accuracy of depth cameras: the estimated positions of skeleton joint points can be erroneous.


Image

Figures 1–3: Motion recognition method.

Examples


Embodiment Construction

[0116] The following will clearly and completely describe the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without making creative efforts belong to the protection scope of the present invention.

[0117] As shown in Figure 1, an action recognition method includes the following process:

[0118] 1. The action video sample set contains 2000 samples in total, covering 10 action categories with 200 action video samples each. For each category, three quarters of the samples are randomly selected as the training set and the remaining quarter forms the test set, giving a total of 1500 training action video ...
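The per-category 3/4 vs. 1/4 split described above can be sketched as follows. This is an illustrative reconstruction, not the patent's code; the function name and the fixed seed are assumptions for reproducibility.

```python
import random

def stratified_split(samples_per_class=200, num_classes=10, train_frac=0.75, seed=0):
    """Randomly split each action category into train/test indices,
    mirroring the 3/4 vs 1/4 per-category split described above."""
    rng = random.Random(seed)
    train, test = [], []
    for c in range(num_classes):
        # each sample is identified by (category, index-within-category)
        idx = [(c, i) for i in range(samples_per_class)]
        rng.shuffle(idx)
        cut = int(samples_per_class * train_frac)
        train.extend(idx[:cut])
        test.extend(idx[cut:])
    return train, test

train, test = stratified_split()
print(len(train), len(test))  # 1500 500
```

With the stated numbers (10 categories × 200 samples, 150 train + 50 test each) this reproduces the 1500-sample training set and 500-sample test set.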



Abstract

The invention provides an action recognition method comprising the following steps: segmenting an action video sample and computing a dynamic image for each video clip; inputting the dynamic and static images of each clip into a feature extractor and extracting the motion feature vector of the dynamic image; constructing a feature center group to obtain a histogram expression; inputting the histogram expressions into a histogram connection layer to obtain the complete histogram expression of the action video sample; inputting this into a multi-layer perceptron to form a motion feature quantization network and training it to convergence; inputting the dynamic and static images into the feature extractor, enhancer, and soft quantizer of the trained motion feature quantization network to obtain the histogram expression; inputting the histogram expression into a salient motion feature extractor to obtain a salient motion feature map; inputting the salient motion feature map into a convolutional neural network to form an action classifier and training it to convergence; and computing the dynamic and static images of each clip of a test action video sample and inputting them into the trained action classifier to perform action recognition.
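The abstract's first step, computing a dynamic image per video clip, is not specified further here. A common construction for dynamic images is approximate rank pooling (Bilen et al.), which collapses a clip into one image via a fixed weighted sum of its frames; the sketch below assumes that construction and is not taken from the patent.

```python
import numpy as np

def dynamic_image(frames):
    """Approximate-rank-pooling dynamic image: summarize a clip's motion
    as a single weighted sum of its frames.
    frames: array of shape (T, H, W, C)."""
    T = frames.shape[0]
    # weight for frame t (1-indexed): alpha_t = 2t - T - 1
    # (early frames get negative weight, late frames positive)
    alphas = 2 * np.arange(1, T + 1) - T - 1
    di = np.tensordot(alphas, frames.astype(np.float64), axes=(0, 0))
    # rescale to [0, 255] so the result can be viewed as an image
    di = (di - di.min()) / (di.max() - di.min() + 1e-8) * 255
    return di.astype(np.uint8)
```

The output has the same spatial shape as one frame, so it can be fed to an image feature extractor exactly like the static image of the clip.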

Description

Technical field

[0001] The invention relates to an action recognition method, belonging to the technical field of action recognition.

Background technique

[0002] Action recognition is an important topic in computer vision, with wide applications in video surveillance, behavior analysis, human-computer interaction, and other fields. Although skeleton-based action recognition has received increasing attention due to the availability of cheap depth cameras, these methods are limited by the accuracy of depth cameras, and the estimated positions of skeleton joint points can be erroneous. Compared with depth cameras, RGB devices are more mature and reliable, so many researchers study action recognition based on RGB videos.

[0003] Most existing methods perform action recognition by extracting image-level features of video frames; these methods are not dedicated to extracting the motion features of actions in videos. However, for video analysis, obtaining dynamic information is very important. Mot...
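The method's histogram expression is obtained by soft-quantizing motion feature vectors against a feature center group. The patent text shown here does not give the quantizer's formula; a standard realization is a softmax over negative squared distances to the centers, sketched below as an assumption.

```python
import numpy as np

def soft_histogram(features, centers, beta=1.0):
    """Soft-assign each feature vector to every center via a softmax over
    negative squared distances, then sum the assignments into a histogram.
    features: (N, D), centers: (K, D). Returns a length-K histogram."""
    # squared Euclidean distance from every feature to every center: (N, K)
    d2 = ((features[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    logits = -beta * d2
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    w = np.exp(logits)
    w /= w.sum(axis=1, keepdims=True)            # each row is a soft assignment
    return w.sum(axis=0)                         # accumulate into a histogram
```

Because each row of soft assignments sums to 1, the histogram entries sum to the number of feature vectors, and the whole operation is differentiable, which is what lets the quantizer sit inside a trainable network.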

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K 9/00, G06K 9/46, G06K 9/62, G06N 3/04, G06N 3/08
CPC: G06N 3/08, G06V 40/20, G06V 20/46, G06V 10/50, G06N 3/045, G06F 18/23, G06F 18/214, G06F 18/24, G06V 10/462, G06V 10/507, G06V 10/774, G06V 10/82, G06V 20/42, G06V 40/23, G06V 10/56, G06V 20/49, G06V 10/7715, G06V 20/41
Inventors: 杨剑宇 (Yang Jianyu), 黄瑶 (Huang Yao)
Owner: SUZHOU UNIV