
Action Recognition Method Based on Dense Trajectory Kernel Covariance Descriptor

A recognition method and descriptor technology in the field of video processing. It addresses problems such as low behavior-recognition accuracy, the neglect of nonlinear relationships between features, and the inability to capture complex feature relationships, thereby improving descriptive ability and recognition accuracy.

Active Publication Date: 2020-04-14
XIDIAN UNIV


Problems solved by technology

Although this method takes the relationships between different features into account, it considers only static features and ignores the dynamic features of the behaving subject, resulting in low behavior-recognition accuracy.
[0005] (2) Yi Y, Wang H. Motion keypoint trajectory and covariance descriptor for human action recognition [J]. The Visual Computer, 2017: 1-13. This method builds a trajectory-based covariance descriptor on top of motion keypoint trajectories; the covariance descriptor can represent the linear relationships between different motion variables. However, it ignores nonlinear relationships between features and therefore cannot capture the complex feature relationships that arise in behavior recognition in complex environments.
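
To make this limitation concrete, here is a minimal numpy sketch (our illustration, not the cited paper's exact formulation) of a linear trajectory covariance descriptor: the d×d sample covariance of a d×n feature matrix, where each column is one per-pixel feature vector. Being a second-order statistic, it captures only linear correlations between feature channels; the dimensions in the toy example are arbitrary.

```python
import numpy as np

def covariance_descriptor(F: np.ndarray) -> np.ndarray:
    """Linear covariance descriptor of a d x n feature matrix F:
    each column of F is one per-pixel feature vector."""
    mu = F.mean(axis=1, keepdims=True)    # per-channel mean
    Fc = F - mu                           # center each feature channel
    return Fc @ Fc.T / (F.shape[1] - 1)   # d x d sample covariance

# Toy example: 10 feature channels over 450 pixels of one trajectory cube.
rng = np.random.default_rng(0)
F = rng.standard_normal((10, 450))
C = covariance_descriptor(F)
print(C.shape)  # (10, 10), symmetric positive semi-definite
```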




Detailed Description of Embodiments

[0019] The implementation of the present invention is described in further detail below in conjunction with the accompanying drawings.

[0020] Referring to Figure 1, the steps of the behavior recognition method based on the dense trajectory kernel covariance descriptor of the present invention are as follows:

[0021] Step 1: extract dense trajectories from the video sequence and obtain trajectory cubes that bend along the trajectories.

[0022] (1.1) Densely sample the video sequence to obtain feature points;

[0023] (1.2) Track the feature points through subsequent video frames to obtain dense trajectories of length L = 15;

[0024] (1.3) For each trajectory, select a W×H image block centered on each trajectory point, yielding a trajectory cube of size W×H×L that bends along the trajectory, with W = 32 and H = 32;

[0025] This example extracts dense trajectories using the method of the article "Action recognition by dense trajectories" published by Wang H et al. at the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2011.
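
As a rough illustration of Step 1 (not a faithful reimplementation of the Wang et al. method, which additionally uses median-filtered flow, multiple spatial scales, and trajectory pruning), the sketch below densely samples a grid of points, tracks them with OpenCV's Farneback optical flow for L = 15 frames, and stacks the 32×32 patch around each tracked point into a W×H×L trajectory cube. The grid stride and flow parameters are our assumptions.

```python
import cv2
import numpy as np

L_TRAJ, W, H, STRIDE = 15, 32, 32, 5   # trajectory length, patch size, grid step

def trajectory_cubes(frames):
    """frames: list of >= L_TRAJ grayscale frames (np.uint8 arrays).
    Returns an array of W x H x L_TRAJ cubes, one per surviving trajectory."""
    h, w = frames[0].shape
    ys, xs = np.mgrid[H // 2:h - H // 2:STRIDE, W // 2:w - W // 2:STRIDE]
    pts = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(np.float32)
    cubes = np.zeros((len(pts), H, W, L_TRAJ), dtype=np.uint8)
    alive = np.ones(len(pts), dtype=bool)
    for t in range(L_TRAJ):
        if t > 0:  # advance every point along the dense optical-flow field
            flow = cv2.calcOpticalFlowFarneback(frames[t - 1], frames[t], None,
                                                0.5, 3, 15, 3, 5, 1.2, 0)
            iy = np.clip(pts[:, 1], 0, h - 1).astype(int)
            ix = np.clip(pts[:, 0], 0, w - 1).astype(int)
            pts = pts + flow[iy, ix]
        for i, (x, y) in enumerate(pts):
            x, y = int(round(x)), int(round(y))
            if not (W // 2 <= x <= w - W // 2 and H // 2 <= y <= h - H // 2):
                alive[i] = False   # trajectory drifted out of frame: drop it
                continue
            cubes[i, :, :, t] = frames[t][y - H // 2:y + H // 2,
                                          x - W // 2:x + W // 2]
    return cubes[alive]
```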



Abstract

The invention discloses a behavior recognition method based on a dense trajectory kernel covariance descriptor, which mainly solves the problem that the prior art fails to consider nonlinear correlations between different features, resulting in low behavior-recognition accuracy. The implementation steps are: 1) extract dense trajectories and compute features for each pixel in the trajectory cube to obtain the underlying feature matrix; 2) compute the kernel covariance matrix of the underlying feature matrix and map it into Euclidean space to obtain a vectorized feature; 3) use all the feature representations in the trajectory cube to construct a kernel covariance descriptor based on dense trajectories; 4) encode the kernel covariance descriptors with a bag-of-words (BOW) model to obtain codeword histograms, train an SVM on the codeword histograms of the training set, and feed the codeword histograms of the test set to the trained SVM to obtain the behavior recognition results. The invention further improves the ability to describe behaviors and can be used in complex environments such as video surveillance.
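
The following is a minimal sketch of steps 2)-4) under stated assumptions: the kernel covariance matrix is formed as an RBF Gram matrix over the centered rows (feature channels) of the underlying feature matrix, which is one common way to generalize covariance to nonlinear dependencies (the text above does not fix the kernel); the map into Euclidean space is the log-Euclidean vectorization of an SPD matrix; and the BOW/SVM stage uses scikit-learn with arbitrary codebook size and SVM kernel.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def kernel_covariance(F, gamma=1e-2):
    """d x n feature matrix -> d x d kernel covariance (RBF Gram matrix
    between centered feature channels; PSD by construction)."""
    Z = F - F.mean(axis=1, keepdims=True)
    sq = np.sum(Z ** 2, axis=1)
    D = sq[:, None] + sq[None, :] - 2.0 * (Z @ Z.T)  # pairwise channel distances
    return np.exp(-gamma * D)

def log_euclidean_vector(C, eps=1e-6):
    """Map an SPD matrix into Euclidean space (matrix logarithm) and
    vectorize its upper triangle."""
    w_, V = np.linalg.eigh(C + eps * np.eye(C.shape[0]))
    Lm = (V * np.log(w_)) @ V.T                      # matrix log via eigenbasis
    iu = np.triu_indices(Lm.shape[0])
    wts = np.where(iu[0] == iu[1], 1.0, np.sqrt(2.0))
    return wts * Lm[iu]

def bow_histograms(desc_per_video, kmeans):
    """Encode each video's descriptor set as a normalized codeword histogram."""
    hists = []
    for D in desc_per_video:
        h = np.bincount(kmeans.predict(D), minlength=kmeans.n_clusters).astype(float)
        hists.append(h / max(h.sum(), 1.0))
    return np.array(hists)

# Toy usage with random stand-ins for trajectory-cube feature matrices.
rng = np.random.default_rng(0)
videos = [[rng.standard_normal((8, 200)) for _ in range(30)] for _ in range(10)]
descs = [np.array([log_euclidean_vector(kernel_covariance(F)) for F in v])
         for v in videos]
labels = np.arange(10) % 2                       # two dummy action classes
codebook = KMeans(n_clusters=16, n_init=10, random_state=0).fit(np.vstack(descs))
X = bow_histograms(descs, codebook)
clf = SVC(kernel="linear").fit(X, labels)        # train on codeword histograms
print(clf.predict(X[:3]))                        # classify (here: training videos)
```

The √2 weighting on the off-diagonal entries makes the Euclidean distance between vectorized descriptors equal the Frobenius (log-Euclidean) distance between the underlying matrices.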

Description

Technical Field

[0001] The invention belongs to the technical field of video processing, and in particular relates to a behavior recognition method that can be used to describe video behavior in complex video surveillance environments.

Background

[0002] The wide application of video behavior recognition in human-computer interaction, virtual reality, video surveillance, and video retrieval and analysis has attracted the interest of more and more researchers; it has important academic research value and strong practical value. In the field of action recognition, factors such as viewing-angle changes and complex backgrounds increase the difficulty of the task. Because handcrafted local features are robust to video noise, illumination changes, and complex backgrounds, they have become an important research direction. At present, the most popular handcrafted local features are the gradient orientation histogram HOG ba...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06K9/00; G06K9/62
CPC: G06V20/41; G06F18/2411; G06F18/214
Inventors: 同鸣, 赵梦傲, 汪厚峄, 闫娜
Owner: XIDIAN UNIV