
An action recognition method based on a sliding-window local matching window

A local matching and sliding window technology, applied in the field of video recognition, which addresses the problems that similar actions cannot be recognized and that the recognition rate is low, and achieves the effects of improving the action recognition rate, improving the feature representation, and strengthening temporal constraints.

Active Publication Date: 2015-03-11
ZHEJIANG UNIV OF TECH
Cites: 3 · Cited by: 12

AI Technical Summary

Problems solved by technology

[0005] In order to overcome the shortcomings of existing action recognition methods, namely that similar actions cannot be recognized and that the recognition rate is low, the present invention provides an action recognition method based on a sliding-window local matching window that effectively recognizes similar actions and achieves a high recognition rate.



Examples


Embodiment Construction

[0031] The present invention will be further described below in conjunction with the accompanying drawings.

[0032] Referring to Figure 1, an action recognition method based on a sliding-window local matching window comprises the following steps (rough illustrative Python sketches of steps 1) to 3) follow the step list):

[0033] 1) Obtain the depth map sequence of the person in the scene from a stereo camera, extract the positions of the 3D joint points from the depth maps, and use the 3D displacement differences between poses as the feature expression of each frame of the depth map;

[0034] 2) Use a clustering method to learn the descriptors in the training set to obtain a feature set, and use this feature set to express each descriptor, so as to obtain a coded representation of each frame of the image;

[0035] 3) Use a local matching model based on a sliding window to divide the entire action image sequence into action segments, and obtain the feature histogram expression of each action segment;

[0036] The feature matching process of the...
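As a rough illustration of step 1), the sketch below turns a sequence of tracked 3D joint positions into per-frame descriptors by taking the 3D displacement of every joint relative to the previous frame. This is a minimal sketch under stated assumptions, not the patented feature itself: the paragraph above only says that 3D displacement differences between poses are used as the per-frame feature, so the pose pairing (current frame vs. previous frame), the joint count, and the helper name frame_descriptors are illustrative assumptions.

```python
import numpy as np

def frame_descriptors(joints_seq):
    """Per-frame descriptors from 3D joint positions.

    joints_seq: array of shape (T, J, 3) -- T frames, J joints, (x, y, z)
    extracted from the depth map sequence by a skeleton tracker.
    Returns an array of shape (T - 1, J * 3): for each frame, the 3D
    displacement of every joint relative to the previous frame, flattened
    into one descriptor vector (the pose pairing is an assumption).
    """
    joints_seq = np.asarray(joints_seq, dtype=float)
    disp = joints_seq[1:] - joints_seq[:-1]    # (T - 1, J, 3) displacements
    return disp.reshape(disp.shape[0], -1)     # one flat vector per frame

# Toy usage: 10 frames with 20 tracked joints.
demo = np.random.rand(10, 20, 3)
print(frame_descriptors(demo).shape)           # -> (9, 60)
```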
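For step 2), the sketch below uses k-means as the clustering method (an assumption; the text says only "a clustering method") to learn a small dictionary of feature words from the training descriptors, then codes each frame by the index of its nearest cluster centre. The function names and the dictionary size k are illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans

def learn_dictionary(train_descriptors, k=64, seed=0):
    """Cluster all training-set frame descriptors into k feature words.
    k-means is assumed here; the patent only specifies 'a clustering method'."""
    km = KMeans(n_clusters=k, n_init=10, random_state=seed)
    km.fit(train_descriptors)
    return km

def encode_frames(km, descriptors):
    """Code each frame descriptor as the index of its nearest cluster centre."""
    return km.predict(np.asarray(descriptors))   # (T,) integer codes

# Toy usage: learn a 16-word dictionary and code a 30-frame clip.
train = np.random.rand(500, 60)
km = learn_dictionary(train, k=16)
codes = encode_frames(km, np.random.rand(30, 60))
print(codes.shape)                               # -> (30,)
```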
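For step 3), the sketch below slides a fixed-length window over the per-frame codes of one sequence and builds a normalised code histogram for each window. The window length, the stride, and the hard segmentation are assumptions made for illustration; the patent's local matching model, which actually decides the segmentation, is not reproduced here.

```python
import numpy as np

def segment_histograms(codes, k, win=10, step=5):
    """Histogram of dictionary codes for each sliding-window segment.

    codes: (T,) integer codes of one action sequence (see step 2).
    k:     dictionary size, i.e. number of histogram bins.
    Returns a (num_segments, k) matrix of normalised histograms; the window
    length `win` and stride `step` are illustrative assumptions.
    """
    codes = np.asarray(codes)
    hists = []
    for start in range(0, max(len(codes) - win + 1, 1), step):
        seg = codes[start:start + win]
        h = np.bincount(seg, minlength=k).astype(float)
        hists.append(h / max(h.sum(), 1.0))      # normalise each histogram
    return np.vstack(hists)

# Toy usage: 30 frame codes drawn from a 16-word dictionary.
codes = np.random.randint(0, 16, size=30)
print(segment_histograms(codes, k=16).shape)     # -> (5, 16)
```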



Abstract

An action recognition method based on a sliding-window local matching window includes the following steps: 1) obtaining a depth map sequence of a person in a scene from a stereo camera, extracting the positions of the 3D joint points from the depth maps, and using the 3D displacement differences between poses as the feature expression of each frame of the depth map; 2) learning the descriptors in a training set with a clustering method to obtain a feature set, and using the feature set to express each descriptor, thereby obtaining a coded representation of each frame of the image; 3) dividing the whole action image sequence into action segments using a local matching model based on the sliding window, and obtaining a feature histogram expression of each action segment; and 4) concatenating the feature histogram expressions of all segments into one long vector to obtain the feature expression of the entire action. The method effectively recognizes similar actions and achieves a high recognition rate.
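Step 4) of the abstract concatenates the per-segment histograms into one long vector that represents the whole action. The sketch below shows that concatenation under the assumption that every clip yields the same number of segments, so all action vectors have equal length (which a downstream classifier would require); the function name is illustrative.

```python
import numpy as np

def action_feature(segment_hists):
    """Concatenate the per-segment histograms of one clip into a single long
    feature vector (step 4 of the abstract). Assumes a fixed number of
    segments per clip so that all clips yield vectors of the same length."""
    return np.concatenate([np.asarray(h).ravel() for h in segment_hists])

# Toy usage: 5 segments with 16-bin histograms -> one 80-dimensional vector.
hists = [np.random.rand(16) for _ in range(5)]
print(action_feature(hists).shape)               # -> (80,)
```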

Description

Technical field

[0001] The invention relates to the field of video recognition, and in particular to an action recognition method.

Background technique

[0002] Human motion feature description is the expression of human body posture information in sequence images, and it is an important part of human action recognition. Human body motion is a chained non-rigid motion: the motion of each part of the body is a rigid motion, while from an overall point of view the motion of the human body is highly nonlinear and non-rigid.

[0003] Human action recognition methods are mainly divided into spatiotemporal action recognition methods and sequence-based action recognition methods. Spatiotemporal methods first construct a corresponding three-dimensional X-Y-T model for each action in the training video set, and then determine the type of the test action by matching the action sequence to be recognized against the three-dimensional X-Y-T model of each action in the training set. Time-space-based ...


Application Information

IPC(8): G06K9/62, G06K9/66, G06F17/30
CPC: G06V40/103, G06V10/507, G06F18/23213
Inventor: 陈胜勇, 王其超, 沃波海, 管秋, 王鑫, 汪晓妍, 王万良
Owner: ZHEJIANG UNIV OF TECH