View angle-independent action identification method

A view-angle-independent action recognition technology, applied in character and pattern recognition, instruments, computer parts, etc. It addresses the limitation that existing methods require a fixed viewing angle and achieves a high recognition rate.

Inactive Publication Date: 2009-05-06
XIAN UNIV OF TECH


Problems solved by technology

[0004] The purpose of the present invention is to provide a view-independent action recognition method, which overcomes the limitation that existing monitoring scenes require a fixed viewing angle.

Method used



Examples


Embodiment Construction

[0018] The present invention will be described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0019] The action recognition method of the present invention proceeds as shown in Figure 1. Three cameras synchronously capture the human body motion in the scene from the front, oblique, and side directions, and a 3D human body model is reconstructed using the 3D carving method. The dynamic part of the human motion is then extracted to form a motion weight model of the 3D body posture. Finally, 3D pseudo-Zernike moments describe the view-independent action features, and a conditional random field models each action to perform recognition.
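The 3D carving step above is a form of shape-from-silhouette: a voxel belongs to the body only if it projects into the foreground silhouette of every camera. A minimal sketch under assumed calibrated 3x4 projection matrices (the function name, grid layout, and threshold are illustrative, not the patent's implementation):

```python
import numpy as np

def carve_voxels(silhouettes, projections, grid, threshold=0):
    """Keep a voxel only if it projects inside the foreground
    silhouette of every camera (basic shape-from-silhouette).

    silhouettes: list of HxW binary arrays (1 = body pixel)
    projections: list of 3x4 camera projection matrices (assumed calibrated)
    grid:        Nx3 array of candidate voxel centres in world coordinates
    """
    keep = np.ones(len(grid), dtype=bool)
    homog = np.hstack([grid, np.ones((len(grid), 1))])  # Nx4 homogeneous coords
    for sil, P in zip(silhouettes, projections):
        uvw = homog @ P.T                       # project into the image plane
        u = (uvw[:, 0] / uvw[:, 2]).round().astype(int)
        v = (uvw[:, 1] / uvw[:, 2]).round().astype(int)
        h, w = sil.shape
        inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
        fg = np.zeros(len(grid), dtype=bool)
        fg[inside] = sil[v[inside], u[inside]] > threshold
        keep &= fg                              # intersect across all views
    return grid[keep]
```

With three real cameras the intersection of the three silhouette cones approximates the visual hull of the body; the patent uses three views (front, oblique, side) for exactly this purpose.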

[0020] The method of the present invention will be described below by taking the recognition of the "kicking" action as an example.

[0021] Step 1: Use 3 cameras to collect action se...
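The abstract states that the collected video is preprocessed into binary human body silhouettes before carving. A crude background-subtraction sketch of that step (the threshold value and function name are assumptions, not the patent's preprocessing pipeline):

```python
import numpy as np

def binary_silhouette(frame, background, thresh=25):
    """Crude background subtraction: mark pixels that differ from a
    static background model by more than `thresh` as foreground (1).
    frame, background: HxW grayscale arrays of the same shape."""
    diff = np.abs(frame.astype(np.int32) - background.astype(np.int32))
    return (diff > thresh).astype(np.uint8)  # 1 = body, 0 = background
```

In practice a robust background model (e.g. a Gaussian-mixture subtractor) plus morphological cleanup would replace this per-pixel difference, but the output contract is the same: one binary silhouette per camera per frame.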



Abstract

The invention discloses a method for identifying actions irrespective of viewing angle. The method executes the following steps: first, synchronously collect human body video data from the front, oblique, and side directions, preprocess the collected video to obtain binary human body silhouettes, and carve and reconstruct a three-dimensional figure of the human body from the silhouettes in the three directions; then extract the dynamic part of the motion process to form a motion energy volume and a motion weight model of the three-dimensional figure, and describe the features using three-dimensional pseudo-Zernike moments, which are invariant to scale, translation, and rotation; finally, establish a probability graph model for each action with a conditional random field and perform identification. The method overcomes the restriction that existing monitoring scenes require a fixed viewing angle, recognizes human body actions in any direction, and achieves a high recognition rate.
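The abstract's claim of scale- and translation-invariance comes from moment normalisation. As a minimal illustration of that principle (a plain normalised central moment, not the patent's 3D pseudo-Zernike descriptor; rotation invariance requires the fuller construction):

```python
import numpy as np

def invariant_moment(voxels, p, q, r):
    """Central 3-D moment of order (p, q, r), normalised to be invariant
    to translation and uniform scaling of the voxel cloud.
    voxels: Nx3 array of occupied voxel centres."""
    d = voxels - voxels.mean(axis=0)            # centre -> translation invariance
    rms = np.sqrt((d ** 2).sum(axis=1).mean())  # RMS radius of the cloud
    d = d / rms                                 # rescale -> scale invariance
    return np.mean(d[:, 0] ** p * d[:, 1] ** q * d[:, 2] ** r)
```

Translating or uniformly scaling the voxel cloud leaves the returned value unchanged, which is why moment-based descriptors suit view-independent recognition: the subject's distance from the cameras and position in the scene drop out of the feature.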

Description

Technical Field

[0001] The invention belongs to the technical field of intelligent visual monitoring, and in particular relates to an action recognition method independent of the viewing angle.

Background

[0002] With the rapid development of science and technology, camera-based scene monitoring has been widely applied across society. The main targets in a monitored scene are people, and recognizing human body actions helps prevent and stop crimes. So far, most research on action recognition has been carried out under a fixed viewing angle. Although a few viewpoint-invariant representations have been studied, most have defects such as insufficient information for recognition or reliance on robust semantic feature point detection or point correspondence. Since the angle between the direction of human body movement in the scene and the shooting direction of the camera is arbitrary, it is im...


Application Information

IPC(8): G06K9/00, G06K9/20, G06K9/54, G06K9/62
Inventor: 张二虎 (Zhang Erhu), 赵永伟 (Zhao Yongwei)
Owner XIAN UNIV OF TECH