Complex behavior recognition method

A recognition method in the technical field of image recognition, addressing problems such as biased global features, high computational cost, and incomplete descriptors.

Active Publication Date: 2018-10-19
SUZHOU UNIV


Problems solved by technology

[0003] Many methods extract a variety of high-level features from skeleton information and then combine them in some form into a descriptor, but a descriptor constructed in this combined way is incomplete, and some motion information is always lost.
On the othe



Examples


Embodiment 1

[0094] Embodiment 1: as shown in Figure 7, a complex behavior recognition method includes the following steps:

[0095] (1) Use a depth sensor to capture the three-dimensional skeletal joint point information of the target motion, obtaining the three-dimensional coordinates of each joint of the human body;

[0096] (2) Preprocess the skeletal joint point information and normalize the coordinate system. As shown in Figure 1, the horizontal axis is the vector from the left shoulder to the right shoulder, and the vertical axis is the vector from the hip bone to the midpoint of the shoulders; the coordinate system is normalized, converting the X-Y-Z coordinate system into the X'-Y'-Z' coordinate system;
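Step (2) can be sketched in NumPy as follows. This is a minimal illustration, not the patent's exact implementation: it builds a body-centred orthonormal frame from the two shoulder joints and the hip joint (whose indices depend on the sensor's skeleton layout and are passed in as hypothetical parameters), then rotates and translates all joints into that X'-Y'-Z' frame.

```python
import numpy as np

def normalize_skeleton(joints, l_sh, r_sh, hip):
    """Transform a (J, 3) joint array into the body-centred X'-Y'-Z' frame.

    l_sh, r_sh, hip are the joint indices of the left shoulder, right
    shoulder, and hip; they depend on the depth sensor's skeleton layout.
    """
    origin = joints[hip]
    x_axis = joints[r_sh] - joints[l_sh]       # left shoulder -> right shoulder
    x_axis = x_axis / np.linalg.norm(x_axis)
    shoulder_mid = 0.5 * (joints[l_sh] + joints[r_sh])
    y_axis = shoulder_mid - joints[hip]        # hip -> midpoint of the shoulders
    y_axis = y_axis - y_axis.dot(x_axis) * x_axis  # ensure exact orthogonality
    y_axis = y_axis / np.linalg.norm(y_axis)
    z_axis = np.cross(x_axis, y_axis)          # completes a right-handed frame
    R = np.stack([x_axis, y_axis, z_axis])     # rows are the new basis vectors
    return (joints - origin) @ R.T
```

After this transform the skeleton is translation- and rotation-invariant across subjects, which is the usual motivation for this kind of preprocessing.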

[0097] (3) Connect the 3D coordinates of each skeletal joint point in the action sequence in chronological order to obtain the 3D trajectories of all skeletal joint points;

[0098] This embodiment adopts a 60-frame action sequence S (swinging with both hands) with 20 sk...
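Steps (3) and the projection stage described in the abstract can be sketched as below, assuming the action sequence is stored as a (T, J, 3) array of normalized joint coordinates over T frames (a hypothetical layout chosen for illustration): each joint's coordinates are stacked in chronological order into a 3-D trajectory, which is then projected onto the three coordinate planes.

```python
import numpy as np

def joint_trajectories(sequence):
    """sequence: (T, J, 3) array of joint coordinates over T frames.

    Returns the (J, T, 3) 3-D trajectory of every joint, plus its
    projections onto the X-Y, Y-Z, and X-Z planes.
    """
    traj = np.transpose(sequence, (1, 0, 2))   # one (T, 3) track per joint
    proj_xy = traj[:, :, [0, 1]]
    proj_yz = traj[:, :, [1, 2]]
    proj_xz = traj[:, :, [0, 2]]
    return traj, (proj_xy, proj_yz, proj_xz)
```

For the embodiment's 60-frame sequence with 20 joints, `sequence` would have shape (60, 20, 3) and each projection shape (20, 60, 2).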



Abstract

The invention discloses a complex behavior recognition method. The method comprises: acquiring three-dimensional skeletal joint point information of a target motion with a depth sensor; preprocessing the joint point information and normalizing the coordinate system; extracting the motion trajectory of each joint point and projecting it onto three two-dimensional planes; extracting the motion vector between every two frames, together with its length and direction angle, clustering with the k-means algorithm to obtain motion primitives, and counting them to obtain histograms; calculating the weight of each joint point and forming a descriptor from the time pyramid, the time information, and the cluster values of all the histograms; and performing SVM classification to achieve motion recognition. The invention can extract features from and effectively represent the skeletal joint point information of the motion, improving the accuracy of motion recognition; all motion information is completely retained, so motion reconstruction can be performed; all motion classes are clustered, capturing human motion features globally; and low-level features are used, which lowers the computational difficulty, improves the efficiency of motion recognition, and satisfies the real-time requirements of the system.
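The clustering-and-histogram stage of the pipeline above can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the patented implementation: it computes per-frame motion vectors for one joint trajectory, clusters them into k "motion primitives" with a plain k-means loop (the values of k, the iteration count, and the random initialization are all illustrative choices, not taken from the patent), and counts cluster membership into a normalized histogram descriptor.

```python
import numpy as np

def motion_histogram(traj, k=4, iters=20, seed=0):
    """traj: (T, 3) trajectory of one joint over T frames.

    Clusters the T-1 inter-frame motion vectors into k motion
    primitives via k-means and returns the normalized histogram
    of primitive occurrences.
    """
    vecs = np.diff(traj, axis=0)                 # motion vector between frames
    rng = np.random.default_rng(seed)
    centers = vecs[rng.choice(len(vecs), k, replace=False)]
    for _ in range(iters):
        # distance of every motion vector to every cluster center
        d = np.linalg.norm(vecs[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)                # nearest motion primitive
        for j in range(k):
            if np.any(labels == j):
                centers[j] = vecs[labels == j].mean(axis=0)
    hist = np.bincount(labels, minlength=k)
    return hist / hist.sum()                     # normalized histogram descriptor
```

Concatenating such histograms over all joints (optionally weighted and split over a time pyramid, as the abstract describes) would yield the final descriptor fed to the SVM classifier.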

Description

Technical field

[0001] The invention relates to a complex behavior recognition method, belonging to the technical field of image recognition.

Background technique

[0002] Action recognition is a research hotspot in the field of machine vision. Action recognition methods have been widely used in human-computer interaction, virtual reality, video retrieval, and security monitoring. With the development of depth cameras, human skeletal joint point information can be obtained directly, and action recognition methods based on skeletal features have greatly improved recognition accuracy. Despite many related studies and exciting results, the effective description of human motion is still a challenging task.

[0003] Many methods extract a variety of high-level features from skeleton information and then combine them in some form into a descriptor, but a descriptor constructed in this combined way is incomplete, and there is always a loss of motio...

Claims


Application Information

IPC(8): G06K9/00
CPC: G06V40/20
Inventors: 杨剑宇 (Yang Jianyu), 朱晨 (Zhu Chen), 黄瑶 (Huang Yao)
Owner: SUZHOU UNIV