
A complex behavior recognition method

A recognition method and behavior technology, applied in the field of image recognition, which can solve problems such as incomplete descriptors, deviation in global features, and loss of motion information, and can achieve the effects of improving action recognition efficiency, increasing accuracy, and reducing computational difficulty.

Active Publication Date: 2021-09-28
SUZHOU UNIV

AI Technical Summary

Problems solved by technology

[0003] Many methods extract a variety of high-level features from skeleton information and then combine them in some form into a descriptor, but a descriptor constructed in this combinational way is not complete, and there is always a loss of motion information. On the other hand, many methods train each action class separately, which biases the global features of human actions in the description of each individual class. At the same time, methods that use high-level features still suffer from high computational cost.


Examples


Embodiment 1

[0094] Embodiment 1: as shown in Figure 7, a complex behavior recognition method includes the following steps:

[0095] (1) Use a depth sensor to obtain the three-dimensional skeletal joint point information of the target motion, yielding the three-dimensional coordinates of each joint of the human body;

[0096] (2) Preprocess the skeletal joint point information and normalize the coordinate system. As shown in Figure 1, the horizontal axis is the vector from the left shoulder to the right shoulder, and the vertical axis is the vector from the hip bone to the midpoint of the shoulders; the X-Y-Z coordinate system is converted into the X'-Y'-Z' coordinate system (see the normalization sketch after these steps);

[0097] (3) Connect the 3D coordinates of each skeletal joint point in the action sequence in chronological order to obtain the 3D trajectories of all skeletal joint points (see the trajectory sketch after these steps);

[0098] This embodiment adopts a 60-frame action sequence S (swinging with both hands) with 20 skeletal joint points...
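
As referenced in step (2), below is a minimal sketch of the coordinate normalization, assuming each frame's joints are stored in an (n_joints, 3) NumPy array; the index constants LEFT_SHOULDER, RIGHT_SHOULDER, and HIP_CENTER are hypothetical and depend on the depth sensor's skeleton model.

```python
import numpy as np

# Hypothetical joint indices; the actual mapping depends on the depth
# sensor's skeleton model (e.g. a 20-joint Kinect-style skeleton).
LEFT_SHOULDER, RIGHT_SHOULDER, HIP_CENTER = 4, 8, 0

def normalize_frame(joints):
    """Rotate one frame of (n_joints, 3) coordinates into the body-centered
    X'-Y'-Z' system: X' points from the left shoulder to the right shoulder,
    Y' from the hip center toward the midpoint of the shoulders."""
    joints = np.asarray(joints, dtype=float)

    x_axis = joints[RIGHT_SHOULDER] - joints[LEFT_SHOULDER]
    x_axis /= np.linalg.norm(x_axis)

    shoulder_mid = (joints[LEFT_SHOULDER] + joints[RIGHT_SHOULDER]) / 2.0
    y_axis = shoulder_mid - joints[HIP_CENTER]
    y_axis -= np.dot(y_axis, x_axis) * x_axis   # Gram-Schmidt: force Y' orthogonal to X'
    y_axis /= np.linalg.norm(y_axis)

    z_axis = np.cross(x_axis, y_axis)           # right-handed Z'

    rotation = np.stack([x_axis, y_axis, z_axis])   # rows are the new axes
    centered = joints - joints[HIP_CENTER]          # hip center as the origin
    return centered @ rotation.T                    # coordinates in X'-Y'-Z'
```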
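And for step (3), a sketch of how per-frame coordinates become per-joint 3D trajectories and how each trajectory projects onto the three coordinate planes; the 60-frame, 20-joint shape mirrors the example sequence S, and normalize_frame is the hypothetical helper from the sketch above.

```python
import numpy as np

def joint_trajectories(frames):
    """Connect each joint's 3D coordinates in chronological order.
    frames: (n_frames, n_joints, 3) -> (n_joints, n_frames, 3),
    i.e. one 3D trajectory per joint."""
    normalized = np.stack([normalize_frame(f) for f in frames])
    return np.transpose(normalized, (1, 0, 2))

def project_to_planes(trajectory):
    """Project one (n_frames, 3) trajectory onto the X'-Y', X'-Z',
    and Y'-Z' planes, giving three 2D trajectories."""
    return trajectory[:, [0, 1]], trajectory[:, [0, 2]], trajectory[:, [1, 2]]

# Example shape matching this embodiment: 60 frames, 20 joints.
frames = np.random.rand(60, 20, 3)          # stand-in for real sensor data
trajectories = joint_trajectories(frames)   # (20, 60, 3)
xy, xz, yz = project_to_planes(trajectories[0])
```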



Abstract

The invention discloses a complex behavior recognition method, which includes: using a sensor to obtain three-dimensional skeletal joint point information of the target motion; preprocessing the joint point information and normalizing the coordinate system; extracting the motion trajectory of each joint point and projecting it onto three two-dimensional planes; extracting the motion vector between every two frames, together with its length and direction angle; clustering the motion primitives with the k-means algorithm and obtaining histograms by statistics; incorporating temporal information with a temporal pyramid; using the cluster values of all histograms to calculate the weight of each joint point, forming a descriptor; and classifying with an SVM to realize action recognition. The present invention can extract and effectively represent the features of action skeletal joint point information, improving the accuracy of action recognition; all motion information is completely preserved, so action reconstruction is possible; all action classes are clustered together, capturing human action features from a global perspective; and the use of low-level features reduces computational difficulty, improves the efficiency of action recognition, and meets the real-time requirements of the system.
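
To tie the abstract's steps together, here is an illustrative end-to-end sketch, not the patented implementation: the number of primitives, the pyramid depth, the RBF kernel, and the placeholder arrays all_training_features, train_descriptors, train_labels, and test_descriptors are all assumptions, and the per-joint weights (which the method derives from the histograms' cluster values) are taken here as a precomputed input joint_weights, since the text above does not spell out their formula.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def motion_features(traj_2d):
    """Length and direction angle of the motion vector between every
    two consecutive frames of one 2D-projected trajectory."""
    vectors = np.diff(traj_2d, axis=0)                 # (n_frames - 1, 2)
    lengths = np.linalg.norm(vectors, axis=1)
    angles = np.arctan2(vectors[:, 1], vectors[:, 0])
    return np.column_stack([lengths, angles])

def pyramid_histogram(labels, n_primitives, levels=2):
    """Normalized histograms of motion-primitive labels over a temporal
    pyramid: the whole sequence at level 0, halves at level 1, and so on."""
    parts = []
    for level in range(levels):
        for segment in np.array_split(labels, 2 ** level):
            hist = np.bincount(segment, minlength=n_primitives)
            parts.append(hist / max(len(segment), 1))
    return np.concatenate(parts)

# Cluster motion features pooled from ALL action classes together, so the
# primitives capture human action features from a global perspective.
# all_training_features is a placeholder array assumed prepared elsewhere.
n_primitives = 64                                      # assumed cluster count
kmeans = KMeans(n_clusters=n_primitives, n_init=10).fit(all_training_features)

def sequence_descriptor(projected_trajs, joint_weights):
    """One descriptor per sequence: weighted pyramid histograms over each
    joint's three 2D projections, concatenated."""
    parts = []
    for weight, planes in zip(joint_weights, projected_trajs):
        for traj_2d in planes:                         # X'-Y', X'-Z', Y'-Z'
            labels = kmeans.predict(motion_features(traj_2d))
            parts.append(weight * pyramid_histogram(labels, n_primitives))
    return np.concatenate(parts)

# Classify descriptors with an SVM to realize action recognition.
# train_descriptors, train_labels, test_descriptors are placeholders.
clf = SVC(kernel="rbf").fit(train_descriptors, train_labels)
predictions = clf.predict(test_descriptors)
```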

Description

Technical field

[0001] The invention relates to a complex behavior recognition method, belonging to the technical field of image recognition.

Background technique

[0002] Action recognition is a research hotspot in the field of machine vision. Action recognition methods have been widely applied in human-computer interaction, virtual reality, video retrieval, and security monitoring. With the development of depth cameras, information on human skeletal joint points can be obtained directly, and action recognition methods based on skeletal features have greatly improved recognition accuracy. Despite many related studies and exciting results, effectively describing human motion remains a challenging task.

[0003] Many methods extract a variety of high-level features from skeleton information and then combine them in some form into a descriptor, but a descriptor constructed in this combinational way is not complete, and there is always a loss of motion information. On the other hand, many methods train each action class separately, which biases the global features of human actions in the description of each individual class. At the same time, methods that use high-level features still suffer from high computational cost.


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/00
CPC: G06V40/20
Inventors: 杨剑宇, 朱晨, 黄瑶 (Yang Jianyu, Zhu Chen, Huang Yao)
Owner: SUZHOU UNIV