
Method for identifying human body behavior based on part clustering feature

A recognition method using clustering technology, applied in the field of computer vision and pattern recognition, which addresses shortcomings of prior methods such as inaccurate joint positions, failure to account for the independence of joint motion, and neglect of time-series information.

Active Publication Date: 2017-06-30
BEIJING UNIV OF TECH

AI Technical Summary

Problems solved by technology

However, this method does not take the time series into account, so the histogram of recorded joint information loses the continuity of the sequence; moreover, it does not consider the motion independence of each joint during the codebook-formation stage of action recognition.
[0007] In addition, Microsoft's Kinect camera can obtain not only a depth map of the human body but also the positions of 16 of its joint points, and most research on human motion recognition is based on this joint-point information. However, when Kinect first captures a human body, it spends roughly the first 20 frames locating the body in the image and cannot yet provide joint positions. Moreover, when the body transitions between postures, for example from standing upright to kicking, the joint positions reported by Kinect can be offset considerably and are not accurate enough, as shown in figure 1.



Examples


Embodiment Construction

[0057] The embodiment of the present invention provides a human body behavior recognition method based on part clustering features. To avoid relying on inaccurate joint-point positions, the cluster centers of human sub-parts are used as the feature points representing the body pose; to exploit the global character of the action sequence, the invention adds a global position offset to the sequence feature vector, compensating for the shortcoming of using only local position offsets for recognition. On this basis, the key problems to be solved are: extraction of body-pose features; computation of action-sequence feature vectors; and action classification.
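The two ideas in this paragraph can be sketched in code. The snippet below is an illustrative re-implementation, not the patent's actual algorithm: the function names, the naive k-means used to find sub-part centers, and the choice of the first frame as the global reference are all assumptions.

```python
import numpy as np

def pose_features(part_points, n_parts, iters=10, seed=0):
    """Illustrative sketch: cluster 3D points sampled from the body's depth
    silhouette into sub-parts; each cluster center serves as one pose
    feature point (avoiding dependence on Kinect joint estimates).
    part_points: (N, 3) array of 3D points."""
    rng = np.random.default_rng(seed)
    centers = part_points[rng.choice(len(part_points), n_parts, replace=False)]
    centers = centers.astype(float)
    for _ in range(iters):  # naive k-means, for illustration only
        dists = np.linalg.norm(part_points[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)
        for k in range(n_parts):
            pts_k = part_points[labels == k]
            if len(pts_k):  # keep the old center if a cluster goes empty
                centers[k] = pts_k.mean(axis=0)
    return centers  # (n_parts, 3) pose feature points

def offsets(centers_seq):
    """Local offset: displacement of each feature point between consecutive
    frames. Global offset (assumed here to be measured against the first
    frame): displacement relative to the start of the sequence."""
    centers_seq = np.asarray(centers_seq)            # (T, n_parts, 3)
    local_off = centers_seq[1:] - centers_seq[:-1]   # (T-1, n_parts, 3)
    global_off = centers_seq[1:] - centers_seq[:1]   # (T-1, n_parts, 3)
    return local_off, global_off
```

Combining both offset kinds in the sequence feature vector gives each frame a descriptor that captures instantaneous motion as well as cumulative displacement within the action.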

[0058] The present invention takes the depth-image sequence of a moving human body as input and computes the action category as output; the core step of the computation is to use the offset...



Abstract

The invention discloses a method for recognizing human behavior based on part clustering features. The method comprises two stages. Step 1, training: the part clustering feature points of each frame of a training video are extracted by pose estimation, and the local and global position offsets of each feature point in each frame are computed; the offset information from all training videos is pooled and clustered with the K-means algorithm to obtain cluster centers, which form a codebook; each training video is then represented, via the codebook, as a histogram over its feature-point offsets. Step 2, testing: a histogram is built for the test video using the codebook obtained in training, and the difference between the test histogram and the training histograms is measured with a Naive Bayes nearest-neighbor classifier to recognize the behavior. The method achieves a high recognition rate.
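The training/testing pipeline in the abstract can be sketched as below. This is a minimal stand-in, not the patented implementation: the farthest-point codebook initialization, the plain Euclidean nearest-neighbor comparison (substituting for the Naive-Bayes nearest-neighbor step), and all function names are assumptions for illustration.

```python
import numpy as np

def build_codebook(offset_vectors, k, iters=20):
    """K-means codebook over feature-point offset vectors pooled from all
    training videos. Uses farthest-point initialization (an assumption)
    so the sketch is deterministic."""
    codes = [offset_vectors[0].astype(float)]
    for _ in range(1, k):
        d = np.min([np.linalg.norm(offset_vectors - c, axis=1)
                    for c in codes], axis=0)
        codes.append(offset_vectors[d.argmax()].astype(float))
    codes = np.array(codes)
    for _ in range(iters):
        d = np.linalg.norm(offset_vectors[:, None] - codes[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            sel = offset_vectors[labels == j]
            if len(sel):
                codes[j] = sel.mean(axis=0)
    return codes

def histogram(offset_vectors, codes):
    """Represent one video: quantize its offset vectors against the
    codebook and normalize the resulting count histogram."""
    d = np.linalg.norm(offset_vectors[:, None] - codes[None], axis=2)
    h = np.bincount(d.argmin(axis=1), minlength=len(codes)).astype(float)
    return h / h.sum()

def classify(test_hist, train_hists, train_labels):
    """Nearest neighbor over training histograms; the patent uses a
    Naive-Bayes nearest-neighbor comparison, simplified here to L2."""
    dists = [np.linalg.norm(test_hist - h) for h in train_hists]
    return train_labels[int(np.argmin(dists))]
```

A test video is thus reduced to a single normalized histogram over the shared codebook, and recognition amounts to finding the training histogram it differs from least.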

Description

Technical Field

[0001] The invention belongs to the fields of computer vision and pattern recognition, and in particular relates to a human body behavior recognition method based on part clustering features.

Background

[0002] In recent years, human behavior recognition has received increasing attention. Understanding what people are doing, and even inferring their intentions by analyzing how they interact with objects, is crucial for intelligent systems, with wide practical application in intelligent video surveillance, motion retrieval, human-computer interaction, healthcare, and many other fields. For example, to build a human-computer interaction system that serves humans intelligently, the system must not only perceive the motion of the human body but also understand the semantics of human actions and infer their intentions.

[0003] At present, the traditional a...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/62
CPC: G06V40/23; G06F18/23213; G06F18/24155
Inventors: 孔德慧, 贾文浩, 孙彬, 王少帆
Owner: BEIJING UNIV OF TECH