
Human body behavior identification method based on bone joint and surface feature fusion

A technology fusing surface features and skeletal joint features, applied in character and pattern recognition, instruments, computer parts, etc.; it addresses problems such as the difficulty of reliably recognizing human behavior.

Inactive Publication Date: 2016-11-23
BEIJING ROBOTLEO INTELLIGENT TECH
Cites: 6 · Cited by: 13
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

However, the information provided by color video is usually not accurate enough to identify human behavior in practice



Examples


Embodiment Construction

[0019] The present invention will be further described below in conjunction with the accompanying drawings.

[0020] A human behavior recognition method based on fusion of skeletal joint features and surface features, characterized in that it comprises the following steps:

[0021] Step 1: Obtain local deep joint features: use human bone joint features to construct a local change model of human activities;

[0022] Step 1.1: Obtaining a depth video sequence: using a Kinect sensor to obtain a video sequence with depth information;

[0023] Step 1.2: Capture skeletons and compute inter-joint distances: for a video sequence V_i with T frames, use the skeleton tracker to track each joint in every frame, and establish a 3D global coordinate system and a screen-based coordinate system with depth information;
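The excerpt does not state the distance formula, but a common reading of "calculate distances between joints" is the per-frame pairwise Euclidean distance between the Q tracked joints. A minimal NumPy sketch, assuming joints arrive as a (T, Q, 3) array of 3D coordinates (the array layout and function name are illustrative, not from the patent):

```python
import numpy as np

def pairwise_joint_distances(joints):
    """joints: (T, Q, 3) array of 3D joint positions over T frames.
    Returns a (T, Q*(Q-1)//2) array: per frame, the Euclidean distance
    for every unordered pair of joints."""
    T, Q, _ = joints.shape
    iu = np.triu_indices(Q, k=1)                      # upper-triangle joint pairs
    diffs = joints[:, iu[0], :] - joints[:, iu[1], :]  # vector between each pair
    return np.linalg.norm(diffs, axis=-1)

# Example: 4 frames of 20 Kinect joints -> 190 distances per frame
dists = pairwise_joint_distances(np.random.rand(4, 20, 3))
print(dists.shape)  # (4, 190)
```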

[0024] In order to unify the positions of the joints in different images, the coordinate data are normalized and standardized:

[0025] Assuming that the number of joints is Q...
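The normalization formula itself is truncated in this excerpt, so the following is only a plausible sketch of the kind of normalization described: centering each frame's Q joints on a root joint and scaling by a reference bone length, so coordinates are comparable across subjects and camera positions. The choice of root and reference joints is an assumption:

```python
import numpy as np

def normalize_joints(joints, root=0, ref=1):
    """joints: (T, Q, 3) array of 3D joint positions.
    Center each frame on the root joint, then scale by the distance
    from root to a reference joint (e.g. hip-to-spine), so the root
    sits at the origin and the reference bone has unit length."""
    centered = joints - joints[:, root:root + 1, :]
    scale = np.linalg.norm(centered[:, ref, :], axis=-1, keepdims=True)
    scale = np.where(scale == 0, 1.0, scale)   # guard degenerate frames
    return centered / scale[:, None, :]
```

After this step, the pairwise distances of Step 1.2 no longer depend on the subject's size or distance from the sensor.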



Abstract

The invention discloses a human body behavior identification method, and particularly relates to a novel method for identifying behaviors according to a fusion framework fusing local joint features and global surface features of bones. The method comprises the following steps: firstly, collecting the joint features and the surface features of a whole sequence, and respectively training a Support Vector Machine (SVM) model for the collected features; secondly, performing label class matching on each feature (joint or surface) of a certain behavior to be detected; finally, fusing the two matched features, and calculating a feature probability through the fusion framework to identify the human body behavior. According to the method, distinctive information of each behavior of a human body can be provided, and by the fusion of local depth information and global depth information of people, the method can be used for identifying challenging human body behaviors.
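The abstract describes training one SVM per feature stream (joint and surface), matching each stream's label output, and then computing a fused probability. The patent's exact fusion rule is not given in this excerpt; a common late-fusion baseline is a weighted average of the per-class probabilities each classifier emits, sketched here in NumPy (the weight `w` and the function name are assumptions, not from the patent):

```python
import numpy as np

def fuse_probabilities(p_joint, p_surface, w=0.5):
    """Late fusion of two classifiers' class-probability matrices,
    each of shape (N samples, K classes). Returns the predicted
    class index per sample from the weighted-average probability.
    The weight w is an illustrative assumption."""
    p = w * p_joint + (1.0 - w) * p_surface
    return np.argmax(p, axis=1)

# Joint-stream SVM favors class 0 for sample 0; surface stream
# favors class 1 for sample 1; fusion agrees with both.
p_j = np.array([[0.7, 0.3], [0.4, 0.6]])
p_s = np.array([[0.6, 0.4], [0.2, 0.8]])
print(fuse_probabilities(p_j, p_s))  # [0 1]
```

In practice each probability matrix could come from an SVM trained with probability outputs (e.g. scikit-learn's `SVC(probability=True)`), one per feature stream, as the abstract describes.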

Description

Technical field

[0001] The invention relates to the technical field of behavior recognition, in particular to a human body behavior recognition method based on a fusion framework of bone joint features and surface features.

Background technique

[0002] Behavior recognition has a long research history, and investigations have shown that the features of bones and surfaces are closely related to the representation of 2D shapes. Key issues include in-group variation, such as changes in people's poses, distortion, and self-occlusion, and out-of-group noise, such as different behaviors having similar appearances in practice. Early action recognition methods mainly dealt with color videos; in these methods, invariant keypoints are often used as local features to capture the behavior of objects. However, the information provided by color video is usually not accurate enough to recognize human behavior in practice.

[0003] Skeleton-based representations can be learned well i...

Claims


Application Information

Patent Timeline
No application data
IPC(8): G06K9/00
Inventor: 明安龙, 周瑜, 廖鸿宇, 孙放
Owner BEIJING ROBOTLEO INTELLIGENT TECH