
Online continuous human behavior identification method based on Kinect

A human behavior recognition method, applied in character and pattern recognition, instruments, computer components, etc., achieving high execution efficiency

Status: Inactive; Publication Date: 2017-05-10
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

Although several human behavior recognition algorithms exist for continuous human behavior sequences, these methods still have to first detect the start and end time points of each behavior to segment the sequence into individual behaviors, and only then perform recognition, so they cannot run online in real time.


Embodiment Construction

[0035] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0036] Referring to Figures 1-5: the flow of the Kinect-based online continuous human behavior recognition method proposed by the present invention is shown in Figure 1. Feature extraction, online segmentation, key pose and atomic action extraction, online pattern matching, and classification based on the variable-length maximum entropy Markov model are carried out according to the following steps:

[0037] a) Feat...
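
The remaining steps of this passage are truncated in the record. As a rough illustration of step a), the following Python sketch computes a per-frame normalized relative orientation feature from a Kinect skeleton; the choice of reference joint, the unit-vector normalization, and the joint array layout are assumptions made for illustration rather than details taken from the patent.

# Hypothetical sketch of step a): per-frame normalized relative orientation
# features from a Kinect skeleton. The reference joint (assumed to be the
# hip/torso center) and the unit-vector normalization are assumptions.
import numpy as np

def relative_orientation_features(joints, ref_idx=0):
    """joints: (J, 3) array of joint positions for one frame.
    Returns a ((J-1)*3,) vector of unit direction vectors from the reference
    joint to every other joint, i.e. orientation only, scale removed."""
    ref = joints[ref_idx]
    feats = []
    for j, p in enumerate(joints):
        if j == ref_idx:
            continue
        v = p - ref
        n = np.linalg.norm(v)
        feats.append(v / n if n > 1e-8 else np.zeros(3))
    return np.concatenate(feats)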



Abstract

The invention discloses a Kinect-based online continuous human behavior recognition method, comprising the following steps: (a) extracting human skeleton information from the RGB-D images collected by Kinect, and calculating the normalized relative orientation feature of each joint; (b) dynamically segmenting the feature sequence online, using a segmentation method based on the potential difference of the feature sequence, to obtain pose feature segments and action feature segments; (c) extracting key poses and atomic actions from the pose feature segments and action feature segments obtained by segmentation, respectively; (d) performing online pattern matching between the segmented feature segments and the key poses or atomic actions obtained by offline training, and calculating the likelihood probability that a feature segment is recognized as a key pose or atomic action of a given behavior class; and (e) recognizing human behaviors with a variable-length maximum entropy Markov model based on the calculated likelihood probabilities. Compared with known algorithms, the method does not need to detect the start and end time points of each human behavior in advance, and recognition can be performed online and in real time.
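
To make the flow of steps (b) through (e) concrete, the following Python sketch outlines the online loop, assuming the potential difference is approximated by the frame-to-frame feature change, the pattern-matching score by a simple distance-based likelihood, and the variable-length maximum entropy Markov model by an object exposing an update() method; these are placeholders for illustration, not the patented formulations.

# Minimal sketch of the online pipeline, steps (b)-(e) of the abstract.
# potential_difference(), match() and the memm object are placeholder
# assumptions; only the overall control flow follows the text.
import numpy as np

def potential_difference(f_prev, f_curr):
    # Assumed proxy: magnitude of the frame-to-frame feature change.
    return np.linalg.norm(f_curr - f_prev)

def match(segment, template):
    # Placeholder similarity between a feature segment and a key pose /
    # atomic action template learned offline.
    seg_mean = np.mean(segment, axis=0)
    return float(np.exp(-np.linalg.norm(seg_mean - template)))

def online_recognize(feature_stream, templates, memm, split_threshold=0.5):
    """feature_stream: iterable of per-frame feature vectors (step a).
    templates: dict mapping behavior class -> key pose / atomic action vector.
    memm: assumed variable-length maximum entropy Markov model with an
    update(likelihoods) method returning the current behavior estimate."""
    segment, prev = [], None
    for feat in feature_stream:
        if prev is not None and potential_difference(prev, feat) > split_threshold:
            # (b) a potential-difference jump closes the current segment;
            # (d) match it against the offline-trained templates.
            likelihoods = {cls: match(segment, tpl) for cls, tpl in templates.items()}
            # (e) feed the likelihoods to the Markov model for classification.
            yield memm.update(likelihoods)
            segment = []
        segment.append(feat)
        prev = feat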

Description

Technical field

[0001] The invention relates to the technical field of intelligent human-computer interaction and intelligent robots, and in particular to a Kinect-based online continuous human behavior recognition method.

Background technique

[0002] Human beings will enter an aging society in the 21st century, and the development of service robots can make up for the serious shortage of young labor and help solve social problems such as family services and medical services in an aging society. The International Federation of Robotics gives a preliminary definition of service robots: a service robot is a semi-autonomous or fully autonomous robot that can perform services beneficial to humans, excluding equipment engaged in production. If a service robot can interact with humans in an intelligent and friendly way, and can carry out some household service work according to people's behaviors in daily life, then the application of service robots in the household s...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06K9/00G06K9/46G06K9/62
CPCG06V40/103G06V10/457G06F18/23
Inventor 朱光明张亮宋娟沈沛意张淑娥刘欢
Owner XIDIAN UNIV