
Space-time bone characteristic and depth belief network-based human body behavior identification method

A human behavior recognition method based on a deep belief network, applied in the field of human behavior recognition. It addresses the problems that color-video-based recognition is easily affected by changing lighting conditions and complex backgrounds, and that depth-image-based feature extraction is complex and computationally expensive.

Inactive Publication Date: 2018-09-14
NORTHEAST DIANLI UNIVERSITY

Problems solved by technology

Human action recognition based on color video streams is easily affected by changes in lighting conditions and complex backgrounds. Feature extraction for human behavior recognition based on depth images is complex and computationally expensive.


Examples


Embodiment 1

[0062] This embodiment provides a human behavior recognition method based on spatio-temporal skeletal features and a deep belief network. The skeletal points used in this embodiment are: the head skeletal point Head, the shoulder-center skeletal point Shoulder Center, the hip-center skeletal point Hip Center, the left shoulder skeletal point Left Shoulder, the left elbow skeletal point Left Elbow, the left hand skeletal point Left Hand, the right shoulder skeletal point Right Shoulder, the right elbow skeletal point Right Elbow, the right hand skeletal point Right Hand, the left hip skeletal point Left Hip, the left knee skeletal point Left Knee, the left foot skeletal point Left Foot, the right hip skeletal point Right Hip, the right knee skeletal point Right Knee, and the right foot skeletal point Right Foot.
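The 15 skeletal points listed above can be collected into an indexed table so that later feature code can address joints by name. This is a minimal sketch with assumed naming and ordering (the patent does not specify an index order); only the joint names come from the embodiment.

```python
# Assumed joint ordering for the 15 skeletal points named in the embodiment.
SKELETAL_POINTS = [
    "Head", "Shoulder Center", "Hip Center",
    "Left Shoulder", "Left Elbow", "Left Hand",
    "Right Shoulder", "Right Elbow", "Right Hand",
    "Left Hip", "Left Knee", "Left Foot",
    "Right Hip", "Right Knee", "Right Foot",
]

# Map each joint name to its index in a per-frame coordinate array.
JOINT_INDEX = {name: i for i, name in enumerate(SKELETAL_POINTS)}

print(len(SKELETAL_POINTS))  # 15 joints per frame
```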

[0063] The specific steps of the method are as follows:

[0064] Step 1: Get the skeleton sequence;

[0065] For each action, use a depth camera (the Kinect v2 SDK can be used), take the camera coordinates as the origin, and obtain the name and ...
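A skeleton sequence captured this way can be held as a T x 15 x 3 array of camera-space coordinates (T frames, 15 joints, x/y/z with the camera as origin). The sketch below is a hypothetical container only; random numbers stand in for real Kinect data.

```python
import numpy as np

# Stand-in for one action's skeleton sequence: T frames, 15 joints,
# 3D coordinates with the depth camera at the origin (as in Step 1).
rng = np.random.default_rng(0)
T, J = 40, 15
sequence = rng.normal(size=(T, J, 3))

print(sequence.shape)
```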


Abstract

The invention belongs to the field of computer vision and specifically relates to a human body behavior identification method based on space-time bone characteristics and a deep belief network. In the method, a bone sequence is obtained via a depth camera for each motion. Bone-point temporal features, comprising the displacement, velocity, and acceleration of each bone point, are extracted from the bone sequence. A bone reference point is chosen, and the relative distance between each bone point and the reference point is calculated, yielding the spatial features of the motion's bone sequence. The temporal and spatial features are then clustered so that the feature length is fixed, a global motion descriptor is constructed and reduced in dimensionality, and motion classification is completed using cross-validation and a classifier.
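The per-joint features named in the abstract can be sketched as follows. The exact definitions are assumptions (the abstract does not give formulas): displacement is taken relative to the first frame, velocity and acceleration as first and second temporal differences, and the spatial feature as each joint's Euclidean distance to a reference joint (here the Hip Center, at an assumed index of 2).

```python
import numpy as np

def skeletal_features(seq, ref_joint=2):
    """Sketch of the temporal and spatial features the abstract names.

    seq: (T, J, 3) array of 3D joint coordinates over T frames.
    Returns displacement, velocity, acceleration, and the relative
    distance of every joint to the reference joint (assumed: Hip Center).
    """
    displacement = seq - seq[0]               # (T, J, 3), offset from frame 0
    velocity = np.diff(seq, axis=0)           # (T-1, J, 3), first difference
    acceleration = np.diff(velocity, axis=0)  # (T-2, J, 3), second difference
    # Euclidean distance of each joint to the reference joint, per frame.
    rel_dist = np.linalg.norm(seq - seq[:, ref_joint:ref_joint + 1, :], axis=-1)  # (T, J)
    return displacement, velocity, acceleration, rel_dist

# Toy sequence: 40 frames, 15 joints (random stand-in for captured data).
seq = np.random.default_rng(1).normal(size=(40, 15, 3))
d, v, a, r = skeletal_features(seq)
print(d.shape, v.shape, a.shape, r.shape)
```

The clustering step that fixes the feature length (and the subsequent global descriptor, dimensionality reduction, and classification) would operate on these per-frame features; the patent does not specify the clustering algorithm, so it is omitted here.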

Description

Technical field

[0001] The invention belongs to the field of computer vision, and in particular relates to a human behavior recognition method based on spatio-temporal skeleton features and a deep belief network. The method can recognize human behavior from the characteristics of human skeleton points.

Background technique

[0002] Human behavior recognition is one of the hotspots of computer vision and artificial intelligence research, and has a wide range of applications in important fields such as intelligent monitoring and human-computer interaction. With the popularity of depth cameras, human behavior recognition algorithms based on 3D skeleton sequences have quickly become one of the key research directions in this field due to their simplicity and efficiency.

[0003] At this stage, research on human behavior recognition is mainly divided into three directions: 1. Human behavior recognition based on color video streams; 2. Human beh...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06K9/62
CPC: G06V40/23, G06F18/213, G06F18/23, G06F18/24
Inventors: 孟勃, 王晓霖, 刘雪君
Owner: NORTHEAST DIANLI UNIVERSITY