Human upper body motion recognition method based on key frames and random forest regression

A technology combining random forests and action recognition, applied in character and pattern recognition, computer components, instruments, etc.; it addresses the problem of low recognition accuracy and achieves improved recognition performance.

Status: Inactive; Publication Date: 2018-12-28
Applicant: CHANGCHUN UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

[0022] In order to make up for the shortcomings of existing Kinect-based upper body motion recognition methods and to solve the problem that only simple motions can be recognized and that recognition accuracy is low, the purpose of the present invention is to provide a human upper body motion recognition method based on key frames and random forest regression. OptiTrack and its supporting software (hereinafter referred to as OptiTrack) and Kinect v2 are used to obtain the key joint coordinates of the upper body at the same time; random forest regression is used to learn the regression function of the feature-value difference between the key frames obtained from Kinect and the key frames obtained from OptiTrack. Two functions are finally realized: (1) input a frame obtained from Kinect, predict the feature-value difference with the random forest, and then correct the skeleton; (2) input a frame obtained from Kinect, and judge the action from the pose mark predicted by the random forest.
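
A minimal sketch of how the first function (skeleton correction from a learned feature-value difference) could look, using scikit-learn's RandomForestRegressor. The joint count, feature layout, and file names are illustrative assumptions, not details taken from the patent.

# Hypothetical sketch: learn the per-key-frame feature-value difference between
# Kinect and OptiTrack, then use the predicted difference to correct a new
# Kinect skeleton. Feature layout and file names are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

N_JOINTS = 10                       # assumed number of upper-body joints
FEAT_DIM = N_JOINTS * 3             # x, y, z per joint, flattened

# Training data: paired key frames captured at the same time instants.
# kinect_feats   : (n_keyframes, FEAT_DIM) features from Kinect v2
# optitrack_feats: (n_keyframes, FEAT_DIM) reference features from OptiTrack
kinect_feats = np.load("kinect_keyframes.npy")          # placeholder file names
optitrack_feats = np.load("optitrack_keyframes.npy")

# The regression target is the difference between the two feature vectors.
diff = optitrack_feats - kinect_feats

reg = RandomForestRegressor(n_estimators=100, random_state=0)
reg.fit(kinect_feats, diff)

def correct_skeleton(kinect_frame: np.ndarray) -> np.ndarray:
    """Predict the feature-value difference for one Kinect frame and
    add it back to obtain a corrected skeleton feature vector."""
    predicted_diff = reg.predict(kinect_frame.reshape(1, -1))[0]
    return kinect_frame + predicted_diff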


Examples


Embodiment Construction

[0040] The present invention will be further described below in conjunction with the accompanying drawings and embodiments. The accompanying drawings are diagrams of a schematic nature and do not limit the present invention in any way.

[0041] Embodiments of the present invention will be described in detail.

[0042] Step 1. Use OptiTrack and Kinect v2 to obtain the joint coordinates of the upper body of the human body. The 12 FLEX:V100R2 cameras of the OptiTrack full-body motion capture system are arranged according to the standard OptiTrack layout of 12 cameras and body markers; the positions of the marker points on the upper body are collected, the joint-point coordinates are calculated from them, and the coordinates are converted into the skeleton coordinate system of Kinect v2. The OptiTrack sampling frequency is set to 90 FPS, and Kinect v2 collects the joint coordinates of the upper body at the same time.
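
A minimal sketch of the data alignment implied by this step, assuming Kinect v2's native 30 FPS stream and a pre-calibrated rigid transform (R, t) from the OptiTrack world frame into the Kinect v2 skeleton coordinate system. The transform values, timestamps, and helper names are placeholders, not calibration results from the patent.

# Hypothetical alignment sketch: map OptiTrack joint coordinates into the
# Kinect v2 skeleton coordinate system and pair 90 FPS OptiTrack samples
# with the 30 FPS Kinect stream by nearest timestamp.
import numpy as np

R = np.eye(3)                 # assumed calibrated rotation, OptiTrack -> Kinect
t = np.zeros(3)               # assumed calibrated translation

def to_kinect_frame(optitrack_joints: np.ndarray) -> np.ndarray:
    """Transform (n_joints, 3) OptiTrack joint coordinates into the
    Kinect v2 skeleton coordinate system with the rigid transform (R, t)."""
    return optitrack_joints @ R.T + t

def pair_by_timestamp(opti_ts, kinect_ts):
    """For each Kinect timestamp, pick the index of the nearest OptiTrack
    sample, downsampling the 90 FPS OptiTrack data to the Kinect rate."""
    opti_ts = np.asarray(opti_ts)
    return [int(np.argmin(np.abs(opti_ts - ts))) for ts in kinect_ts]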

[0043] Step 2. Ext...


Abstract

The invention relates to a human upper body motion recognition method based on key frames and random forest regression. Key joint coordinates of the human upper body are obtained simultaneously with OptiTrack and its supporting software (hereinafter OptiTrack) and Kinect v2. A regression function for the feature-value difference between the key frames from Kinect and those from OptiTrack is learned by random forest regression, and two functions are realized: (1) input a frame obtained from Kinect, predict the feature-value difference with the random forest, and then correct the skeleton; (2) input a frame obtained from Kinect, and judge the action from the pose mark predicted by the random forest. The method makes up for the shortcomings of existing Kinect-based upper body motion recognition methods, and addresses the problem that only simple motions can be recognized and that the recognition accuracy is not high.
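
A minimal sketch of the second function (judging the action from a predicted pose mark), treating the pose mark as a class label with scikit-learn's RandomForestClassifier. This classifier choice, the label set, and the file names are assumptions made for illustration and are not taken from the patent.

# Hypothetical sketch: predict a pose mark for a (corrected) key-frame feature
# vector and map it to an action name. Labels and files are placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# keyframe_feats: (n_samples, FEAT_DIM) corrected key-frame feature vectors
# pose_marks    : (n_samples,) integer pose labels assigned during training
keyframe_feats = np.load("corrected_keyframes.npy")     # placeholder file names
pose_marks = np.load("pose_marks.npy")

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(keyframe_feats, pose_marks)

ACTION_NAMES = {0: "wave", 1: "raise arm", 2: "point"}  # assumed label set

def judge_action(frame_feats: np.ndarray) -> str:
    """Predict the pose mark for one frame and map it to an action name."""
    mark = int(clf.predict(frame_feats.reshape(1, -1))[0])
    return ACTION_NAMES.get(mark, "unknown")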

Description

Technical field

[0001] The invention relates to a human upper body action recognition method based on key frames and random forest regression, belonging to the technical field of computer pattern recognition.

Background technique

[0002] Although a series of studies on human motion recognition based on the Kinect human skeleton has been carried out in recent years, with the core technique of capturing human motion postures through Kinect and then performing motion recognition and analysis, there are still shortcomings such as low recognition accuracy, weak robustness, and poor scalability. Kinect skeleton recognition is not completely accurate when joint points of the skeleton are occluded. Compared with the lower body, the posture and movement of the upper body can express more information; however, there are few Kinect-based recognition methods for upper body motion, and most of them perform recognition only when no joints are occluded, and so...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K 9/00; G06K 9/62
CPC: G06V 40/20; G06F 18/285; G06F 18/23; G06F 18/24323
Inventor: 白宝兴, 李波, 韩成, 杨帆, 张超, 胡汉平, 权巍, 赵璘, 白烨
Owner: CHANGCHUN UNIV OF SCI & TECH