
Action Recognition Method Based on Segmented Body Model Applied in Human-Machine Collaboration

A human body model and action recognition technology, applied in the field of human-computer interaction, addressing problems such as the high complexity of human actions, the large amount of information they carry, and differences between users performing the same action

Active Publication Date: 2021-06-15
NORTHWESTERN POLYTECHNICAL UNIV

AI Technical Summary

Problems solved by technology

[0004] However, human actions are highly complex and carry a large amount of information: different users may perform the same action differently, different actions may share repetitive gestures, and sequences may contain meaningless movements. This makes human action recognition difficult; it is a cross-disciplinary topic involving computer vision, pattern recognition, artificial intelligence and other fields.



Examples


Specific embodiments

[0089] Step 1: Map the joint point data of both the action sequence to be recognized and the action template to three dimensions, store them as point clouds, and perform preprocessing, including translation, scaling, and rotation.
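As an illustration of this preprocessing step, the following is a minimal Python sketch assuming the joint data is held as a (frames, 25, 3) NumPy array. The joint indices and the particular normalization choices (spine-base centering, shoulder-width scaling, shoulder-line rotation) are illustrative assumptions, not the patent's exact procedure.

```python
# Minimal preprocessing sketch: translation, scaling and rotation of a skeleton
# sequence. Joint indices and normalization choices are illustrative assumptions,
# not the exact procedure claimed in the patent.
import numpy as np

SPINE_BASE, SHOULDER_LEFT, SHOULDER_RIGHT = 0, 4, 8  # assumed Kinect V2 indices

def preprocess(seq: np.ndarray) -> np.ndarray:
    """seq: (frames, 25, 3) array of joint coordinates in meters."""
    out = seq.copy()
    # Translation: move the spine base of every frame to the origin.
    out -= out[:, SPINE_BASE:SPINE_BASE + 1, :]
    # Scaling: normalize by the shoulder width measured in the first frame.
    width = np.linalg.norm(out[0, SHOULDER_RIGHT] - out[0, SHOULDER_LEFT])
    out /= max(width, 1e-6)
    # Rotation: rotate about the vertical (y) axis so the shoulder line of the
    # first frame is parallel to the x axis.
    d = out[0, SHOULDER_RIGHT] - out[0, SHOULDER_LEFT]
    angle = np.arctan2(d[2], d[0])
    c, s = np.cos(-angle), np.sin(-angle)
    rot_y = np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])
    return out @ rot_y.T
```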

[0090] The Kinect V2 sensor tracks skeleton data at approximately 30 frames per second, so the frame can be used as the time unit for skeleton nodes.

[0091] The joint data extracted by the Kinect V2 sensor contains 30 frames per second, and each frame contains the coordinate information of 25 joint points. In order to store, transmit, and read the joint information in an action sequence conveniently and quickly, the invention innovatively uses point clouds (PCD file format) to store action sequences.

[0092] A point cloud is a collection of a large number of points. It is a data storage structure that has emerged in recent years in applications such as three-dimensional reconstruction, and it has many file formats. The present invention uses the PCD format three-dimensional ordered points defi...
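As an illustration of the storage step, the following Python sketch writes a preprocessed skeleton sequence as an ordered ASCII PCD file. Treating each frame as one row of the ordered cloud (WIDTH = 25 joints, HEIGHT = number of frames) is an assumption about the layout, not a detail confirmed by the excerpt.

```python
# Hedged sketch of storing a skeleton sequence as an ordered ASCII PCD file:
# each row of the ordered cloud holds the 25 joints of one frame, so HEIGHT is
# the frame count. This layout is an assumption about the patent's PCD usage.
import numpy as np

def save_sequence_as_pcd(seq: np.ndarray, path: str) -> None:
    """seq: (frames, 25, 3) preprocessed joint coordinates."""
    frames, joints, _ = seq.shape
    points = seq.reshape(-1, 3)
    header = "\n".join([
        "# .PCD v0.7 - Point Cloud Data file format",
        "VERSION 0.7",
        "FIELDS x y z",
        "SIZE 4 4 4",
        "TYPE F F F",
        "COUNT 1 1 1",
        f"WIDTH {joints}",            # 25 joints per frame
        f"HEIGHT {frames}",           # one row per frame -> ordered cloud
        "VIEWPOINT 0 0 0 1 0 0 0",
        f"POINTS {joints * frames}",
        "DATA ascii",
    ])
    with open(path, "w") as f:
        f.write(header + "\n")
        np.savetxt(f, points, fmt="%.6f")
```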



Abstract

The invention relates to an action recognition method, based on a segmented human body model, applied in human-machine collaboration. A Microsoft Kinect V2 sensor is used to collect human skeleton information and obtain joint node coordinates. After preprocessing, the skeleton node sequence is mapped from four dimensions to three dimensions and stored as a point cloud sequence. The human body model is divided into three parts: upper limbs, lower limbs, and torso. For each part, feature vectors and Boolean feature matrices of the relative positions of joint points are extracted; the Boolean feature matrices are used to extract key frames, and the feature vectors are matched against templates with the dynamic time warping (DTW) algorithm. Finally, the recognition results of the three parts are combined to obtain the classification of the overall movement of the human body. The invention not only recognizes the overall movement of the human body, but also obtains movement descriptions of the upper limbs, torso, and lower limbs, identifying human movements and behaviors in more detail and more accurately, so as to help robots in human-machine collaboration perform subsequent task planning.
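To illustrate the matching step described above, the following Python sketch shows dynamic time warping over per-frame feature vectors and nearest-template classification for one body part. The feature extraction and the template dictionary are assumed inputs; this is a sketch of the general technique, not the patented pipeline itself.

```python
# Minimal DTW template-matching sketch for one body part. Feature extraction
# (e.g. relative joint-position vectors) and the template set are assumed inputs.
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """a: (n, d), b: (m, d) sequences of per-frame feature vectors."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return float(cost[n, m])

def classify_part(query: np.ndarray, templates: dict) -> str:
    """Return the label of the template with the smallest DTW distance."""
    return min(templates, key=lambda label: dtw_distance(query, templates[label]))

# Usage idea: run classify_part once each for the upper limbs, torso and lower
# limbs, then combine the three labels into an overall action description.
```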

Description

Technical field

[0001] The invention belongs to the field of human-computer interaction and relates to an action recognition method based on a segmented human body model applied in human-machine collaboration.

Background technique

[0002] With the development of robot technology, the application scenarios of robots are becoming ever wider, with more and more intersections and integrations with other fields. At the same time, many scenarios require collaborative operation between humans and robots. The collaborative operation of robots and humans can not only relieve human labor, but also help humans avoid high-risk operational tasks. Human-machine collaboration is one of the future development directions of intelligent robots.

[0003] Human-machine collaboration emphasizes the leading role of humans. It should enable robots to understand human intentions as accurately as possible under the premise of ensuring saf...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06K9/00, G06K9/34, G06K9/62
CPC: G06V40/23, G06V20/46, G06V10/267, G06V10/751
Inventors: 黄攀峰, 张博文, 刘正雄, 董刚奇, 孟中杰, 张夷斋, 张帆
Owner: NORTHWESTERN POLYTECHNICAL UNIV