
Motion recognition method based on segmented mannequin model applied in human-machine cooperation

A human body model and action recognition technology applied in the field of human-computer interaction. It can solve problems such as inter-user differences in the same action, the large amount of information in human motion, and the difficulty of human action recognition.

Active Publication Date: 2018-12-25
NORTHWESTERN POLYTECHNICAL UNIV
Cites: 5 · Cited by: 22

AI Technical Summary

Problems solved by technology

[0004] However, human actions are highly complex and carry a large amount of information; different users may perform the same action differently, different actions may share repeated gestures, and sequences may contain meaningless movements. These characteristics make human action recognition difficult, and the problem is a cross-disciplinary topic involving computer vision, pattern recognition, artificial intelligence and other fields.



Examples


Specific embodiments

[0089] Step 1: Map the joint point data of the action sequence to be recognized and of the action template into three dimensions, store them as point clouds, and preprocess them, including translation, scaling, and rotation;
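The excerpt does not give explicit formulas for this preprocessing, so the following Python sketch only illustrates one plausible reading; the reference joints (hip as the translation origin, shoulder width as the scale, the shoulder line for the rotation) are assumptions for illustration, not the invention's stated choices.

    import numpy as np

    def preprocess_frame(joints, hip=0, l_shoulder=4, r_shoulder=8):
        # Normalize one frame of 25 (x, y, z) joints by translation, scaling and rotation.
        # The reference joint indices and the scale factor are illustrative assumptions.
        pts = np.asarray(joints, dtype=float)       # shape (25, 3)
        pts = pts - pts[hip]                        # translation: move the hip joint to the origin
        width = np.linalg.norm(pts[l_shoulder] - pts[r_shoulder])
        if width > 0:
            pts = pts / width                       # scaling: normalize by shoulder width
        v = pts[r_shoulder] - pts[l_shoulder]
        theta = np.arctan2(v[2], v[0])              # shoulder direction in the x-z plane
        c, s = np.cos(theta), np.sin(theta)
        rot_y = np.array([[c, 0.0, s],
                          [0.0, 1.0, 0.0],
                          [-s, 0.0, c]])            # rotation about the vertical (y) axis
        return pts @ rot_y.T                        # rotation: align the shoulder line with the x axis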

[0090] The Kinect V2 sensor tracks skeletal data at roughly 30 frames per second, so the frame can be used as the time unit for the skeletal nodes.

[0091] The joint data extracted by the Kinect V2 sensor contains 30 frames of data per second, and each frame contains the coordinate information of 25 joint points. In order to store, transmit and read the joint information of an action sequence conveniently and quickly, the invention innovatively uses point clouds (the PCD file format) to store action sequences.

[0092] A point cloud is a collection of a large number of points. It is a data storage structure that has emerged in recent years in applications such as three-dimensional reconstruction, and it has many file formats. The present invention uses PCD-format three-dimensional ordered points defi...
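As a rough illustration of storing a skeletal sequence in the PCD format, the sketch below writes an ordered ASCII cloud with 25 joints per row and one row per frame. The exact field layout and point ordering used by the invention are not shown in the excerpt, so this arrangement is only an assumption.

    def write_sequence_as_pcd(frames, path):
        # frames: list of frames, each a list of 25 (x, y, z) joint tuples.
        # Ordered cloud: WIDTH = joints per frame, HEIGHT = number of frames (assumed layout).
        n_joints, n_frames = 25, len(frames)
        header = [
            "# .PCD v0.7 - Point Cloud Data file format",
            "VERSION 0.7",
            "FIELDS x y z",
            "SIZE 4 4 4",
            "TYPE F F F",
            "COUNT 1 1 1",
            "WIDTH %d" % n_joints,
            "HEIGHT %d" % n_frames,
            "VIEWPOINT 0 0 0 1 0 0 0",
            "POINTS %d" % (n_joints * n_frames),
            "DATA ascii",
        ]
        with open(path, "w") as f:
            f.write("\n".join(header) + "\n")
            for frame in frames:
                for x, y, z in frame:
                    f.write("%f %f %f\n" % (x, y, z))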



Abstract

The invention relates to a motion recognition method based on a segmented human body model applied in human-machine cooperation, which uses a Microsoft Kinect V2 sensor to collect human skeletal information and obtain joint node coordinates. After preprocessing, the skeletal node sequence is mapped from four dimensions to three dimensions and stored as a point cloud sequence. The human body model is divided into upper limbs, lower limbs and trunk, and Boolean feature vectors and the relative positions of joints are extracted for each part. Key frames are extracted using the Boolean feature vectors, and templates are matched using the feature vectors and the dynamic time warping (DTW) algorithm. Finally, the recognition results of the three parts are combined to obtain the classification of the whole human motion. The present invention not only achieves the purpose of recognizing the overall movement of the human body, but also obtains descriptions of the movements of the upper limbs, torso and lower limbs, so it can recognize human movement and behaviour in more detail and more accurately, helping the robot in human-machine cooperation to carry out subsequent task planning.
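The matching step described in the abstract is a dynamic time warping comparison between an observed feature-vector sequence and each action template; a minimal sketch is given below. The Euclidean local cost and classification by minimum DTW distance are assumptions for illustration, since the patent's exact Boolean features and key-frame selection are not reproduced here.

    import numpy as np

    def dtw_distance(seq_a, seq_b):
        # Classic DTW between two sequences of per-frame feature vectors.
        a, b = np.asarray(seq_a, dtype=float), np.asarray(seq_b, dtype=float)
        n, m = len(a), len(b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = np.linalg.norm(a[i - 1] - b[j - 1])   # local cost (assumed Euclidean)
                cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                     cost[i, j - 1],      # deletion
                                     cost[i - 1, j - 1])  # match
        return cost[n, m]

    def classify(sequence, templates):
        # templates: dict mapping an action label to its template feature sequence.
        return min(templates, key=lambda label: dtw_distance(sequence, templates[label]))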

Description

Technical Field
[0001] The invention belongs to the field of human-computer interaction, and relates to an action recognition method based on a segmented human body model applied in human-machine cooperation.
Background Art
[0002] With the development of robot technology, the application scenarios of robots are becoming wider, and there are more and more intersections and integrations with other fields. At the same time, many scenarios require collaborative operation between humans and robots. Such collaboration can not only relieve human labour but also help humans avoid high-risk operational tasks. Human-machine cooperation is one of the future development directions of intelligent robots.
[0003] Human-machine collaboration emphasizes the leading role of humans. It should enable robots to understand human intentions as accurately as possible under the premise of ensuring saf...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06K9/34, G06K9/62
CPC: G06V40/23, G06V20/46, G06V10/267, G06V10/751
Inventors: 黄攀峰, 张博文, 刘正雄, 董刚奇, 孟中杰, 张夷斋, 张帆
Owner: NORTHWESTERN POLYTECHNICAL UNIV