Quick human movement identification method oriented to human-computer interaction

A human action recognition technology for human-computer interaction, applied to user/computer input/output, character and pattern recognition, and computer components. It addresses the weak robustness of camera-based recognition, improves the speed of action recognition, optimizes rapid control, and simplifies subsequent identification.

Inactive Publication Date: 2018-04-13
SHENYANG POLYTECHNIC UNIV
Cites: 7 · Cited by: 32

AI Technical Summary

Problems solved by technology

Ordinary cameras are severely affected by the environment (background, lighting, occlusion, etc.), so action recognition must be performed under ideal conditions; the resulting systems are not robust.

Method used



Examples


Embodiment

[0070] The present invention uses Microsoft's Kinect 3D depth camera as the action acquisition device; the camera has no ambient-light requirement and works even in complete darkness. During acquisition, the Kinect is placed 1 m above the ground, and the subject stands facing it with the body parallel to the camera plane, about 1-2 m away, with no obstacle between the subject and the camera. Per the experimental requirements, 5 laboratory members were selected for action recording and testing, and the following three action template libraries were established:

[0071] (1) Define 20 custom actions, select one of the five subjects, record each of the 20 actions once, and save the action templates.

[0072] (2) Define the same 20 custom actions; each of the 5 subjects records every action 10 times, and for each action the average of the 50 recordings is saved as the template.

[0073] (3) Use the second templ...
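The embodiment stores, for each action, the average of 50 recordings as that action's template, but the exact averaging procedure is not specified in this summary. Below is a minimal sketch under the assumption that each recording is a single feature channel of varying length, linearly resampled to a common length before a frame-wise mean (multi-channel joint features would average each channel the same way); the function names are illustrative, not from the patent:

```python
def resample(seq, length):
    """Linearly resample a 1-D sequence to a fixed number of frames."""
    if length == 1:
        return [seq[0]]
    out = []
    for k in range(length):
        # Fractional position of output frame k in the input sequence.
        pos = k * (len(seq) - 1) / (length - 1)
        i = int(pos)
        frac = pos - i
        if i + 1 < len(seq):
            out.append(seq[i] * (1 - frac) + seq[i + 1] * frac)
        else:
            out.append(seq[i])
    return out


def average_template(recordings, length=30):
    """Frame-wise mean of recordings resampled to a common length."""
    resampled = [resample(r, length) for r in recordings]
    return [sum(col) / len(col) for col in zip(*resampled)]
```

Resampling first is necessary because recordings of the same action rarely have identical frame counts; a frame-wise mean is only defined once all sequences share one length.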



Abstract

The invention discloses a quick human movement identification method oriented to human-computer interaction. The method comprises the following steps: (1) collecting skeleton point coordinate information; (2) selecting key points; (3) extracting movement features; (4) performing movement identification; and (5) driving a robot response. The whole system consists of a terminal computer, a Kinect human-movement input device, a Bluetooth communication module, and a robot. First, the Kinect captures the human body; effective nodes that represent whole-body movement are extracted from the 20 joint nodes and used to compute movement features, which form a movement template stored in a TXT text file. In the identification stage, the movement sequence under test is rapidly matched against the standard templates by an F-DTW (Fast Dynamic Time Warping) algorithm, and an identification result is produced. According to the identification result, the robot makes different responses. The fast algorithm greatly improves movement identification speed and optimizes rapid control of the robot.
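The abstract names F-DTW, but the details of the fast variant are not given in this summary. As an illustrative stand-in (not the patented algorithm), here is a minimal sketch of classic DTW over 1-D feature sequences, with an optional Sakoe-Chiba band, a standard way to reduce DTW's O(nm) cost by restricting the warping path:

```python
def dtw_distance(seq_a, seq_b, band=None):
    """Accumulated DTW cost between two 1-D feature sequences.

    band: optional Sakoe-Chiba window half-width. Cells outside the
    band are left at infinity, which speeds up matching but can make
    strongly warped alignments unreachable if the band is too narrow.
    """
    n, m = len(seq_a), len(seq_b)
    INF = float("inf")
    # cost[i][j]: minimal cost of aligning seq_a[:i] with seq_b[:j].
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        lo, hi = 1, m
        if band is not None:
            lo = max(1, i - band)
            hi = min(m, i + band)
        for j in range(lo, hi + 1):
            d = abs(seq_a[i - 1] - seq_b[j - 1])  # local frame distance
            cost[i][j] = d + min(cost[i - 1][j],       # stretch seq_b
                                 cost[i][j - 1],       # stretch seq_a
                                 cost[i - 1][j - 1])   # match frames
    return cost[n][m]
```

In a template-matching setup, the test sequence would be compared against every stored template and the action with the lowest DTW cost reported; `band=None` recovers exact DTW.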

Description

technical field

[0001] The invention belongs to the fields of computer virtual reality and human-computer interaction, and in particular relates to a fast human movement identification method oriented to human-computer interaction, which uses human body actions to control a robot and thereby realizes human-computer interaction.

Background technique

[0002] With the development of robot control technology, interaction between humans and robots has become increasingly common, human-computer interaction technology is developing rapidly, and using human body movements to control robots has become a research hotspot in the field of human-computer interaction.

[0003] Traditional gesture recognition is contact-based: data gloves are worn, or sensors such as gyroscopes are mounted on the body to perceive movements and thereby recognize them. This method is highly accurate, but it requires wearing sensors on the body ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01, G06K9/00, G06K9/46
CPC: G06F3/011, G06V40/23, G06V10/457
Inventors: 桑海峰 (Sang Haifeng), 田秋洋 (Tian Qiuyang)
Owner SHENYANG POLYTECHNIC UNIV