
Kinect-based action training method

An action training method in the field of virtual reality. It addresses the problem that expensive equipment hinders automatic training systems, and achieves the effects of reduced time complexity, a simple installation process, and simple equipment.

Inactive Publication Date: 2012-06-20
BEIHANG UNIV
Cites: 5 | Cited by: 52

AI Technical Summary

Problems solved by technology

However, automatic training often requires expensive equipment, which has kept automatic training systems from entering ordinary homes.




Embodiment Construction

[0033] For a better understanding of the technical solution of the present invention, a further detailed description is given below in conjunction with the accompanying drawings and embodiments.

[0034] 1. The steps of the online training method are as follows:

[0035] Online training first automatically divides the action into several key stages. When the user completes the action of one stage, the action of the next stage is prompted automatically; as each stage is prompted, its key joint points are also indicated. Take the action of raising and stretching the hand as an example: figure 1 is a schematic diagram of online training, in which each row corresponds to a key action. The leftmost column shows the 3D model driven in real time by motion data collected by the Kinect, the middle column shows the 3D model driven by standard motion data, and the right column shows the color image captured by the Kinect. When the us...
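The stage-completion test described above (judging whether the user has finished a stage from the kinetic and potential energy of the motion) can be sketched as follows. This is a minimal illustration and not the patent's exact formulation: it assumes unit joint masses, Kinect joint positions in metres with the y-axis pointing up, and the function names (`frame_energies`, `stage_complete`) and the relative tolerance are hypothetical.

```python
import numpy as np

GRAVITY = 9.8  # m/s^2

def frame_energies(joints_t0, joints_t1, dt, masses):
    """Kinetic and potential energy of a skeleton between two frames.

    joints_t0, joints_t1 : (J, 3) arrays of joint positions in metres
    dt                   : frame interval in seconds
    masses               : (J,) per-joint masses (the mass model is an
                           assumption; the patent does not specify one)
    """
    velocities = (joints_t1 - joints_t0) / dt            # (J, 3) in m/s
    kinetic = 0.5 * np.sum(masses * np.sum(velocities ** 2, axis=1))
    heights = joints_t1[:, 1]                            # y-axis = up
    potential = GRAVITY * np.sum(masses * heights)
    return kinetic, potential

def stage_complete(user_k, user_p, std_k, std_p, tol=0.15):
    """Treat a stage as complete when the user's kinetic energy has
    settled (the pose is being held) and the potential energy is within
    a relative tolerance of the standard key pose's potential energy."""
    still = user_k < tol * max(std_k, 1e-6)
    matched = abs(user_p - std_p) <= tol * abs(std_p)
    return still and matched
```

In this sketch the prompt for the next stage would be triggered the first frame at which `stage_complete` returns `True` for the current stage's standard key pose.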



Abstract

The invention relates to a Kinect-based action training method which extracts human body actions without marker points by acquiring human motion data through the Kinect, making action training simpler and more practical. The method has two modes: online training and offline training. During online training, the trained action is automatically divided into a plurality of stages; the user learns the action by following the prompts for each stage, whether the user has completed each stage is judged from the kinetic and potential energy of the user's motion, and the important key points of each stage's action are automatically computed and prompted to the user. During offline training, the user first completes the whole action alone; the user's actions, captured by the Kinect, are automatically matched with the standard actions, a comparative analysis is carried out on the matched actions, and a score is given according to the difference in skeleton direction between the user's actions and the standard actions, so that the user can find the differences more intuitively.
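The offline-training matching and scoring described in the abstract can be sketched as follows. The patent does not name its alignment algorithm; this sketch assumes classic dynamic time warping (a common choice for matching motion sequences, not necessarily the patent's method) and scores aligned frames by the angle between corresponding bone-direction vectors, with the function names and the 0-100 score mapping being illustrative assumptions.

```python
import numpy as np

def dtw_align(user, std):
    """Align two sequences of pose feature vectors with classic DTW.
    Returns a list of (user_index, std_index) matched frame pairs."""
    n, m = len(user), len(std)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(user[i - 1] - std[j - 1])
            cost[i, j] = d + min(cost[i - 1, j - 1],
                                 cost[i - 1, j],
                                 cost[i, j - 1])
    # Backtrack from the end to recover the optimal alignment path.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]

def bone_direction_score(user_bones, std_bones):
    """Score one aligned frame on a 0-100 scale: the mean angle between
    corresponding bone-direction vectors, mapped so that 0 rad -> 100
    and pi rad (opposite directions) -> 0."""
    angles = []
    for u, s in zip(user_bones, std_bones):
        c = np.dot(u, s) / (np.linalg.norm(u) * np.linalg.norm(s))
        angles.append(np.arccos(np.clip(c, -1.0, 1.0)))
    return 100.0 * (1.0 - np.mean(angles) / np.pi)
```

A per-bone version of the same angle comparison would also identify which bones deviate most, which is what lets the user see the differences intuitively.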

Description

technical field

[0001] The invention relates to an action training method, in particular to a Kinect-based action training method, and belongs to the field of virtual reality.

Background technique

[0002] An automatic training system enables users to learn and train movements without a coach. However, automatic training often requires expensive equipment, which has kept automatic training systems from entering ordinary homes. The emergence of new devices has made previously impractical applications possible: Microsoft's Kinect, for example, can capture human body posture in real time at a price the public can accept. Some training programs can therefore be made into game applications, such as simple dance or aerobics, so that users learn the corresponding skills while being entertained.

[0003] Akio Nakamura et al. designed a basic dance training system consisting of motion capture equipment, moving ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): A63B23/00, A63B24/00, G06F19/00
Inventor: 周忠 (Zhou Zhong), 吴威 (Wu Wei), 梁进明 (Liang Jinming)
Owner: BEIHANG UNIV