
A method and device for recognizing throwing actions based on a single attitude sensor

A single-sensor action-recognition technology, applied in the field of motion capture, which can solve problems such as the inability to capture throwing motions and a reduced interactive experience.

Active Publication Date: 2017-05-03
BEIJING HUARU TECH

AI Technical Summary

Problems solved by technology

However, human-computer interaction based on buttons cannot effectively capture the throwing process, strength, and direction, which reduces the interactive experience.

Method used


Embodiment Construction

[0044] The present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are intended only to explain the present invention, not to limit it. In addition, it should be noted that, for ease of description, the drawings show only the structures related to the present invention rather than all structures.

[0045] A throwing action can be regarded as a combination of a series of sub-actions within a short period of time. By partitioning all possible postures of the right upper arm in a reference coordinate system, the right half space of that coordinate system is divided into several regions, each region representing a basic state. During the entire throwing process, the state of the right upper arm switches continuously and sequentially from one of several possible initial states to the final state.
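The state-sequence idea can be illustrated with a short sketch. In the snippet below, the region boundaries (45-degree bins over the two posture angles) and the expected order of basic states are illustrative assumptions, not the parameters defined in the patent; the point is only to show how a time series of postures can be reduced to a sequence of regions and matched against a predetermined order.

```python
# Minimal sketch of the state-sequence idea described above.
# Region boundaries and EXPECTED_ORDER are illustrative assumptions.
import numpy as np

def region_of(theta_deg: float, phi_deg: float) -> int:
    """Map an upper-arm posture (theta, phi) to a coarse region index.
    Here the right half space is simply split into 45-degree bins."""
    t_bin = int(np.clip(theta_deg, 0, 179.9) // 45)   # 4 bins in theta
    p_bin = int(np.clip(phi_deg, 0, 179.9) // 45)     # 4 bins in phi
    return t_bin * 4 + p_bin

# Hypothetical predetermined order of basic states for a throw:
EXPECTED_ORDER = [5, 6, 10, 14]

def is_throw(samples: list[tuple[float, float]]) -> bool:
    """Return True if the time-ordered (theta, phi) samples pass through
    the expected regions in the expected order (ignoring repeats)."""
    seq = []
    for theta, phi in samples:
        r = region_of(theta, phi)
        if not seq or seq[-1] != r:
            seq.append(r)
    # The expected order must appear as a subsequence of the observed states.
    it = iter(seq)
    return all(any(r == want for r in it) for want in EXPECTED_ORDER)
```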

[0046]...



Abstract

The invention discloses a method and a device for recognizing a throwing action based on a single attitude sensor. The method comprises the following steps: establishing a reference coordinate system oriented toward the right, the front, and the lower side of the thrower, in which the right-upper-arm vector is the unit vector pointing from the right shoulder to the right elbow, the reference vectors (formula) represent the downward, forward, and rightward directions of the thrower respectively, the angle between the right-upper-arm vector and (formula) is theta, and the angle between (formula) and the projection of the right-upper-arm vector onto the plane spanned by (formula) and (formula) is phi; dividing the right half space of the reference coordinate system into a plurality of attitude regions and mapping each measured attitude into one of these regions; and comparing the measured region sequence and its time order against a predetermined throwing order: if they match, a throwing action is judged, otherwise a non-throwing action is judged. By configuring a single sensor, collecting attitude information, and analyzing it in real time, the method and device recognize the throwing attitude of a single trainee in a virtual training scene with a judgment accuracy above 98%, and are easily popularized and applied in all kinds of immersive virtual training systems.
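A minimal sketch of the angle definitions in the abstract is given below, assuming an orthonormal reference frame whose axes point down, front, and right of the thrower. Which reference vector theta is measured against (down) and which one serves as the zero direction of phi (front) are assumptions here, since the abstract's formula symbols are not reproduced; the function and variable names are likewise hypothetical.

```python
# Hedged sketch of the theta/phi definitions, assuming a down/front/right
# orthonormal reference frame; axis conventions are assumptions.
import numpy as np

def arm_angles(shoulder, elbow, down, front, right):
    """Return (theta, phi) in degrees for the right upper arm."""
    v = np.asarray(elbow, float) - np.asarray(shoulder, float)
    v /= np.linalg.norm(v)                       # unit upper-arm vector

    # theta: angle between the arm vector and the "down" reference vector.
    theta = np.degrees(np.arccos(np.clip(np.dot(v, down), -1.0, 1.0)))

    # phi: angle between "front" and the arm vector's projection onto the
    # plane spanned by the "front" and "right" reference vectors.
    proj = np.dot(v, front) * np.asarray(front) + np.dot(v, right) * np.asarray(right)
    if np.linalg.norm(proj) < 1e-9:
        return theta, 0.0                        # arm points straight down/up
    proj /= np.linalg.norm(proj)
    phi = np.degrees(np.arccos(np.clip(np.dot(proj, front), -1.0, 1.0)))
    return theta, phi

# Example with an assumed orthonormal reference frame, for illustration only:
down, front, right = np.eye(3)
print(arm_angles([0, 0, 0], [0.5, 0.7, 0.2], down, front, right))
```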

Description

Technical Field

[0001] The present application relates to the field of motion capture, and in particular to a method and device for recognizing throwing motions using a single attitude sensor.

Background Technique

[0002] At present, the commonly used method for motion capture is optical motion capture, which completes the capture task by monitoring and tracking specific light points on the human body. The principle is to use multiple cameras to shoot the light points on the human body simultaneously and continuously; the spatial position of each point can then be calculated using preset camera parameters, and an attitude-fusion calculation is performed on these positions to obtain the corresponding action. Its disadvantages are high cost and complex configuration. To prepare for motion capture, a "clean" background environment is first required, and the human body needs to be dressed in monochromatic clothing and equipped with light spots...
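For background illustration only, the following sketch shows the triangulation principle behind the optical motion capture described above: with two calibrated cameras (known 3x4 projection matrices), the 3D position of a single marker can be recovered from its pixel coordinates by linear least squares. This is not part of the disclosed single-sensor method, and all matrices and coordinates would be supplied by a concrete camera setup.

```python
# Background illustration: linear (DLT) triangulation of one optical marker
# from two calibrated cameras. Not part of the single-sensor method.
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Recover a 3D point from its pixel positions in two cameras with
    known 3x4 projection matrices, via linear least squares (SVD)."""
    u1, v1 = uv1
    u2, v2 = uv2
    A = np.vstack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]          # homogeneous -> Euclidean coordinates
```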

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F3/01
CPC: G06F3/011
Inventors: 陈敏杰, 张柯, 孙昊, 胡明昱
Owner: BEIJING HUARU TECH