
Systems and methods for personalized motion control

Inactive Publication Date: 2011-02-24
YEN WEI


Benefits of technology

[0053] receiving a second set of motion signals from a second motion sensing device providing sufficient information to compute position and orientation over time of the second motion sensing device; and
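The claim fragment above assumes a stream of sensor samples from which position and orientation can be computed over time. As an illustrative sketch only, and not the patent's actual method, a minimal one-axis dead-reckoning loop might look like the following; `MotionSample`, its fields, and the single-axis simplification are all hypothetical:

```python
from dataclasses import dataclass

@dataclass
class MotionSample:
    dt: float     # seconds elapsed since the previous sample
    accel: float  # linear acceleration along one axis (m/s^2), gravity removed
    gyro: float   # angular velocity about one axis (rad/s)

def integrate_motion(samples):
    """Dead-reckon position and orientation from raw sensor samples.

    Orientation is the single integral of angular velocity; position is
    the double integral of linear acceleration. A real inertial system
    works in three axes and must compensate for gravity and sensor drift.
    """
    position, velocity, orientation = 0.0, 0.0, 0.0
    trajectory = []
    for s in samples:
        orientation += s.gyro * s.dt       # integrate angular velocity
        velocity += s.accel * s.dt         # integrate acceleration
        position += velocity * s.dt        # integrate velocity
        trajectory.append((position, orientation))
    return trajectory
```

In practice the accumulated drift from double integration is why such systems combine this step with filtering or periodic re-calibration.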

Problems solved by technology

Our ability to fulfill the promise of freeform human motion control of software applications is strictly limited by our ability to detect and recognize what a given human is trying to do.
Humans are very good at interpreting the gestures and expressions of other humans, but we are as yet unable to create machines or code that perform at a similar level.
Writing program code to recognize whether a supplied motion is an example of an existing set of known motion classes is difficult.
The resulting motion data is often complicated and counterintuitive.
For example, when presented with a simple graph of 3D accelerometer outputs versus time, people skilled in the art struggle to determine what gesture that time series of data corresponds to.
Even the simpler task of selecting which motion graphs belong to the same gesture confounds most experts presented with the problem.
The problem is exacerbated by sensor noise, device differences, and the fact that data for the same gesture can appear quite different when performed by different people with different body types and musculatures, or even by the same person at different times. It is a difficult challenge under these conditions for one skilled in the art to build effective motion recognizers.
Along with challenging source data, the fact that the data is dynamic over time, not static over time, is a significant hurdle to overcome.
To our knowledge, there is nothing in the prior art that provides teaching related to end-user creation of ad-hoc motion recognizers.
This quantization-based approach does not work in applications that wish to provide freeform motion control, since the starting and ending positions are not predefined and cannot reasonably be quantized a priori without making the construction of a reasonable training set a virtual impossibility.

Method used




Embodiment Construction

[0066] The detailed description of the invention is presented largely in terms of procedures, steps, logic blocks, processing, and other symbolic representations that directly or indirectly resemble the operations of data processing devices. These process descriptions and representations are typically used by those skilled in the art to most effectively convey the substance of their work to others skilled in the art.

[0067] Numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will become obvious to those skilled in the art that the invention may be practiced without these specific details. In other instances, well known methods, procedures, components, and circuitry have not been described in detail to avoid unnecessarily obscuring aspects of the present invention.

[0068] Reference herein to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with th...



Abstract

End users, unskilled in the art, can generate motion recognizers from example motions, without substantial programming, without limitation to any fixed set of well-known gestures, and without limitation to motions that occur substantially in a plane or are substantially predefined in scope. From example motions for each class of motion to be recognized, a system automatically generates motion recognizers using machine learning techniques. Those motion recognizers can be incorporated into an end-user application so that, when a user of the application supplies a motion, the recognizers identify it as an example of one of the known classes of motion. The recognizers can also be tuned to improve recognition rates for subsequent motions, and end users can add new example motions.
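The abstract describes a pipeline of learning recognizers from labeled example motions and then classifying new motions against the learned classes. As a hedged illustration of that idea, a nearest-neighbor classifier under dynamic time warping (DTW) is one simple way to build such a recognizer; this is a stand-in sketch, not the machine learning technique the patent actually uses, and `make_recognizer` and the toy traces below are hypothetical:

```python
def dtw_distance(a, b):
    """Dynamic-time-warping cost between two 1-D motion traces.

    DTW tolerates the timing variation seen when different people (or the
    same person at different times) perform the same gesture.
    """
    INF = float("inf")
    n, m = len(a), len(b)
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of: match, insertion, deletion
            cost[i][j] = d + min(cost[i - 1][j],
                                 cost[i][j - 1],
                                 cost[i - 1][j - 1])
    return cost[n][m]

def make_recognizer(examples):
    """Build a recognizer from labeled example motions.

    `examples` maps a class label to a list of example traces. The
    returned function classifies a new trace as the label of its
    nearest example under DTW (1-nearest-neighbor).
    """
    def recognize(trace):
        return min(
            ((label, dtw_distance(trace, ex))
             for label, traces in examples.items()
             for ex in traces),
            key=lambda pair: pair[1],
        )[0]
    return recognize
```

Adding a new example motion to `examples` extends the recognizer without any reprogramming, which mirrors the end-user workflow the abstract describes.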

Description

CROSS REFERENCE TO RELATED APPLICATIONS

[0001] This is a continuation-in-part of co-pending U.S. application Ser. No. 11/486,997, entitled “Generating Motion Recognizers for Arbitrary Motions”, filed Jul. 14, 2006, and co-pending U.S. application Ser. No. 12/020,431, entitled “Self-Contained Inertial Navigation System for Interactive Control Using Movable Controllers”, filed Jan. 25, 2008.

BACKGROUND OF THE INVENTION

[0002] 1. Field of the Invention

[0003] The invention generally relates to the area of artificial intelligence and, more particularly, to machine learning, especially in the context of generating motion recognizers from example motions. In some embodiments, recognizer makers can be incorporated into, or used alongside, end-user applications, where end users can create ad-hoc personalized motion recognizers for use with those end-user applications.

[0004] 2. Related Art

[0005] Our ability to fulfill the promise of freeform human motion control of software applications is ...


Application Information

IPC(8): G06K9/62; G06F3/038
CPC: A63F13/06; A63F13/10; A63F2300/1043; A63F2300/105; G06F3/038; A63F2300/6045; G06F3/017; G06F3/0346; G06F2203/0384; A63F2300/6027; A63F13/215; A63F13/92; A63F13/428; A63F13/211; A63F13/2145; A63F13/67; A63F13/235
Inventor: TU, XIAOYUAN; KAWANO, YOICHIRO; MUSICK, JR., CHARLES; POWERS, III, WILLIAM ROBERT; REYNOLDS, STUART; WILKINSON, DANA; WRIGHT, IAN; YEN, WEI
Owner YEN WEI