
System and method for executing a game process

Publication date: 2010-06-03 (status: inactive)
Assignee: MICROSOFT TECH LICENSING LLC
Cites: 12 · Cited by: 194

AI Technical Summary

Benefits of technology

[0019] In another aspect of the present invention, the system includes a wireless device worn by the person. The wireless device includes one or more sensors that measure at least the velocity, acceleration, and orientation of the device. The corresponding signals are transmitted to a computer system, processed, and interpreted to determine the object at which the device is pointed and the action to be taken on that object. Once the signals have been interpreted, the computer is controlled to interact with the object, which can be a device and/or system connected to the computer, or software running on the computer. In one application, the wireless device is used in a medical environment and worn on the head of medical personnel, leaving the hands free; head movements then control the computer. In another, multimodal approach, the person can also wear a wireless microphone to communicate voice signals to the computer, either separately or in combination with head movements.
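The pointing interpretation described in [0019] (orientation signals mapped to an object in the environment) can be sketched as follows. This is a minimal illustration only: the target names, angles, and angular threshold are assumptions, not details from the patent.

```python
import math

# Hypothetical targets in the room: name -> (yaw_deg, pitch_deg) of the
# direction from the wearer's head to each object. Illustrative values.
TARGETS = {
    "monitor_left": (-30.0, 0.0),
    "monitor_right": (30.0, 0.0),
    "overhead_light": (0.0, 45.0),
}

def pointed_object(yaw_deg, pitch_deg, max_error_deg=15.0):
    """Return the target whose direction is closest to the worn device's
    orientation, or None if nothing lies within the angular threshold."""
    best, best_err = None, max_error_deg
    for name, (t_yaw, t_pitch) in TARGETS.items():
        err = math.hypot(yaw_deg - t_yaw, pitch_deg - t_pitch)
        if err < best_err:
            best, best_err = name, err
    return best

print(pointed_object(28.0, 3.0))   # close to monitor_right
print(pointed_object(0.0, -60.0))  # nothing in range -> None
```

A real system would derive the orientation by integrating the velocity and acceleration signals the sensors report; here the angles are taken as given.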
[0023] In accordance with another aspect thereof, the present invention facilitates adapting the system to the particular preferences of an individual user. The system and method allow the user to tailor the system to recognize specific hand gestures and verbal commands and to associate them with particular actions. Different users, who may prefer to make different motions for a given command, can thus tailor the system in the way most efficient for their personal use. Similarly, different users can choose different verbal commands to perform the same function.
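The per-user tailoring in [0023] amounts to a user-specific lookup from gesture labels or verbal commands to actions. A minimal sketch, with all class, trigger, and action names hypothetical:

```python
from typing import Callable, Dict

class UserCommandProfile:
    """Per-user binding of recognized gestures / verbal commands to actions."""

    def __init__(self, user_id: str):
        self.user_id = user_id
        self._bindings: Dict[str, Callable[[], str]] = {}

    def bind(self, trigger: str, action: Callable[[], str]) -> None:
        """Associate a gesture label or verbal command with an action."""
        self._bindings[trigger] = action

    def execute(self, trigger: str) -> str:
        action = self._bindings.get(trigger)
        return action() if action else f"unrecognized: {trigger}"

# Two users bind different triggers to the same underlying function.
alice = UserCommandProfile("alice")
alice.bind("swipe_left", lambda: "previous slide")
bob = UserCommandProfile("bob")
bob.bind("go back", lambda: "previous slide")

print(alice.execute("swipe_left"))  # previous slide
print(bob.execute("go back"))       # previous slide
```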

Problems solved by technology

There are, however, many applications where the traditional user interface is impractical or inefficient.
Manipulation of the presentation by the presenter is generally controlled through use of awkward remote controls, which frequently suffer from inconsistent and less precise operation, or require the cooperation of another individual.
Switching between sources, fast-forwarding, rewinding, changing chapters, adjusting volume, and similar tasks can be very cumbersome in a professional studio as well as in the home.
Similarly, traditional interfaces are not well suited for smaller, specialized electronic gadgets.
Additionally, people with motion impairment conditions find it very challenging to cope with traditional user interfaces and computer access systems.
These conditions and disorders are often accompanied by tremors, spasms, loss of coordination, restricted range of movement, reduced muscle strength, and other motion impairing symptoms.
It is known that as people age, their cognitive, perceptual, and motor skills decline, with negative effects on their ability to perform many tasks.
The requirement to position a cursor, particularly with smaller graphical presentations, can often be a significant barrier for elderly or afflicted computer users.
However, at the same time, this shift in the user interaction from a primarily text-oriented experience to a point-and-click experience has erected new barriers between people with disabilities and the computer.
For example, for older adults, there is evidence that using the mouse can be quite challenging.
It has been shown that even experienced older computer users move a cursor much more slowly and less accurately than their younger counterparts.
In addition, older adults seem to have increased difficulty (as compared to younger users) when targets become smaller.
For older computer users, positioning a cursor can be a severe limitation.
One solution to the decreased ability to position the cursor with a mouse is simply to increase the size of on-screen targets; this is often counter-productive, however, since less information can be displayed at once, requiring more navigation.
Previous studies indicate that users find gesture-based systems highly desirable, but that users are also dissatisfied with the recognition accuracy of gesture recognizers.
Furthermore, experimental results indicate that a user's difficulty with gestures is in part due to a lack of understanding of how gesture recognition works.
However, examples of perceptual user interfaces to date are dependent on significant limiting assumptions.
Proper operation of the system depends on proper lighting conditions and can be negatively impacted when the system is moved from one location to another, or simply when the lighting in the room changes.
The use of such devices is generally found to be distracting and intrusive for the user.
Thus perceptual user interfaces have been slow to emerge.
The reasons include heavy computational burdens, unreasonable calibration demands, required use of intrusive and distracting devices, and a general lack of robustness outside of specific laboratory conditions.
For these and similar reasons, there has been little advancement in systems and methods for exploiting perceptual user interfaces.
However, as the trend towards smaller, specialized electronic gadgets continues to grow, so does the need for alternate methods for interaction between the user and the electronic device.
Many of these specialized devices are too small, and their applications too simple, to make use of traditional keyboard and mouse input devices.

Embodiment Construction

[0051] As used in this application, the terms “component” and “system” are intended to refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution. For example, a component may be, but is not limited to being, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a server and the server can be a component. One or more components may reside within a process and/or thread of execution, and a component may be localized on one computer and/or distributed between two or more computers.

[0052] The present invention relates to a system and methodology for implementing a perceptual user interface comprising alternative modalities for controlling computer programs and manipulating on-screen objects through hand gestures, or through a combination of hand gestures and verbal commands. A perceptual user interface...
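The multimodal combination mentioned in [0052], where a gesture and a verbal command jointly determine one computer action, can be sketched as a simple fusion step. The vocabularies and combination rules below are illustrative assumptions, not the patent's actual grammar.

```python
from typing import Optional

# Hypothetical recognizer outputs: gesture label -> target, speech -> operation.
GESTURES = {"point_window": "window", "point_icon": "icon"}
COMMANDS = {"move": "move", "close": "close"}

def fuse(gesture: Optional[str], speech: Optional[str]) -> str:
    """Combine a recognized gesture and a recognized verbal command
    into a single computer action; fall back to either modality alone."""
    target = GESTURES.get(gesture) if gesture else None
    op = COMMANDS.get(speech) if speech else None
    if target and op:
        return f"{op} {target}"            # both modalities combined
    if op:
        return f"{op} current selection"   # speech alone
    if target:
        return f"select {target}"          # gesture alone
    return "no action"

print(fuse("point_window", "close"))  # close window
```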

Abstract

A 3-D imaging system for recognition and interpretation of gestures to control a computer. The system performs gesture recognition and interpretation based on a previous mapping of a plurality of hand poses and orientations to user commands for a given user. When the user is identified to the system, the imaging system images gestures presented by the user, looks up the user command associated with the captured image(s), and executes the command(s) to effect control of the computer, its programs, and connected devices.
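The lookup-and-execute loop the abstract describes can be sketched as follows: once the user is identified, each recognized hand pose is looked up in that user's mapping and the associated command is dispatched. Pose labels, commands, and the dispatch interface are illustrative assumptions.

```python
# Hypothetical per-user mapping from recognized hand poses to commands.
USER_POSE_MAPS = {
    "user_1": {"open_palm": "pause", "fist": "play", "point_up": "volume_up"},
}

def execute_gesture(user_id, pose, dispatch):
    """Look up the command mapped to `pose` for `user_id` and dispatch it;
    return None when the pose has no mapping for that user."""
    command = USER_POSE_MAPS.get(user_id, {}).get(pose)
    if command is None:
        return None
    return dispatch(command)

log = []
result = execute_gesture("user_1", "fist", lambda cmd: log.append(cmd) or cmd)
print(result)  # play
```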

Description

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application is a continuation-in-part of pending U.S. patent application Ser. No. 10/396,653, entitled “ARCHITECTURE FOR CONTROLLING A COMPUTER USING HAND GESTURES”, which was filed Mar. 25, 2003, the entirety of which is incorporated by reference.

TECHNICAL FIELD

[0002] The present invention relates generally to controlling a computer system, and more particularly to a system and method implementing alternative modalities for controlling computer programs and devices, and manipulating on-screen objects through the use of one or more body gestures, or a combination of gestures and supplementary signals.

BACKGROUND OF THE INVENTION

[0003] A user interface facilitates the interaction between a computer and computer user by enhancing the user's ability to utilize application programs. The traditional interface between a human user and a typical personal computer is implemented with graphical displays and is generally referred to as a graphica...

Application Information

IPC(8): G06F3/033, A63F13/219, G06F3/01, G06K9/00, G06K9/62, G09G5/00, G10L15/06, H04N13/239
CPC: G06F3/017, G06K9/00355, A63F13/06, G06K9/00335, A63F2300/1093, G06F3/011, A63F13/04, G06F3/012, G06F3/013, G06T7/593, G06T7/285, H04N2013/0081, H04N13/366, H04N13/239, H04N13/128, A63F13/213, A63F13/211, A63F13/428, A63F13/22, G06V40/28, A63F13/42, G06V40/20, G06F3/0346, G06F3/038, G06T7/20
Inventors: WILSON, ANDREW D.; OLIVER, NURIA M.
Owner MICROSOFT TECH LICENSING LLC