User interface control using impact gestures
A user-interface and gesture technology applied to the input/output of user/computer interaction, to devices equipped with sensors, and to structural components of portable computers. It addresses problems such as incomplete gesture support and systems that are inapplicable to certain gestures.
[0017] In one aspect, the present invention provides methods and systems for processor control of a user interface of a wearable computer or of a device connected to the wearable computer. The method and system monitor events received from the wearable computer's sensors or from devices connected to the wearable computer, providing a means of controlling the interface. A machine-learning process is adapted to analyze each detected event and determine whether it is a predefined impact gesture. When the detected event is determined to be a predefined impact gesture, the processor executes the predefined response corresponding to that gesture in the user interface.
[0018] In one aspect, a method for processor control of a user interface of a wearable computer or a device connected to the wearable computer is disclosed. The method includes receiving a three-dimensional set of values characterizing linear acceleration, i...
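The monitor-classify-respond loop described in [0017]–[0018] can be sketched as follows. This is a minimal, hypothetical illustration: a fixed-threshold heuristic stands in for the patent's machine-learning classifier, and all names (`detect_impact_gesture`, `RESPONSES`, the threshold value) are illustrative assumptions, not taken from the patent.

```python
import math

# Assumed magnitude, in g, above which an acceleration spike is treated
# as a candidate impact gesture (placeholder for a trained model's decision).
IMPACT_THRESHOLD_G = 2.5

def detect_impact_gesture(sample):
    """Classify a three-dimensional linear-acceleration sample (ax, ay, az), in g.

    Returns the name of a predefined impact gesture, or None if the sample
    is not recognized as one.
    """
    ax, ay, az = sample
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    if magnitude < IMPACT_THRESHOLD_G:
        return None
    # Crude direction heuristic in place of a machine-learning classifier.
    return "tap_side" if abs(ax) >= abs(az) else "tap_top"

# Predefined responses executed in the user interface for each gesture.
RESPONSES = {
    "tap_side": "next_track",
    "tap_top": "play_pause",
}

def handle_event(sample):
    """Map a sensor event to its predefined UI response, or None."""
    gesture = detect_impact_gesture(sample)
    return RESPONSES.get(gesture) if gesture else None
```

A sharp sideways spike such as `(3.0, 0.1, 0.2)` would map to the `next_track` response, while ordinary motion below the threshold yields no action.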