4026 results about "Gesture recognition" patented technology

Gesture recognition is a topic in computer science and language technology with the goal of interpreting human gestures via mathematical algorithms. Gestures can originate from any bodily motion or state but commonly originate from the face or hand. Current focuses in the field include emotion recognition from the face and hand gesture recognition. Users can use simple gestures to control or interact with devices without physically touching them. Many approaches have been made using cameras and computer vision algorithms to interpret sign language. However, the identification and recognition of posture, gait, proxemics, and human behaviors is also the subject of gesture recognition techniques. Gesture recognition can be seen as a way for computers to begin to understand human body language, thus building a richer bridge between machines and humans than primitive text user interfaces or even GUIs (graphical user interfaces), which still limit the majority of input to keyboard and mouse. It enables humans to interact naturally with machines without any mechanical devices: using gesture recognition, it is possible to point a finger at the computer screen so that the cursor moves accordingly, which could make conventional input devices such as the mouse and keyboard redundant.

Video hand image three-dimensional computer interface with multiple degrees of freedom

A video gesture-based three-dimensional computer interface system that uses images of hand gestures to control a computer and that tracks motion of the user's hand or a portion thereof in a three-dimensional coordinate system with ten degrees of freedom. The system includes a computer with image processing capabilities and at least two cameras connected to the computer. During operation of the system, hand images from the cameras are continually converted to a digital format and input to the computer for processing. The results of the processing and attempted recognition of each image are then sent to an application or the like executed by the computer for performing various functions or operations. When the computer recognizes a hand gesture as a "point" gesture with one or two extended fingers, the computer uses information derived from the images to track three-dimensional coordinates of each extended finger of the user's hand with five degrees of freedom. The computer utilizes two-dimensional images obtained by each camera to derive three-dimensional position (in an x, y, z coordinate system) and orientation (azimuth and elevation angles) coordinates of each extended finger.
Owner:WSOU INVESTMENTS LLC +1
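The five degrees of freedom tracked per extended finger (x, y, z position plus azimuth and elevation angles) can be illustrated with a minimal stereo sketch. This assumes rectified, horizontally offset cameras with a known focal length and baseline; the function names and parameter values are hypothetical, not details from the patent:

```python
import math

def triangulate_fingertip(xl, yl, xr, f=500.0, baseline=0.1):
    """Recover a 3-D fingertip position from two rectified camera views.

    (xl, yl) and xr are pixel offsets of the fingertip from each image
    centre; f is the focal length in pixels and baseline the camera
    separation in metres. All values are illustrative assumptions.
    """
    disparity = xl - xr
    if disparity <= 0:
        raise ValueError("fingertip must be in front of both cameras")
    z = f * baseline / disparity          # depth from stereo disparity
    x = xl * z / f                        # back-project to world X
    y = yl * z / f                        # back-project to world Y
    return x, y, z

def finger_orientation(base, tip):
    """Azimuth and elevation (degrees) of the base-to-tip finger vector."""
    dx, dy, dz = (t - b for t, b in zip(tip, base))
    azimuth = math.degrees(math.atan2(dy, dx))
    elevation = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return azimuth, elevation
```

Running this per extended finger for a two-finger "point" gesture would yield the ten degrees of freedom the abstract mentions.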

Gesture-based computer interface

A system and method for manipulating virtual objects in a virtual environment, for drawing curves and ribbons in the virtual environment, and for selecting and executing commands for creating, deleting, moving, changing, and resizing virtual objects in the virtual environment using intuitive hand gestures and motions. The system is provided with a display for displaying the virtual environment and with a video gesture recognition subsystem for identifying motions and gestures of a user's hand. The system enables the user to manipulate virtual objects, to draw free-form curves and ribbons and to invoke various command sets and commands in the virtual environment by presenting particular predefined hand gestures and / or hand movements to the video gesture recognition subsystem.
Owner:LUCENT TECH INC

Method and system for gesture category recognition and training using a feature vector

A computer implemented method and system for gesture category recognition and training. Generally, a gesture is a hand or body initiated movement of a cursor directing device to outline a particular pattern in particular directions done in particular periods of time. The present invention allows a computer system to accept input data, originating from a user, in the form gesture data that are made using the cursor directing device. In one embodiment, a mouse device is used, but the present invention is equally well suited for use with other cursor directing devices (e.g., a track ball, a finger pad, an electronic stylus, etc.). In one embodiment, gesture data is accepted by pressing a key on the keyboard and then moving the mouse (with mouse button pressed) to trace out the gesture. Mouse position information and time stamps are recorded. The present invention then determines a multi-dimensional feature vector based on the gesture data. The feature vector is then passed through a gesture category recognition engine that, in one implementation, uses a radial basis function neural network to associate the feature vector to a pre-existing gesture category. Once identified, a set of user commands that are associated with the gesture category are applied to the computer system. The user commands can originate from an automatic process that extracts commands that are associated with the menu items of a particular application program. The present invention also allows user training so that user-defined gestures, and the computer commands associated therewith, can be programmed into the computer system.
Owner:ASSOCIATIVE COMPUTING +1
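A minimal sketch of the pipeline described above: recorded cursor samples are reduced to a direction-histogram feature vector, then matched to a gesture category by Gaussian radial basis activations. This is a drastic simplification of the patent's radial basis function neural network; bin counts, widths, and category names are illustrative assumptions:

```python
import math

def feature_vector(points, bins=8):
    """Map a gesture stroke [(x, y, t), ...] to a normalised
    direction-histogram feature vector (an illustrative stand-in for
    the patent's multi-dimensional features)."""
    hist = [0.0] * bins
    for (x0, y0, _), (x1, y1, _) in zip(points, points[1:]):
        angle = math.atan2(y1 - y0, x1 - x0) % (2 * math.pi)
        hist[int(angle / (2 * math.pi) * bins) % bins] += math.hypot(x1 - x0, y1 - y0)
    total = sum(hist) or 1.0
    return [h / total for h in hist]

def rbf_classify(vec, prototypes, width=0.5):
    """Return the gesture category whose prototype vector yields the
    largest Gaussian radial basis activation."""
    def activation(proto):
        d2 = sum((a - b) ** 2 for a, b in zip(vec, proto))
        return math.exp(-d2 / (2 * width ** 2))
    return max(prototypes, key=lambda name: activation(prototypes[name]))
```

The recognized category would then index into the associated set of user commands, e.g. commands extracted from an application's menu items.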

Gesture-controlled interfaces for self-service machines and other applications

A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measurements are used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body / object from the rest of the image, describing that object, and identifying that description. The disclosure details methods for gesture recognition, as well as the overall architecture for using gesture recognition to control devices, including self-service machines.
Owner:JOLLY SEVEN SERIES 70 OF ALLIED SECURITY TRUST I
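The linear-in-parameters idea can be illustrated with a toy predictor-bin scheme: each bin is seeded with one oscillation frequency, the model x(t) = a·cos(ωt) + b·sin(ωt) is fitted by linear least squares (solved here via 2×2 normal equations), and the bin leaving the smallest residual is taken as the recognized motion gesture. A hedged simplification, not the patent's actual dynamic system:

```python
import math

def fit_oscillation(samples, omega):
    """Least-squares fit of x(t) = a*cos(omega*t) + b*sin(omega*t),
    which is linear in the parameters (a, b)."""
    scc = scs = sss = sxc = sxs = 0.0
    for t, x in samples:
        c, s = math.cos(omega * t), math.sin(omega * t)
        scc += c * c; scs += c * s; sss += s * s
        sxc += x * c; sxs += x * s
    det = scc * sss - scs * scs          # 2x2 normal-equation determinant
    a = (sxc * sss - sxs * scs) / det
    b = (sxs * scc - sxc * scs) / det
    residual = sum((x - a * math.cos(omega * t) - b * math.sin(omega * t)) ** 2
                   for t, x in samples)
    return a, b, residual

def best_bin(samples, candidate_omegas):
    """Each predictor bin is seeded with one frequency; the bin whose
    fit leaves the smallest residual wins."""
    return min(candidate_omegas, key=lambda w: fit_oscillation(samples, w)[2])
```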

Tracking and gesture recognition system particularly suited to vehicular control applications

A system and method tracks the movements of a driver or passenger in a vehicle (ground, water, air, or other) and controls devices in accordance with position, motion, and / or body or hand gestures or movements. According to one embodiment, an operator or passenger uses the invention to control comfort or entertainment features such as the heater, air conditioner, lights, mirror positions, or the radio / CD player using hand gestures. An alternative embodiment facilitates the automatic adjustment of car seating restraints based on head position. Yet another embodiment is used to determine when to fire an airbag (and at what velocity or orientation) based on the position of a person in a vehicle seat. The invention may also be used to control systems outside of the vehicle. The on-board sensor system would be used to track the driver or passenger, but when the algorithms produce a command for a desired response, that response (or just position and gesture information) could be transmitted by various means (wireless, optical, or otherwise) to systems outside the vehicle to control devices located there. For example, this would allow a person to use gestures inside the car to interact with a kiosk located outside of the car.
Owner:JOLLY SEVEN SERIES 70 OF ALLIED SECURITY TRUST I

Mobile devices with motion gesture recognition

Inactive • US20090265671A1 • Flexible, varied, robust, and accurate recognition • Reduced processing • Input / output for user-computer interaction • Devices with sensor • Accelerometer • Operation mode
Mobile devices using motion gesture recognition. In one aspect, processing motion to control a portable electronic device includes receiving, on the device, sensed motion data derived from motion sensors of the device and based on device movement in space. The motion sensors include at least three rotational motion sensors and at least three accelerometers. A particular operating mode is determined to be active while the movement of the device occurs, the mode being one of multiple different operating modes of the device. Motion gesture(s) are recognized from the motion data from a set of motion gestures available for recognition in the active operating mode. Each of the different operating modes, when active, has a different set of gestures available. State(s) of the device are changed based on the recognized gestures, including changing output of a display screen on the device.
Owner:INVENSENSE
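The mode-dependent gesture sets can be sketched as a simple lookup: a gesture maps to a command only if the currently active operating mode lists it. The mode, gesture, and command names here are invented for illustration:

```python
# Hypothetical sketch: each operating mode exposes its own gesture set,
# and a gesture is recognised only when the active mode lists it.
GESTURES_BY_MODE = {
    "media": {"shake": "next_track", "tilt_left": "previous_track"},
    "phone": {"shake": "answer_call", "face_down": "mute"},
}

def recognize(mode, gesture):
    """Return the command for a gesture in the active operating mode,
    or None when the gesture is not available in that mode."""
    return GESTURES_BY_MODE.get(mode, {}).get(gesture)
```

Note how the same motion ("shake") changes the device state differently depending on which mode was active when the movement occurred, as the abstract describes.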

Gesture recognition apparatus, gesture recognition method, and gesture recognition program

A gesture recognition apparatus for recognizing postures or gestures of an object person based on images of the object person captured by cameras. The gesture recognition apparatus includes: a face / fingertip position detection means, which detects a face position and a fingertip position of the object person in three-dimensional space based on contour information and human skin region information produced from the captured images; and a posture / gesture recognition means, which detects changes of the fingertip position by a predetermined method, processes the detected results by a previously stored method, and determines and recognizes a posture or a gesture of the object person.
Owner:HONDA MOTOR CO LTD

Touch gesture based interface for motor vehicle

An improved apparatus and method are provided for operating devices and systems in a motor vehicle while at the same time reducing vehicle operator distractions. One or more touch sensitive pads are mounted on the steering wheel of the motor vehicle, and the vehicle operator touches the pads in a pre-specified synchronized pattern to perform functions such as controlling operation of the radio or adjusting a window. At least some of the touch patterns used to generate different commands may be selected by the vehicle operator. Usefully, the system of touch pad sensors and the signals generated thereby is integrated with speech recognition and / or facial gesture recognition systems, so that commands may be generated by synchronized multi-mode inputs.
Owner:WAYMO LLC

Gesture recognition method and touch system incorporating the same

A gesture recognition method includes detecting multiple pointers in close proximity to a touch surface to determine if the multiple pointers are being used to perform a known gesture. When the multiple pointers are being used to perform a known gesture, a command associated with the gesture is executed. A touch system incorporating the gesture recognition method is also provided.
Owner:PIXART IMAGING INC
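A minimal sketch of the multi-pointer idea, assuming two pointers whose start and end positions are already known: a growing or shrinking pointer separation is classified as a zoom gesture and mapped to a command. Gesture names, the threshold, and the command table are illustrative assumptions:

```python
import math

def classify_two_pointer_gesture(start, end, threshold=20.0):
    """Classify a two-pointer touch gesture from the pointers' start
    and end positions ((p1, p2) tuples). A separation change beyond
    the threshold is read as a zoom gesture; otherwise no gesture."""
    d0 = math.dist(start[0], start[1])
    d1 = math.dist(end[0], end[1])
    if d1 - d0 > threshold:
        return "zoom_in"
    if d0 - d1 > threshold:
        return "zoom_out"
    return None

# Hypothetical command table associated with the known gestures.
COMMANDS = {"zoom_in": "magnify view", "zoom_out": "shrink view"}

def execute(gesture):
    """Execute the command associated with a recognised gesture."""
    return COMMANDS.get(gesture, "no-op")
```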

User interface apparatus using hand gesture recognition and method thereof

Provided is a user interface apparatus that can control a telematics terminal safely and comfortably while driving by recognizing a hand gesture image, received through a camera in the telematics terminal, as a corresponding control signal, and a method thereof. The user interface apparatus using hand gesture recognition includes: an input receiving block for receiving a command registration request signal and a command selection signal; a hand gesture recognizing block for storing the hand gesture image in connection with a specific command, and for transforming the hand gesture image into the corresponding command by recognizing the hand gesture image from the image obtained in an image obtaining block; and a command performing block for performing an operation corresponding to the command transformed in the hand gesture recognizing block.
Owner:INTELLECTUAL DISCOVERY CO LTD

Gesture activated home appliance

An apparatus for operating home appliances using gesture recognition is disclosed. An image receiver receives a continuous stream of images of a gesture made within the image receiver's field of view. An image processor is connected to the image receiver for sampling the received stream of gesture images to form a gesture image sequence, and for recognizing whether the gesture image sequence corresponds to one of a predefined set of gestures. If the input gesture is recognized as being one of the predefined gestures, an operations processor, which is connected to the image processor, identifies a home appliance operation associated with the recognized gesture. An appliance controller is connected to the operations processor for causing the predefined home appliance operation to be performed. Home appliances and methods for operating home appliances using gesture recognition are also disclosed.
Owner:BELLSOUTH INTPROP COR

Object position detector with edge motion feature and gesture recognition

A method of generating a signal comprising providing a capacitive touch sensor pad including a matrix of X and Y conductors, developing capacitance profiles in one of an X direction and a Y direction from the matrix of X and Y conductors, determining an occurrence of a single gesture through an examination of the capacitance profiles, the single gesture including an application of at least two objects on the capacitive touch sensor pad, and generating a signal indicating the occurrence of the single gesture.
Owner:SYNAPTICS INC
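The profile-based detection can be sketched by counting peaks in a one-dimensional capacitance profile: two distinct peaks along the X (or Y) sensor traces suggest at least two objects on the pad, i.e. the single multi-object gesture the claim describes. The noise floor and sample data are illustrative assumptions:

```python
def count_profile_peaks(profile, noise_floor=5):
    """Count local maxima above a noise floor in a 1-D capacitance
    profile (one value per X or Y sensor trace). The noise floor is
    an illustrative assumption."""
    peaks = 0
    for i in range(1, len(profile) - 1):
        if (profile[i] > noise_floor
                and profile[i] >= profile[i - 1]
                and profile[i] > profile[i + 1]):
            peaks += 1
    return peaks

def is_two_object_gesture(x_profile):
    """Two or more peaks in a profile indicate a multi-object gesture."""
    return count_profile_peaks(x_profile) >= 2
```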

Hand gesture recognition input system and method for a mobile phone

A hand gesture recognition input apparatus and method are provided for a mobile phone. The input method of the present invention includes: collecting a plurality of images; storing the images as control images; mapping the control images to corresponding control commands; capturing an image taken by a camera as the current image; comparing the current image to the control images; selecting one of the control images as a target control image according to the comparison result; extracting the control command mapped to the target control image; and executing the control command.
Owner:SAMSUNG ELECTRONICS CO LTD
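The compare-and-select step can be sketched with a naive pixel-difference metric over equally sized grayscale frames. The patent does not specify the comparison method, so everything here is an illustrative assumption:

```python
def image_distance(img_a, img_b):
    """Sum of absolute pixel differences between two equally sized
    grayscale images, stored as flat lists of intensities."""
    return sum(abs(a - b) for a, b in zip(img_a, img_b))

def select_command(current, control_images, commands):
    """Pick the stored control image closest to the current camera
    frame and return the control command mapped to it."""
    best = min(control_images,
               key=lambda name: image_distance(current, control_images[name]))
    return commands[best]
```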

Gesture recognition system

A gesture recognition system includes: elements for detecting, and generating a signal corresponding to, a number of markers arranged on an object; elements for processing the signal from the detecting elements; and members for detecting the position of the markers in the signal. The markers are divided into first and second sets of markers, the first set constituting a reference position, and the system comprises elements for detecting movement of the second set of markers and generating a signal when that movement is a valid movement with respect to the reference position.
Owner:CREDOBLE

Gesture recognition using plural sensors

An apparatus comprises a processor; a user interface enabling user interaction with one or more software applications associated with the processor; first and second sensors configured to detect, and generate signals corresponding to, objects located within respective first and second sensing zones remote from the apparatus, wherein the sensors are configured such that their respective sensing zones overlap spatially to define a third, overlapping zone in which both the first and second sensors are able to detect a common object; and a gesture recognition system for receiving signals from the sensors, the gesture recognition system being responsive to detecting an object inside the overlapping zone to control a first user interface function in accordance with signals received from both sensors.
Owner:NOKIA TECHNOLOGIES OY

Audible list traversal

Many embodiments may comprise logic, such as hardware and / or code, to implement a user interface for traversal of long sorted lists via audible mapping of the lists, using sensor-based gesture recognition, audio and tactile feedback, and button selection while on the go. In several embodiments, such user interface modalities are physically small in size, enabling a user to be truly mobile by reducing the cognitive load required to operate the device. For some embodiments, the user interface may be divided across multiple worn devices, such as a mobile device, watch, earpiece, and ring. Rotation of the watch may be translated into navigation instructions, allowing the user to traverse the list while receiving audio feedback via the earpiece that describes items in the list as well as the navigation state. Many embodiments offer the user a simple user interface to traverse the list without visual feedback.
Owner:INTEL CORP

Wireless control device

A wireless control device includes a small, lightweight housing worn by an operator, for example on the operator's wrist, and a controlled device, for example a personal computer. Several optical emitters, preferably light emitting diodes operating in the infrared range, and several optical detectors are provided on the housing. At least one x-axis emitter-detector pair operates to detect an x-direction of a pointing motion or gesture, and at least one y-axis emitter-detector pair operates to detect a y-direction of a pointing motion or gesture. This motion can then be used to cause a response in the controlled device. For example, angles of the operator's hand at the wrist can be interpreted to induce motion of a cursor on a computer display. The device may also include a motion sensor, an environmental condition sensor, or a voice recognition sensor, and can also be adapted for gesture recognition and image scanning applications.
Owner:HARMONIC RES

Gesture Recognition

A state machine gesture recognition algorithm for interpreting streams of coordinates received from a touch sensor. The gesture recognition code can be written in a high level language such as C and then compiled and embedded in a microcontroller chip, or CPU chip as desired. The gesture recognition code can be loaded into the same chip that interprets the touch signals from the touch sensor and generates the time series data, e.g. a microcontroller, or other programmable logic device such as a field programmable gate array.
Owner:SOLAS OLED LTD
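A toy version of such a state machine, consuming a stream of touch coordinates (with None meaning the finger lifted) and distinguishing a tap from a swipe. It is written in Python for readability, although per the abstract a production version would be C compiled into the touch controller; the states and threshold are illustrative assumptions:

```python
# Hypothetical two-state recogniser over a stream of touch samples.
IDLE, TOUCHING = "idle", "touching"

class SwipeRecognizer:
    def __init__(self, min_travel=30):
        self.state = IDLE
        self.start = None
        self.last = None
        self.min_travel = min_travel   # illustrative swipe threshold

    def feed(self, point):
        """Feed one (x, y) sample or None; return 'swipe', 'tap',
        or None when no gesture has completed yet."""
        if self.state == IDLE and point is not None:
            self.state, self.start, self.last = TOUCHING, point, point
        elif self.state == TOUCHING and point is not None:
            self.last = point
        elif self.state == TOUCHING and point is None:
            self.state = IDLE          # finger lifted: classify stroke
            dx = self.last[0] - self.start[0]
            dy = self.last[1] - self.start[1]
            travel = (dx * dx + dy * dy) ** 0.5
            return "swipe" if travel >= self.min_travel else "tap"
        return None
```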

Gesture recognizer system architecture

Systems, methods, and computer readable media are disclosed for a gesture recognizer system architecture. A recognizer engine is provided, which receives user motion data and provides that data to a plurality of filters. Each filter corresponds to a gesture and may be tuned by an application receiving information from the gesture recognizer, so that the specific parameters of the gesture (such as an arm acceleration for a throwing gesture) may be set on a per-application level, or multiple times within a single application. Each filter may output to an application using it a confidence level that the corresponding gesture occurred, as well as further details about the user motion data.
Owner:MICROSOFT TECH LICENSING LLC
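The filter-per-gesture architecture can be sketched as follows: each filter exposes tunable parameters (here a hypothetical minimum arm acceleration for a throw) and reports a confidence level, and the engine fans motion data out to every registered filter. All names and the confidence formula are illustrative assumptions:

```python
class ThrowFilter:
    """Filter for a throwing gesture; min_acceleration is the
    per-application tunable parameter."""
    def __init__(self, min_acceleration=8.0):
        self.min_acceleration = min_acceleration

    def confidence(self, motion):
        """Confidence in [0, 1] that the motion was a throw."""
        accel = motion.get("arm_acceleration", 0.0)
        return min(1.0, accel / self.min_acceleration)

class RecognizerEngine:
    """Receives user motion data and fans it out to all filters."""
    def __init__(self):
        self.filters = {}

    def register(self, name, gesture_filter):
        self.filters[name] = gesture_filter

    def recognize(self, motion):
        """Return {gesture name: confidence} across registered filters."""
        return {name: f.confidence(motion) for name, f in self.filters.items()}
```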

Handheld computer systems and techniques for character and command recognition related to human movements

Systems and methods for human hand gesture recognition through a training mode and a recognition mode are disclosed. In the training mode, a user can move a handheld device with a hand gesture intended to represent a command. Sensors within the handheld device can record raw data, which can be processed to obtain a set of values corresponding to a set of discrete features; this set is stored in a database and associated with the intended command. The process is repeated for various hand gestures representing different commands. In the recognition mode, the user can move the handheld device with a hand gesture. A computer system can compare a set of values, corresponding to a set of discrete features derived from the hand gesture, with the sets of values stored in the database, select the command with the closest match, and display and / or execute the command.
Owner:INVENSENSE
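The two modes can be sketched as a template database: training stores one feature set per command, and recognition returns the command whose stored features are the nearest Euclidean match. Feature extraction from the raw sensor data is assumed to happen upstream, and all names are illustrative:

```python
import math

class GestureDatabase:
    """Training mode stores one feature vector per command; recognition
    mode returns the command whose stored vector is closest (Euclidean
    distance) to the features of a new movement."""
    def __init__(self):
        self.templates = {}

    def train(self, command, features):
        """Associate a command with the features of a training gesture."""
        self.templates[command] = list(features)

    def recognize(self, features):
        """Select the command with the closest stored match."""
        return min(self.templates,
                   key=lambda cmd: math.dist(self.templates[cmd], features))
```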

Recognizing input gestures

The present invention extends to methods, systems, and computer program products for recognizing input gestures. A neural network is trained using example inputs and backpropagation to recognize specified input patterns. Input gesture data is representative of movements in contact on a multi-touch input display surface relative to one or more axes over time. Example inputs used for training the neural network to recognize a specified input pattern can be created from sampling input gesture data for example input gestures known to represent the specified input pattern. Trained neural networks can subsequently be used to recognize input gestures that are similar to known input gestures as the specified input pattern corresponding to the known input gestures.
Owner:MICROSOFT TECH LICENSING LLC
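As a single-unit illustration of the trained-by-backpropagation idea, the sketch below fits one logistic unit by gradient descent on toy gesture feature vectors. This is the degenerate one-layer case of backpropagation; a real recognizer would use a multi-layer network and features sampled from multi-touch movement over time:

```python
import math

def train_gesture_unit(examples, labels, epochs=500, lr=0.5):
    """Train one logistic unit by gradient descent on (feature vector,
    0/1 label) pairs - the single-layer special case of the
    error-backpropagation scheme the abstract describes."""
    w = [0.0] * len(examples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(examples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))     # sigmoid activation
            err = p - y                        # cross-entropy gradient
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Classify a feature vector as matching the pattern (1) or not (0)."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if z > 0 else 0
```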
