Gesture recognition using plural sensors

A gesture recognition and sensor technology, applied in the field of gesture recognition, addressing problems such as the limited ability of cameras to interpret three-dimensional gestures and the difficulty cameras have with gestures that involve movement toward or away from them.

Publication Date: 2014-01-08 (Inactive)
NOKIA TECHNOLOGIES OY

AI Technical Summary

Problems solved by technology

[0003] However, cameras tend to have a limited optical sensing area, or field of view, and, because of the way they operate, they have difficulty interpreting certain gestures, especially those involving movement toward or away from the camera. As a result, their ability to interpret three-dimensional gestures is very limited.


Detailed Description of Embodiments

[0039] Embodiments described herein include devices or terminals, especially communication terminals, that use supplemental sensors to provide information characterizing the environment surrounding the terminal. Specifically, the sensors provide information from which gestures are identified: the information is processed to identify an object within each sensor's corresponding sensing zone, together with that object's motion.

[0040] Depending on whether an object is detected by only one sensor or by two sensors, a corresponding command or set of commands is used to control user interface functions of the terminal, e.g. to control some aspect of the terminal's operating system or of applications associated with the operating system. Information corresponding to objects detected by only one sensor is processed to execute a first command or first set of commands, whereas information corresponding to objects detected by two or more sensors is processed to execute a second command or second set of commands. ...
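As a rough sketch of the dispatch described in paragraphs [0039] and [0040], the following example selects a first or second command group depending on whether an object is reported by one sensor or by both. All names and types here are invented for illustration; the patent does not define an implementation or API.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    seen_by_first: bool   # object reported by the first sensor
    seen_by_second: bool  # object reported by the second sensor

# Hypothetical command-group identifiers, not taken from the patent.
FIRST_COMMAND_GROUP = "first_command_group"    # single-sensor gestures
SECOND_COMMAND_GROUP = "second_command_group"  # gestures seen by both sensors

def select_command_group(d: Detection) -> str:
    """One sensor -> first command group; two (or more) sensors -> second group."""
    if d.seen_by_first and d.seen_by_second:
        return SECOND_COMMAND_GROUP
    if d.seen_by_first or d.seen_by_second:
        return FIRST_COMMAND_GROUP
    return "no_command"

# Example: an object detected by both sensors drives the second command group.
print(select_command_group(Detection(seen_by_first=True, seen_by_second=True)))
```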


Abstract

Apparatus comprises a processor; a user interface enabling user interaction with one or more software applications associated with the processor; first and second sensors configured to detect, and generate signals corresponding to, objects located within respective first and second sensing zones remote from the apparatus, wherein the sensors are configured such that their respective sensing zones overlap spatially to define a third, overlapping, zone in which both the first and second sensors are able to detect a common object; and a gesture recognition system for receiving signals from the sensors, the gesture recognition system being responsive to detecting an object inside the overlapping zone to control a first user interface function in accordance with signals received from both sensors.
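The abstract does not prescribe how the sensing zones are represented. As one purely illustrative model, each zone can be treated as a cone attached to its sensor, and the "third, overlapping, zone" is simply the region where both cones contain the object:

```python
import math
from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class ConeZone:
    apex: Vec3             # sensor position on the apparatus (assumed model)
    axis: Vec3             # unit vector along the centre of the sensing zone
    half_angle_deg: float  # angular half-width of the zone
    max_range: float       # maximum detection distance

    def contains(self, point: Vec3) -> bool:
        """True if the point lies inside this sensor's sensing zone."""
        v = tuple(p - a for p, a in zip(point, self.apex))
        dist = math.sqrt(sum(c * c for c in v))
        if dist == 0.0 or dist > self.max_range:
            return False
        cos_to_axis = sum(vc * ac for vc, ac in zip(v, self.axis)) / dist
        return cos_to_axis >= math.cos(math.radians(self.half_angle_deg))

def in_overlapping_zone(point: Vec3, first: ConeZone, second: ConeZone) -> bool:
    """The overlapping zone: both sensors are able to detect the object."""
    return first.contains(point) and second.contains(point)

# Example: two sensors a few centimetres apart, both aimed along +z.
s1 = ConeZone(apex=(-0.03, 0.0, 0.0), axis=(0.0, 0.0, 1.0), half_angle_deg=30.0, max_range=0.5)
s2 = ConeZone(apex=( 0.03, 0.0, 0.0), axis=(0.0, 0.0, 1.0), half_angle_deg=30.0, max_range=0.5)
print(in_overlapping_zone((0.0, 0.0, 0.2), s1, s2))  # True: inside both zones
```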

Description

Technical field

[0001] The present invention relates generally to gesture recognition, and particularly, but not exclusively, to recognizing gestures detected by a first sensor and a second sensor of a device or terminal.

Background

[0002] It is known to use video data received by a camera of a communication terminal to enable a user to control applications associated with the terminal. The application stores a mapping between predetermined user gestures detected using the camera and one or more commands associated with the application. For example, a known photo-browsing application lets the user control how photos are displayed on the user interface with waving gestures performed in front of the terminal's front camera, a right-to-left gesture generally causing the application to advance through the photos in order.

[0003] However, cameras tend to have a limited optical sensing area or field of view, and because of the way they operate, they have difficulty inter...
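Paragraph [0002] says the known application stores a mapping between predetermined gestures and commands. A toy illustration of such a mapping (the gesture labels and command names below are invented, not taken from the patent) could be:

```python
# Illustrative gesture-to-command table for a photo-browsing application.
GESTURE_TO_COMMAND = {
    "wave_right_to_left": "show_next_photo",      # per the example in [0002]
    "wave_left_to_right": "show_previous_photo",
}

def handle_gesture(gesture: str) -> str:
    """Look up the application command for a recognised gesture."""
    return GESTURE_TO_COMMAND.get(gesture, "ignore")

print(handle_gesture("wave_right_to_left"))  # -> show_next_photo
```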


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F3/01; G06F3/0354; G06F3/0487; G01S17/87; G06K9/00; G06V10/70; G06V20/40; G06V20/64; G06V40/10; G06V40/20
CPC: G06F3/0488; G06F3/017; G06F2203/04101; G06F2203/04106
Inventor: 汪孔桥 (Kongqiao Wang); J·P·J·奥利凯南 (J. P. J. Ollikainen)
Owner: NOKIA TECHNOLOGIES OY