Apparatus and method for indicating target by image processing without three-dimensional modeling

A target-indication and image-processing technology, applied in the field of image recognition, that can solve problems such as cumbersome assembly of multi-camera systems.

Publication Date: 2002-11-20 (Inactive)
KONINKLIJKE PHILIPS ELECTRONICS NV

AI Technical Summary

Problems solved by technology

Such a setup is cumbersome to assemble because it requires multiple cameras...



Examples


Embodiment Construction

[0036] Referring to FIG. 1, a user 30 indicates a target 25 located on a plane such as a television or projection screen 10, or a wall (not shown). Combining the images from the two cameras 35 and 40 in the manner described below makes it possible to identify the target location in the image of either camera. The figure shows the user 30 pointing at the target 25 with a pointing gesture. It has been determined experimentally that a user's natural pointing gesture places the fingertip, the user's right (or left) eye, and the target on a straight line. Consequently, in the field of view of either camera, the planar projection of the target lies on the planar projection of the straight line defined by the user's eye and fingertip. In the present invention, the two planar projections are transformed into a common planar projection, which may be the image plane of either camera 35 or 40, or any third plane.
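The transformation into a common planar projection is a plane-to-plane linear map (a homography). As a rough illustrative sketch only, not taken from the patent, the following Python/NumPy fragment estimates such a homography from four registration points on the screen plane (here assumed, purely for illustration, to be the screen corners detected in each camera image; the function name and pixel values are hypothetical):

```python
import numpy as np

def homography_from_points(src_pts, dst_pts):
    """Estimate the 3x3 homography H with dst ~ H @ [x, y, 1] from four
    (or more) point correspondences, using the standard DLT formulation."""
    rows = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    A = np.asarray(rows, dtype=float)
    # The flattened homography is the right singular vector of A
    # associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

# Hypothetical registration points: the four screen corners as seen by
# camera 1 and by camera 2 (pixel coordinates invented for illustration).
corners_cam1 = [(102, 80), (538, 95), (530, 410), (95, 398)]
corners_cam2 = [(160, 120), (600, 100), (612, 430), (170, 452)]
H12 = homography_from_points(corners_cam1, corners_cam2)  # camera 1 -> camera 2
```

With exactly four correspondences the homography is determined up to scale; additional registration points would only make the least-squares estimate more robust.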

[0037] Referring to FIG. 2, the cameras are aimed so that they each ...



Abstract

Using a pair of cameras, the planar coordinates of a target the user is pointing at can be obtained without three-dimensional modeling, using data derived from the images rather than 3D scene data. Each camera views at least four registration points on the plane, as well as an indication of the direction in which the target lies. A linear transformation maps the planar projection of the direction indication from the first image into the second image. In the second image, the target coordinates follow from the intersection of the direction projection observed in the second image with the transformed projection from the first image. In another embodiment, more generally, both directions are mapped into a third frame of reference or image by respective linear transformations. An application of the system allows a user to indicate a location on a projection or television screen with a simple pointing gesture. The system can be set up quickly because no information about the camera positions is required.
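Continuing the earlier sketch under the same assumptions (the homography H12 from homography_from_points, plus eye and fingertip image coordinates supplied by some external detector), the pointing line seen by the first camera can be mapped into the second camera's image and intersected with the second camera's own pointing line, yielding the target coordinates in the second image:

```python
import numpy as np

def hom(p):
    """Lift an (x, y) image point to homogeneous coordinates."""
    return np.array([p[0], p[1], 1.0])

def apply_h(H, p):
    """Apply homography H to an (x, y) point and dehomogenize."""
    q = H @ hom(p)
    return q[:2] / q[2]

def target_in_camera2(H12, eye1, tip1, eye2, tip2):
    """Target location in camera 2's image plane.

    H12 maps screen-plane points from camera 1's image into camera 2's
    image; eye*/tip* are detected eye and fingertip image coordinates.
    """
    # A homography maps collinear points to collinear points, so mapping the
    # eye and fingertip projections yields the transformed pointing line.
    l1 = np.cross(hom(apply_h(H12, eye1)), hom(apply_h(H12, tip1)))
    # Pointing line observed directly in camera 2's image.
    l2 = np.cross(hom(eye2), hom(tip2))
    # Two homogeneous lines intersect at their cross product.
    x = np.cross(l1, l2)
    return x[:2] / x[2]
```

The variant in which both direction projections are mapped into a third reference frame works the same way, with one homography per camera applied before the intersection step.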

Description

[0001] Cross-Reference to Related Applications

[0002] This application is related to the following applications, which are hereby incorporated by reference in their entirety:

[0003] US Application 09/488,028, "Multi-Model Video Object Acquisition and Redirection System and Method," filed 1/20/2000;

[0004] US Application 09/532,820; and

[0005] US Application 08/995,823, "Method and System for Selectable Option-Based Gestures," filed 12/22/2000.

Technical Field

[0006] The present invention relates to image recognition, and in particular to the recognition of pointing gestures that indicate target points on a plane without three-dimensional modeling of the scene.

Background Art

[0007] Many applications can benefit from gesture recognition. For example, it is a natural mechanism for controlling the aiming of a camera. Another application is using gestures to move an on-screen cursor. For example, it is conceivable to make selections on future smart ...


Application Information

IPC (IPC8): G06F3/033; G06T1/00; G06F3/00; G06F3/01; G06F3/0346; G06F3/042; G06T7/60
CPC: A63F2300/1093; G06F3/017; G06F3/0304; G06F3/0325
Inventor: D. Weinshall, M. S. Lee
Owner: KONINKLIJKE PHILIPS ELECTRONICS NV