Face tracking for additional modalities in spatial interaction

A spatial interaction and face-tracking technology, applied in input/output arrangements for user/computer interaction, mechanical mode conversion, computer components, etc.

Active Publication Date: 2016-06-15
QUALCOMM INC

AI Technical Summary

Problems solved by technology

Using these and other user devices in an augmented reality environment can be challenging.

Examples

Embodiment Construction

[0019] For example, the techniques described herein include mechanisms for interaction in an augmented reality environment between a screen side of a user device and a camera side of the user device. As used herein, the term "augmented reality" means any environment that combines real-world images with computer-generated data and superimposes graphics, audio, and other sensory input onto the real world.
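
As a rough illustration of the definition above (not the patent's own implementation), the following Python/OpenCV sketch superimposes computer-generated graphics onto a live real-world camera frame; the camera index, the overlay content, and the blending call are assumptions chosen only for the example.

    # Minimal sketch of "augmented reality" as defined above: a real-world
    # camera image combined with computer-generated graphics. Camera index,
    # overlay content, and blend weights are illustrative assumptions.
    import cv2
    import numpy as np

    cap = cv2.VideoCapture(0)  # target-side (rear) camera, index assumed

    while True:
        ok, frame = cap.read()
        if not ok:
            break

        # Computer-generated layer standing in for the application's virtual content.
        overlay = np.zeros_like(frame)
        cv2.rectangle(overlay, (50, 50), (300, 200), (0, 255, 0), 2)
        cv2.putText(overlay, "virtual annotation", (60, 90),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 255, 0), 2)

        # Superimpose the generated graphics onto the real-world image.
        augmented = cv2.addWeighted(frame, 1.0, overlay, 1.0, 0)

        cv2.imshow("augmented view", augmented)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()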

[0020] Operating a user device in an augmented reality environment can be challenging because, among other requirements, the user device should remain spatially aligned toward the augmented reality scene. However, because the user typically holds the user device with both hands, the available input modalities are limited, since on-screen menus, tabs, and widgets must also be accessed using the user's hands.

[0021] In one aspect, cameras in a user device receive a constant image stream from a user side (or front) of the user device and an image stream from a target side (or rear) of...
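
Although paragraph [0021] is cut off here, the two simultaneous streams it begins to describe can be sketched roughly as follows; the use of OpenCV and the camera indices assigned to the user-side and target-side cameras are assumptions made purely for illustration.

    # Sketch of consuming a user-side (front) and a target-side (rear) image
    # stream at the same time; camera indices are platform-dependent assumptions.
    import cv2

    front_cam = cv2.VideoCapture(1)  # user side (front-facing), index assumed
    rear_cam = cv2.VideoCapture(0)   # target side (rear-facing), index assumed

    def next_frame_pair():
        """Return the latest (front, rear) frame pair, or None on failure."""
        ok_f, front = front_cam.read()
        ok_r, rear = rear_cam.read()
        if not (ok_f and ok_r):
            return None
        return front, rear

    # Example: grab one pair of frames from both sides of the device.
    pair = next_frame_pair()
    if pair is not None:
        front_frame, rear_frame = pair
        print("front:", front_frame.shape, "rear:", rear_frame.shape)

    front_cam.release()
    rear_cam.release()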

Abstract

A user device receives an image stream from the user side of the user device and an image stream from the target side of the user device. The user device acquires a coordinate system for the user, acquires its own coordinate system, and relates the two coordinate systems to a global coordinate system. The user device then determines whether the user has moved and/or whether the user device has moved. Movement of the user and/or of the user device is then used as an input modality to control the user's interactions in the augmented reality environment.
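
A highly simplified sketch of the flow the abstract describes is given below; the use of 4x4 homogeneous transforms for the coordinate systems, the movement thresholds, and all function names are assumptions for illustration, not the claimed implementation.

    # Rough sketch of the abstract's flow: relate the user's and the device's
    # coordinate systems to a global frame, detect which of them has moved,
    # and map that movement to an input modality. Pose representation and
    # thresholds are illustrative assumptions.
    import numpy as np

    def relate_to_global(device_in_global, user_in_device):
        """Chain 4x4 transforms: pose of the user expressed in the global frame."""
        return device_in_global @ user_in_device

    def has_moved(prev_pose, curr_pose, trans_thresh=0.01, rot_thresh=0.02):
        """Detect movement between two 4x4 poses (meters / radians assumed)."""
        translation = np.linalg.norm(curr_pose[:3, 3] - prev_pose[:3, 3])
        relative_rot = prev_pose[:3, :3].T @ curr_pose[:3, :3]
        angle = np.arccos(np.clip((np.trace(relative_rot) - 1.0) / 2.0, -1.0, 1.0))
        return translation > trans_thresh or angle > rot_thresh

    def classify_input(prev_user_g, user_g, prev_device_g, device_g):
        """Decide which input modality, if any, the latest motion represents."""
        user_moved = has_moved(prev_user_g, user_g)
        device_moved = has_moved(prev_device_g, device_g)
        if user_moved and device_moved:
            return "combined motion input"
        if user_moved:
            return "user motion input"     # e.g. the user's face moved
        if device_moved:
            return "device motion input"   # e.g. the device was re-aimed
        return "no input"

For example, if face tracking on the user-side stream reports that the user moved while the device pose stayed fixed, classify_input would report user motion, which the application could then map to an on-screen interaction.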

Description

[0001] Claim of priority under 35 U.S.C. §119

[0002] This patent application claims priority to Provisional Application No. 61/902,025, entitled "SPATIAL INTERACTION USING FACE TRACKING," filed November 8, 2013 by the same inventor as this application, which provisional application is assigned to the assignee hereof and is hereby expressly incorporated by reference in its entirety.

Background

[0003] Spatial interaction using handheld user devices is becoming more popular as a significant number of users choose them as point-and-shoot devices. However, using these user devices typically requires the user to hold the device with a two-handed grip, even for lightweight user devices such as tablet computers, phablet phones, smart phones, and the like. Using these and other user devices in an augmented reality environment can be even more challenging.

Summary of the invention

[0004] In general, one implementation of the subject matter disclosed herein includ...

Application Information

IPC(8): G06F3/01, G06F3/0481, G06T19/00
CPC: G06F3/012, G06F3/04815, G06F3/013, G06V40/176, G06V40/19, G06F3/011
Inventor Hartmut Seichter
Owner QUALCOMM INC