
Modular mobile connected pico projectors for a local multi-user collaboration

A mobile, multi-user technology in the field of augmented and virtual reality systems, addressing the high cost of large collaboration equipment and its limited support for collaboration between users in remote locations.

Publication Date: 2012-10-04 (Inactive)
QUALCOMM INC
Cites: 6 · Cited by: 428

AI Technical Summary

Benefits of technology

The patent describes a method and system for projecting images onto a surface using two pico projectors that can recognize body parts and adjust the shape and orientation of the projected images accordingly. The system includes head or body mounted devices with cameras and displays, and a server for transmitting virtual objects to the devices. The technical effects of this invention include improved precision and accuracy in image projection, as well as improved user experience through the recognition of body parts and the ability to interact with virtual objects.
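As a rough, non-authoritative illustration of the arrangement summarized above, the Python sketch below models two pico projector units that each capture a scene, recognize body parts, and adjust a projected image's orientation accordingly, while a server-style function transmits the same virtual object to both devices. Every name and rule here (ProjectorUnit, orientation_toward_user, the proportional tilt rule) is a hypothetical stand-in, not the patent's actual implementation.

from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class BodyPart:
    label: str                      # e.g. "head", "hand"
    position: Tuple[float, float]   # normalized location in the captured scene


@dataclass
class ProjectorUnit:
    """One head- or body-mounted pico projector with a camera (hypothetical)."""
    name: str

    def capture_scene(self) -> List[BodyPart]:
        # Placeholder: a real device would run detection on camera frames.
        return [BodyPart("head", (0.5, 0.2)), BodyPart("hand", (0.7, 0.8))]

    def project(self, virtual_object: str, orientation_deg: float, shape: str) -> None:
        print(f"{self.name}: projecting {virtual_object!r} "
              f"rotated {orientation_deg:.0f} deg as a {shape}")


def orientation_toward_user(parts: List[BodyPart]) -> float:
    """Toy rule: tilt the projected image toward the recognized head."""
    heads = [p for p in parts if p.label == "head"]
    if not heads:
        return 0.0
    x, _ = heads[0].position
    return (x - 0.5) * 90.0   # proportional rule, purely illustrative


def serve_virtual_object(projectors: List[ProjectorUnit], virtual_object: str) -> None:
    """Server role: transmit the same virtual object to every connected device."""
    for unit in projectors:
        parts = unit.capture_scene()
        angle = orientation_toward_user(parts)
        unit.project(virtual_object, angle, shape="rectangle")


if __name__ == "__main__":
    serve_virtual_object(
        [ProjectorUnit("projector-A"), ProjectorUnit("projector-B")],
        virtual_object="shared whiteboard",
    )

Running the sketch simply prints what each unit would project, which is enough to show how body-part recognition feeds the projection parameters.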

Problems solved by technology

Such large collaboration equipment is often very expensive and provides limited support for collaboration between users in remote locations.

Method used




Embodiment Construction

[0052]The various embodiments will be described in detail with reference to the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. References made to particular examples and implementations are for illustrative purposes, and are not intended to limit the scope of the invention or the claims.

[0053]The word “exemplary” is used herein to mean “serving as an example, instance, or illustration.” Any implementation described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other implementations.

[0054]As used herein, the terms “mobile device” and “handheld device” refer to any one of cellular telephones, smartphones, tablet computers, personal data assistants (PDA's), wireless electronic mail receivers, multimedia Internet enabled cellular telephones, Global Positioning System (GPS) receivers, wireless gaming controllers, netbooks, and similar personal electr...



Abstract

The various embodiments include systems and methods for rendering images in a virtual or augmented reality system that may include capturing scene images of a scene in a vicinity of a first and a second projector, capturing spatial data with a sensor array in the vicinity of the first and second projectors, analyzing captured scene images to recognize body parts, and projecting images from each of the first and the second projectors with a shape and orientation determined based on the recognized body parts. Additional rendering operations may include tracking movements of the recognized body parts, applying a detection algorithm to the tracked movements to detect a predetermined gesture, applying a command corresponding to the detected predetermined gesture, and updating the projected images in response to the applied command.
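The operations listed in the abstract read naturally as a per-frame loop: capture scene images and spatial data, recognize body parts, project with a matching shape and orientation, then track movement, detect a predetermined gesture, and apply the corresponding command. The sketch below is one hypothetical ordering of those steps in Python; every function, the drift-based "swipe" detector, and the GESTURE_COMMANDS mapping are assumptions made only for illustration, not the claimed algorithm.

from typing import Dict, List, Tuple

# Placeholder detectors and actuators; the patent does not specify these APIs.

def capture_scene_images() -> List[str]:
    return ["frame_from_projector_1", "frame_from_projector_2"]

def capture_spatial_data() -> Dict[str, float]:
    return {"distance_to_surface_m": 1.2}

def recognize_body_parts(frames: List[str], t: int) -> List[Tuple[str, Tuple[float, float]]]:
    # The hand drifts right over time so the demo eventually triggers a swipe.
    return [("hand", (0.3 + 0.2 * t, 0.4))]

def detect_gesture(track: List[Tuple[float, float]]) -> str:
    # Toy rule: a mostly horizontal sweep counts as a predetermined "swipe".
    if len(track) >= 2 and abs(track[-1][0] - track[0][0]) > 0.3:
        return "swipe"
    return "none"

# Hypothetical mapping from detected gestures to commands on the projected content.
GESTURE_COMMANDS = {"swipe": "next_page", "none": "no_op"}

def render_frame(t: int, hand_track: List[Tuple[float, float]]) -> str:
    frames = capture_scene_images()
    spatial = capture_spatial_data()
    parts = recognize_body_parts(frames, t)

    # Project with a shape and orientation chosen from the recognized body parts
    # and spatial data (a print stands in for real projector output).
    print(f"projecting toward {parts[0][0]} at {spatial['distance_to_surface_m']} m")

    # Track movement of the recognized hand and test for a predetermined gesture.
    hand_track.append(parts[0][1])
    command = GESTURE_COMMANDS[detect_gesture(hand_track)]
    print(f"applying command: {command}")
    return command

if __name__ == "__main__":
    track: List[Tuple[float, float]] = []
    for step in range(3):
        render_frame(step, track)

On the third frame the tracked hand has moved far enough for the toy detector to report a "swipe", so the loop applies the mapped "next_page" command and the projected images would be updated in response.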

Description

CROSS REFERENCE TO RELATED PATENT APPLICATIONS

[0001]This patent application claims the benefit of priority to U.S. Provisional Patent Application Ser. No. 61/468,937 entitled “Systems and Methods for Gesture Driven Interaction for Digitally Augmented Physical Spaces” filed on Mar. 29, 2011, the entire contents of which are hereby incorporated by reference for all purposes.

[0002]This patent application is also related to U.S. patent application Ser. No. ______ entitled “Anchoring Virtual Images to Real World Surfaces In Augmented Reality Systems” filed on ______, U.S. patent application Ser. No. ______ entitled “Cloud Storage Of Geotagged Maps” filed on ______, U.S. patent application Ser. No. ______ entitled “Selective Hand Occlusion Over Virtual Projections onto Physical Surfaces Using Skeletal Tracking” filed on ______, and U.S. patent application Ser. No. ______ entitled “System For The Rendering Of Shared Digital Interfaces Relative To Each User's Point Of View” filed on ______.

FIELD...

Claims


Application Information

Patent Type & Authority Applications(United States)
IPC IPC(8): G09G5/00
CPCG06F3/011G06F3/017G06F3/0425G06T15/503G06T2219/024H04N9/3173G06T2215/16G06F3/147G09G2354/00G06F3/167G06T19/006H04N2013/0081G06T19/00G09G5/00H04N5/74G06T17/05
Inventor MACIOCCI, GIULIANOEVERITT, ANDREW J.MABBUTT, PAULBERRY, DAVID T.
Owner QUALCOMM INC