Cloud storage of geotagged maps

A geotagged map combined with cloud technology, applied in the field of augmented or virtual reality systems, can address the problems of limited collaboration support and the high cost of large display items.

Inactive Publication Date: 2012-10-04
QUALCOMM INC

AI Technical Summary

Problems solved by technology

Often such large items are very expensive and provide limited support for collaboration between users in remote locations.



Examples


Third embodiment

[0112] In an embodiment, the processor within or coupled to the head mounted device 10a may capture an image with a head mounted or body mounted camera or cameras, which may be full-color video cameras. Distances to objects within the imaged scene may be determined via trigonometric processing of two (or more) images obtained via a stereo camera assembly. Alternatively or in addition, the head mounted device may obtain spatial data (i.e., distances to objects in the images) using a distance sensor which measures distances from the device to objects and surfaces in the image. In an embodiment, the distance sensor may be an infrared light emitter (e.g., laser or diode) and an infrared sensor. In this embodiment, the head mounted device may project infrared light as pulses or structured light patterns which reflect from objects within the field of view of the device's camera. The reflected infrared laser light may be received in a sensor, and spatial data may be calculated based on a mea...
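The trigonometric processing of two images mentioned above can be illustrated with the standard stereo triangulation relation for a rectified camera pair: depth Z = f × B / d, where f is the focal length in pixels, B is the baseline between the two cameras, and d is the disparity of a matched feature between the images. The following sketch is illustrative only; the function name and parameter values are assumptions, not taken from the application.

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Distance to an object from its disparity in a rectified stereo pair.

    Z = f * B / d: larger disparity means the object is closer.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive; zero disparity means infinity")
    return focal_px * baseline_m / disparity_px

# Example (hypothetical values): 700 px focal length, 6 cm baseline,
# 20 px disparity -> 700 * 0.06 / 20 = 2.1 m
distance = stereo_depth(700.0, 0.06, 20.0)
print(distance)  # 2.1
```

A real head mounted device would first rectify the camera pair and match features (or dense pixels) to obtain the disparity; the formula above is the final triangulation step.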

Embodiment

[0222] Method 2100 enables collaboration and the sharing of resources to minimize the amount of processing performed by the head mounted devices themselves. In method 2100, the processor may commence operation by receiving an input request to collaborate from a first head mounted device that may be running an application in block 2101. In block 2102, the processor may initiate a peer to peer search for nearby devices for collaboration. In determination block 2103, the processor may determine whether to collaborate with discovered devices. If so (i.e., determination block 2103=“Yes”), the processor may create a connection between the devices. The processor may collaborate using a two way communication link. The communication link may be formed between the first and the second head mounted devices 10 and 10b in block 2104.

[0223] The processor may access a directory in a server. Using the directory, the processor may determine whether other users are available for collaboration in block 2105 by scanning...
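The control flow of method 2100 described above (request, peer discovery via a directory, the "collaborate?" decision in block 2103, and the two-way link in block 2104) can be sketched as follows. All names, identifiers, and the dictionary-based directory are hypothetical illustrations, not the application's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class HeadMountedDevice:
    device_id: str
    links: set = field(default_factory=set)  # two-way communication links

def discover_peers(directory: dict, requester: str) -> list:
    """Blocks 2102/2105: scan a server directory for other available devices."""
    return [dev for dev, available in directory.items()
            if available and dev != requester]

def collaborate(first: HeadMountedDevice, second: HeadMountedDevice) -> None:
    """Block 2104: form a two-way communication link between the devices."""
    first.links.add(second.device_id)
    second.links.add(first.device_id)

# Hypothetical directory: device id -> availability
directory = {"hmd-10": True, "hmd-10b": True, "hmd-10c": False}
peers = discover_peers(directory, "hmd-10")  # only hmd-10b is available

first, second = HeadMountedDevice("hmd-10"), HeadMountedDevice("hmd-10b")
if peers:  # determination block 2103 = "Yes"
    collaborate(first, second)
```

The point of the directory lookup is that discovery and matchmaking run on the server, so the head mounted devices themselves do minimal processing, consistent with the resource-sharing goal stated for method 2100.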



Abstract

A head mounted device provides an immersive virtual or augmented reality experience for viewing data and enabling collaboration among multiple users. Rendering images in a virtual or augmented reality system may include generating data regarding locations of surfaces and objects in a scene based on images and spatial data gathered from a first body mounted sensor device, generating a three dimensional map of the scene based on the generated data, adding geographical identification metadata to the three dimensional map of the scene, storing the geographical identification metadata and three dimensional map in a memory, and transmitting at least a portion of the geographical identification metadata and three dimensional map to a second body mounted sensor device.
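The abstract's pipeline (generate surface data, build a three dimensional map, attach geographical identification metadata, store, then transmit to a second device) can be sketched as a simple data flow. The data structures, key scheme, and JSON serialization below are assumptions for illustration; the application does not specify a storage format.

```python
import json
import time

def build_geotagged_map(surfaces: list, lat: float, lon: float) -> dict:
    """Attach geographical identification metadata to a 3D scene map."""
    return {
        "metadata": {"lat": lat, "lon": lon, "timestamp": time.time()},
        "surfaces": surfaces,  # e.g. planes described by a point and a normal
    }

def store_map(geomap: dict, memory: dict) -> tuple:
    """Store the map keyed by its (rounded) geographic coordinates."""
    meta = geomap["metadata"]
    key = (round(meta["lat"], 4), round(meta["lon"], 4))
    memory[key] = geomap
    return key

def transmit_portion(geomap: dict) -> str:
    """Serialize the metadata and map for a second body mounted device."""
    return json.dumps(geomap)

# Hypothetical scene with one detected surface at an assumed location
memory = {}
geomap = build_geotagged_map(
    [{"point": [0, 0, 0], "normal": [0, 0, 1]}], 37.3861, -122.0839)
key = store_map(geomap, memory)
payload = transmit_portion(memory[key])
```

Keying stored maps by coordinates lets a second device arriving at the same location retrieve an existing scene map instead of rescanning, which is the collaboration benefit the abstract describes.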

Description

CROSS REFERENCE TO RELATED PATENT APPLICATIONS[0001]This patent application claims the benefit of priority to U.S. Provisional Patent Application Ser. No. 61 / 468,937 entitled “Systems and Methods for Gesture Driven Interaction for Digitally Augmented Physical Spaces” filed on Mar. 29, 2011, the entire contents of which are hereby incorporated by reference for all purposes.[0002]This patent application is also related to U.S. patent application Ser. No. ______ entitled “Modular Mobile Connected Pico Projectors For A Local Multi-User Collaboration” filed on ______, U.S. patent application Ser. No. ______ entitled “Anchoring Virtual Images to Real World Surfaces In Augmented Reality Systems” filed on ______, U.S. patent application Ser. No. ______ entitled “Selective Hand Occlusion Over Virtual Projections onto Physical Surfaces Using Skeletal Tracking” filed on ______, and U.S. patent application Ser. No. ______ entitled “System For The Rendering Of Shared Digital Interfaces Relative ...


Application Information

Patent Type & Authority: Application (United States)
IPC(8): G06T15/00
CPC: G06F3/011; G06F3/017; G06F3/0425; G06T15/503; G06T2215/16; G06T19/006; G06T2219/024; H04N9/3173; G06T17/05; G06F3/147; G09G2354/00; G06F3/167; G06T19/00; G09G5/00; H04N5/74
Inventors: MACIOCCI, GIULIANO; EVERITT, ANDREW J.; MABBUTT, PAUL; BERRY, DAVID T.
Owner QUALCOMM INC