
System and method for creating interactive panoramic walk-through applications

Status: Inactive | Publication Date: 2011-09-01
LINDEMANN PIERRE ALAIN +2
View PDF · 14 Cites · 161 Cited by

AI Technical Summary

Benefits of technology

[0034]According to another preferred embodiment of the present invention, preservation of visual perspective based on human vision is provided, notably by use of two points perspective that does not produce the trapezoidal distortion that is inherent with standard 360° environment interactive applications.
[0035]It is an object of the invention to provide a system and method for providing immersive, interactive and intuitive walk-through applications using 2D panoramic true images, virtual 3D images or a combination of both, that can provide seamless quality walk-through navigation and high quality imaging.
[0036]It is another object of the present invention to provide a system or method that combines high rate panoramic imaging broadcasting and the possibility of seamlessly providing higher quality images in which visual perspective based on human vision is preserved.
[0037]It is another object of the present invention to provide a system or method for creating and broadcasting interactive panoramic walk-through applications that can combine indoor and outdoor images, based on 2D panoramic images and virtual 3D images.
[0038]It is another object of the present invention to provide a system or method for creating and broadcasting interactive panoramic walk-through applications that can provide genuinely interactive functions accessible in the images.
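The two-point perspective benefit described in paragraph [0034] can be illustrated with a small sketch. This is not code from the patent; the function names and the tilt-shift-style projection are illustrative assumptions. Tilting a standard pinhole camera makes the projected images of vertical building edges converge (trapezoidal distortion), whereas keeping the image plane vertical and shifting the principal point instead keeps verticals parallel, as human vision expects:

```python
import math

def project_tilted(p, f, tilt):
    # standard pinhole camera physically tilted upward by `tilt` radians:
    # rotate the world point into camera coordinates, then divide by depth
    x, y, z = p
    yc = y * math.cos(tilt) - z * math.sin(tilt)
    zc = y * math.sin(tilt) + z * math.cos(tilt)
    return (f * x / zc, f * yc / zc)

def project_shift(p, f, tilt):
    # two-point ("architectural") projection: the image plane stays vertical;
    # the view is raised by shifting the principal point instead of tilting
    x, y, z = p
    return (f * x / z, f * y / z - f * math.tan(tilt))

f, tilt = 1.0, math.radians(20)
bottom = (2.0, 0.0, 10.0)  # base of a vertical building edge
top    = (2.0, 6.0, 10.0)  # top of the same edge

u_bot_t, _ = project_tilted(bottom, f, tilt)
u_top_t, _ = project_tilted(top, f, tilt)
u_bot_s, _ = project_shift(bottom, f, tilt)
u_top_s, _ = project_shift(top, f, tilt)

print(abs(u_top_t - u_bot_t) > 1e-6)   # tilted camera: the vertical edge leans inward
print(abs(u_top_s - u_bot_s) < 1e-12)  # shifted projection: the edge stays vertical
```

Both functions project the same world edge; only the shifted projection keeps its two endpoints at the same horizontal image coordinate, which is the property the patent attributes to two-point perspective.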

Problems solved by technology

Current virtual-tour photographic techniques suffer from several limitations.
Images captured from a single camera rotating on its nodal point can be stitched seamlessly, but this solution cannot be used for applications involving axial translation, where, for example, images are captured from a vehicle in motion.
This method consumes significant processing power and bandwidth for correcting and transmitting the images whenever fast user motion is involved during navigation, and is therefore not optimal for providing a seamless navigation experience at relatively high user-directed panning speeds.
Limitations of current virtual-tour technology, such as object occlusion, have had the detrimental result that virtual tours never materialized outside of the real estate industry.
These products are limited to outdoor views in which any two consecutive points of view are positioned at a relatively long distance from each other.
Such panoramic images cannot provide an accurate representation of geometric objects, for example buildings, due to the inherent discontinuity (break) of such panoramic images; this discontinuity is due to the physical impossibility of superposing a single nodal point from multiple cameras and view angles.
Furthermore, “STREET VIEW” products and the like provide images which suffer from trapezoidal distortion whenever the view angle is not pointing toward the horizon; this distortion is due to the perspective projection.
Although geometrically correct, “STREET VIEW” images do not reflect human vision behaviour, which keeps vertical lines mostly parallel whenever a viewer tilts his or her view gently above or below the horizon.
Google “STREET VIEW” also creates ground-plane distortion, where planar ground seems to be inclined due to unwanted motion of the cameras caused by inertial forces.
Other current walk-through products, such as “EVERYSCAPE” (www.everyscape.com by Everyscape, Waltham, Mass.) and “EARTHMINE” (www.earthmine.com by Earthmine Inc., Berkeley, Calif.), also produce trapezoidal distortion, which makes them unfit for applications requiring continuous undistorted images (i.e. images which more closely correspond to human vision), such as virtual shopping.
The trapezoidal distortion drawback is also inherent to virtual walk-through applications based on 3D virtual images, which can be used, for example, for visiting a virtual building using a real-time 3D engine, such as Second Life (www.secondlife.com by Linden Research Inc., San Francisco, Calif.) or video games.
This product does not allow the user to pan and tilt the viewing angle during displacement along the travel path.
In sum, current virtual walk-through applications and systems suffer several important limitations.
Views suffer from a high occlusion rate, as many objects are never visible at all along pathways.
The '232 solution is not, however, optimized for walk-through applications allowing fast movement across the horizontal plane beyond 120°; moreover, the '232 patent does not disclose broadcasting images of different resolutions, meaning that it only covers broadcasting of images at the highest possible resolution.
However, this method is not suited to the optimal transmission of full panoramic images in situations where the user travels along predefined pathways consisting of several view points in a linear arrangement within a network of pathways in the walk-through space.
Additionally, being view-direction sensitive, this method is not optimized, in terms of response time, to allow the user to change his travel plan, for example by making a U-turn or by travelling along another pathway.
Finally, as this method allows travel in any direction (along a predefined pathway), the amount of data downloaded to represent a given view point is greater, and it is therefore less suited to a fast and responsive viewing experience on the Internet or other network media having limited bandwidth.
Consequently, no prior art system is optimized for seamless broadcasting of fluid motion where the user can orient (pan and tilt) the field of view during motion and can stop the motion anywhere along the travel path, in order to discover local objects in detail without occlusion.
Integration of virtual objects (such as images, icons, etc.) into panoramas has been limited to the integration of two-dimensional objects in specific view point images, wherein said objects are not visible from distant view points.
Consequently, no prior art system provides advanced features based on geographical information, such as the ability to pin an element of information on any location in a view, with that element staying spatially fixed to the point during travel.
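The "spatially fixed pin" feature mentioned above amounts to anchoring an annotation to a world coordinate and re-deriving its viewing direction at each view point along the path. The patent does not specify the math; the following is a minimal sketch under the assumption of a simple pan/tilt camera model with y as the up axis:

```python
import math

def pin_direction(camera_pos, pin_pos):
    # pan/tilt angles (degrees) from a viewpoint toward a world-anchored pin;
    # re-evaluating this at each view point keeps the pin fixed to its location
    dx = pin_pos[0] - camera_pos[0]
    dy = pin_pos[1] - camera_pos[1]   # vertical offset (y is up)
    dz = pin_pos[2] - camera_pos[2]
    pan = math.degrees(math.atan2(dx, dz))
    tilt = math.degrees(math.atan2(dy, math.hypot(dx, dz)))
    return pan, tilt

pin = (5.0, 2.0, 20.0)                         # e.g. a shop sign, in world coordinates
path = [(0.0, 1.6, z) for z in (0.0, 5.0, 10.0)]  # viewer walking forward at eye height

pans = []
for cam in path:
    pan, tilt = pin_direction(cam, pin)
    pans.append(pan)
    print(f"pan={pan:.1f}  tilt={tilt:.1f}")
```

As the viewer advances, the pan angle toward the pin grows steadily, showing that the annotation tracks its world location rather than staying glued to one screen position.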




Embodiment Construction

Image Capture System

[0054]Referring now to FIGS. 1-4 and 8-9, an image capture system 20 consists of a panoramic optic 30, 30′, a camera 40 and a memory device 50 such as a computer, mounted on a vehicle 70 or other portable holding device.

[0055]The panoramic optic 30, 30′ is a physical panoramic optic providing 2D panoramic images. The optic 30, 30′ includes either a “lens and mirror” based optic system (catadioptric system) 32, 38, 42, as shown in FIG. 1, or a physical optical panoramic system 33 (consisting of an ultra-wide-angle lens or fisheye system with a lens providing more than 200° of continuous vertical field of view), without a mirror, as shown in FIG. 3. Both systems 30, 30′ are commercially available and reflect the substantially 360-degree panoramic field of view into the lens-based optics connected to camera 40.

[0056]The mirror shape and lens used are specifically chosen and disposed such that the effective camera 40 maintains a single viewpoint. Such a lens is available...
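A catadioptric system like the one in paragraph [0055] records the full horizontal field of view as a circular "donut" image around the mirror axis, which must be unwarped into a rectangular panoramic strip before display. The patent does not give the unwarping math; the following is a generic polar-to-rectangular sketch (the radii, sizes, and nearest-neighbour sampling are illustrative assumptions), verified here against a synthetic image whose pixel values encode azimuth:

```python
import math

def unwrap_donut(donut, cx, cy, r_in, r_out, out_w, out_h):
    # map a catadioptric "donut" image to a panoramic strip:
    # each output column is an azimuth around the mirror axis,
    # each output row a radius between the inner and outer mirror edges
    pano = [[0] * out_w for _ in range(out_h)]
    for row in range(out_h):
        r = r_in + (r_out - r_in) * row / (out_h - 1)
        for col in range(out_w):
            theta = 2 * math.pi * col / out_w
            x = int(round(cx + r * math.cos(theta)))
            y = int(round(cy + r * math.sin(theta)))
            pano[row][col] = donut[y][x]  # nearest-neighbour sample
    return pano

# synthetic donut whose pixel value encodes azimuth in degrees,
# so the mapping can be checked directly
size, cx, cy = 201, 100, 100
donut = [[int(math.degrees(math.atan2(y - cy, x - cx)) % 360)
          for x in range(size)] for y in range(size)]

pano = unwrap_donut(donut, cx, cy, r_in=30, r_out=90, out_w=360, out_h=32)
print(pano[0][0], pano[0][90], pano[0][180])  # azimuths recovered along the strip
```

A production system would use bilinear interpolation and a mirror-profile-specific radius-to-elevation mapping, but the column-equals-azimuth structure shown here is what turns the single-viewpoint mirror image into a 360° panorama.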



Abstract

The system and method of the invention provide for creating, storing and broadcasting interactive panoramic walk-through applications. The combination of images is determined by the user's choice of direction of displacement at each intersection point and from each view point or geographical coordinate, in order to provide a complete view from a first-person point of view. The system provides a visual perspective comparable to the human visual experience.

Description

[0001]This application claims the benefit of U.S. Provisional Application No. 61/111,346, entitled SYSTEM AND METHOD FOR CREATING AND BROADCASTING INTERACTIVE PANORAMIC WALK-THROUGH APPLICATIONS, filed Nov. 5, 2008.FIELD OF THE INVENTION[0002]The present invention relates generally to virtual tours. More specifically, the present invention relates to virtual walk-through applications using panoramic images, 3D images or a combination of both.BACKGROUND OF THE INVENTION[0003]A virtual tour (or virtual reality tour) is a virtual reality simulation of an existing location, which is usually built using contents consisting principally of 2D panoramic images, sequences of linked still images or video sequences, and/or image-based rendering (IBR) consisting of image-based models of existing physical locations, as well as other multimedia content such as sound effects, music, narration, and text. A virtual tour is accessed on a personal computer (typically connected to the Internet) or a mob...

Claims


Application Information

IPC(8): H04N 5/225; G06T 15/00
CPC: G01C 11/02; G01S 19/14; G03B 37/06; G06F 17/30241; G06T 3/0062; G01S 19/49; G06F 16/29; G06T 3/12
Inventor: LINDEMANN, PIERRE-ALAIN; LINDEMANN, DAVID; CRITTIN, GERARD
Owner: LINDEMANN PIERRE ALAIN