
Methods and systems for representing a pre-modeled object within virtual reality data

A technology of virtual reality data and methods, applied in the field of methods and systems for representing pre-modeled objects within virtual reality data, which can solve problems such as inaccuracies entering the resultant virtual reality data, inaccuracies that may distract users, and rendered depictions of objects becoming distorted, lost, or otherwise reproduced inaccurately.

Active Publication Date: 2019-12-12
VERIZON PATENT & LICENSING INC
Cites: 0 · Cited by: 7

AI Technical Summary

Benefits of technology

The patent describes methods and systems for accurately representing objects in virtual reality data. This involves accessing image datasets from different vantage points and combining them to create a realistic representation of a real-world scene. The invention can help to avoid inaccuracies and distortions in the virtual reality data caused by inconsistencies between different capture devices and the challenge of performing image processing in real-time. The technical effect of the patent is to provide users with a more immersive and accurate virtual reality experience.

Problems solved by technology

Unfortunately, as footage from different capture devices is combined and processed in real time, inaccuracies may enter into resultant virtual reality data.
For example, due to inconsistencies between data detected by different physical capture devices, as well as the significant challenge of performing large amounts of image processing in real time, rendered depictions of objects may become distorted, lost, replicated, moved, or otherwise reproduced inaccurately.
Such inaccuracies may be distracting to users viewing these inconsistent depictions during virtual reality experiences, particularly when the objects depicted are familiar objects or are important to the virtual reality experience.



Embodiment Construction

[0019]Methods and systems for representing a pre-modeled object within virtual reality data are described herein. For example, in certain implementations, a virtual reality system may access (e.g., receive, retrieve, load, transfer, etc.) a first image dataset and a second image dataset. An image dataset may be implemented by one or more files, data streams, and / or other types of data structures that contain data representative of an image (e.g., a two-dimensional image that has been captured or rendered in any of the ways described herein). The first image dataset accessed by the virtual reality system may be representative of a first captured image depicting a real-world scene from a first vantage point at a particular time, and the second image dataset may be representative of a second captured image depicting the real-world scene from a second vantage point distinct from the first vantage point at the particular time. These image datasets may be accessed from any suitable source...
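
To make the dataset access described in paragraph [0019] more concrete, the following is a minimal sketch (in Python, using NumPy) of how an image dataset carrying a captured image, its vantage point, and its capture time might be structured and accessed. All names here (VantagePoint, ImageDataset, access_image_dataset) are illustrative assumptions, not structures defined by the patent, which allows any suitable files, data streams, or other data structures to serve as image datasets.

```python
# Hypothetical sketch of an image dataset as described in paragraph [0019]:
# a captured 2D image of the real-world scene, tagged with the vantage point
# from which it was captured and the particular time of capture.

from dataclasses import dataclass
from typing import Tuple

import numpy as np


@dataclass
class VantagePoint:
    """Position and orientation of a capture device with respect to the scene (illustrative)."""
    position: Tuple[float, float, float]      # world-space coordinates
    orientation: Tuple[float, float, float]   # e.g., yaw, pitch, roll in degrees


@dataclass
class ImageDataset:
    """Data representative of a captured two-dimensional image (illustrative)."""
    pixels: np.ndarray           # H x W x 3 array of color values
    vantage_point: VantagePoint  # where the capture device was located
    capture_time: float          # the particular time at which the image was captured


def access_image_dataset(path: str,
                         vantage_point: VantagePoint,
                         capture_time: float) -> ImageDataset:
    """Access (e.g., load) an image dataset from a file; a data stream or other
    source could be used analogously."""
    pixels = np.load(path)  # assumes the captured image was stored as a NumPy array
    return ImageDataset(pixels=pixels, vantage_point=vantage_point, capture_time=capture_time)
```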


Abstract

An exemplary virtual reality system accesses first and second image datasets representative of first and second captured images depicting a real-world scene from first and second vantage points. The system recognizes a pre-modeled object within both the first and second captured images, and determines first and second confidence metrics representative of objective degrees to which the system accurately recognizes the pre-modeled object within the first and second captured images, respectively. The system further generates a third image dataset representative of a rendered image based on the first and second image datasets. The rendered image includes a depiction of the pre-modeled object within the real-world scene from a third vantage point, and the generating comprises prioritizing, based on a determination that the second confidence metric is greater than the first confidence metric, the second image dataset over the first image dataset for the depiction of the pre-modeled object.
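
As a rough illustration of the prioritization described in the abstract, the sketch below weights the candidate depiction from the second image dataset more heavily whenever the second confidence metric is greater than the first. The function name and the simple confidence-weighted blend are illustrative assumptions; the patent does not specify this particular combination rule.

```python
# A minimal sketch of confidence-based prioritization, assuming both candidate
# depictions of the pre-modeled object have already been reprojected to the third
# vantage point and aligned to the same pixel grid. Names and the weighting rule
# are illustrative assumptions, not the patented implementation.

import numpy as np


def blend_object_depiction(first_patch: np.ndarray,
                           second_patch: np.ndarray,
                           first_confidence: float,
                           second_confidence: float) -> np.ndarray:
    """Combine two candidate depictions of the pre-modeled object, weighting more
    heavily the one whose recognition confidence metric is greater."""
    total = first_confidence + second_confidence
    if total == 0.0:
        # Neither recognition is trusted; fall back to an even blend.
        w_first = w_second = 0.5
    else:
        w_first = first_confidence / total
        w_second = second_confidence / total
    return w_first * first_patch + w_second * second_patch


# Example: the second confidence metric (0.9) is greater than the first (0.4),
# so the second image dataset is prioritized for the depiction of the object.
patch_from_first = np.zeros((64, 64, 3))
patch_from_second = np.ones((64, 64, 3))
rendered_patch = blend_object_depiction(patch_from_first, patch_from_second, 0.4, 0.9)
```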

Description

BACKGROUND INFORMATION

[0001] Virtual reality technology allows users of virtual reality media player devices to experience virtual reality worlds. For example, virtual reality worlds may be implemented based on live, camera-captured scenery of a real-world scene to allow users to experience, in real time, real-world places that are difficult, inconvenient, expensive, or otherwise problematic for the users to experience in real life (i.e., in a non-simulated manner). Virtual reality technology may thus provide users with a variety of entertainment, educational, vocational, and / or other enjoyable or valuable experiences that may otherwise be difficult or inconvenient for the users to obtain.

[0002] In some examples, it may be desirable for a user to view a virtual reality world based on a real-world scene from a vantage point other than one of the vantage points from which real-time video footage of the real-world scene is being captured by physical capture devices (e.g., video cameras)....


Application Information

IPC (IPC8): G06T19/00; G06K9/00; G06T7/246; G06K9/46; G06T15/50; G06T17/00
CPC: G06K9/00369; G06T7/251; G06K9/00221; G06K9/00664; G06T17/00; G06T19/006; G06T2207/20081; G06T15/503; G06K9/46; G06T2207/10016; G06K2209/25; G06V20/64; G06V20/20; G06V20/10; G06V40/16; G06V40/103; G06V2201/09
Inventor: CASTANEDA, OLIVER S.; LUO, LIANG
Owner: VERIZON PATENT & LICENSING INC