
Fusing, texturing, and rendering views of dynamic three-dimensional models

A dynamic, model-based technology, applied in image data processing, 3D modeling, and 3D image processing, that addresses problems such as unreliable depth values, incomplete models, and limitations in applying artistic effects to rendered views.

Active Publication Date: 2020-09-11
MICROSOFT TECH LICENSING LLC

AI Technical Summary

Problems solved by technology

Although such systems can produce high-fidelity dynamic 3D models of many scenes, the resulting models may be inaccurate or incomplete where depth values are unreliable or unavailable.
Also, in some cases, previous methods of stitching texture details from different viewpoints onto dynamic 3D models can produce blurred details or obvious seams, especially for regions of interest such as faces, which degrades overall quality.
Finally, previous methods of rendering views of dynamic 3D models are limited in how artistic effects can be applied to the rendered views, especially in real-time applications.

Method used


Embodiment Construction

[0029] Various approaches described herein can improve the quality of results when fusing depth maps to generate a dynamic three-dimensional ("3D") model of a computer-represented environment, when applying texture details to a dynamic 3D model, or when rendering views of a textured, dynamic 3D model. For example, when fusing depth maps to generate a dynamic 3D model, a fusion component of the computer system can also incorporate intrinsic texture values (e.g., color values) for points of the dynamic 3D model. This can make the dynamic 3D model more accurate, especially for areas in which depth values are unreliable or unavailable. As another example, when applying texture details to a dynamic 3D model, a rendering component of the computer system can apply smoothed, view-dependent texture weights to texture values from different texture maps. Especially for regions of interest such as faces, this reduces blurring and avoids introducing noticeable seams into rendered views. As another example, a rendering component can apply special effects indicated by metadata to rendered views, allowing a content provider to assert artistic control over presentation.
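The fusion idea above can be sketched as a per-voxel running average that accumulates color alongside signed distance, so that color evidence is retained even where depth confidence drops to zero. This is only an illustrative sketch; the function name, array layout, and weighting scheme are assumptions for illustration, not the patent's actual implementation:

```python
import numpy as np

def fuse_observation(tsdf, color, weight, d_obs, c_obs, w_obs):
    """Fold one depth/color observation into per-voxel running averages.

    tsdf, weight: (N,) arrays of signed distances and accumulated weights.
    color: (N, 3) array of per-voxel RGB averages.
    d_obs, c_obs, w_obs: the new observation's signed distances, colors,
    and confidence weights (w_obs is 0 where depth is unreliable).
    """
    w_new = weight + w_obs
    # Avoid division by zero for voxels with no valid observations yet.
    safe = np.where(w_new > 0, w_new, 1.0)
    tsdf = (tsdf * weight + d_obs * w_obs) / safe
    color = (color * weight[:, None] + c_obs * w_obs[:, None]) / safe[:, None]
    return tsdf, color, w_new
```

Voxels whose depth confidence is zero contribute nothing to the distance average, but once any reliable observation arrives, both geometry and intrinsic color are updated together.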


Abstract

Various approaches described herein improve the quality of results when fusing depth maps to generate dynamic three-dimensional ('3D') models, applying texture details to dynamic 3D models, or rendering views of textured, dynamic 3D models. For example, when fusing depth maps to generate a dynamic 3D model, a fusion component can also incorporate intrinsic color values for points of the dynamic 3Dmodel, potentially making the dynamic 3D model more accurate, especially for areas in which depth values are not reliable or not available. As another example, when applying texture details, a rendering component can apply smoothed, viewpoint-dependent texture weights to texture values from different texture maps, which can reduce blurring and avoid the introduction of noticeable seams. As another example, a rendering component can apply special effects indicated by metadata to rendered views, thereby allowing a content provider to assert artistic control over presentation.
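As a concrete illustration of the smoothed, viewpoint-dependent weighting described in the abstract, the sketch below computes per-vertex blend weights from viewing angles and then diffuses them over each vertex's 1-ring neighborhood so that blends vary gradually instead of switching abruptly at seams. The function name, parameters, and the simple neighborhood-averaging scheme are assumptions for illustration, not the patented method itself:

```python
import numpy as np

def smoothed_view_weights(normals, view_dirs, neighbors, iters=5):
    """Per-vertex blend weights for each camera's texture map.

    normals: (V, 3) unit vertex normals; view_dirs: (C, V, 3) unit vectors
    from each vertex toward each of C cameras; neighbors: list of V lists
    of adjacent vertex indices. Weights favor cameras that see a vertex
    head-on, then are smoothed across the mesh surface.
    """
    # Raw weight: clamped cosine of the angle between normal and view.
    w = np.clip(np.einsum('cvk,vk->cv', view_dirs, normals), 0.0, None)
    for _ in range(iters):
        # Replace each vertex weight with the mean over its 1-ring.
        w = np.stack([
            np.array([w[c, [v] + neighbors[v]].mean()
                      for v in range(len(normals))])
            for c in range(w.shape[0])
        ])
    # Normalize so per-vertex weights across cameras sum to 1.
    total = w.sum(axis=0, keepdims=True)
    return w / np.where(total > 0, total, 1.0)
```

A vertex seen head-on by one camera and edge-on by another ends up textured almost entirely from the first camera, while vertices in between receive gradually mixed weights, which is what suppresses visible seams.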

Description

Background technique

[0001] Virtual reality ("VR") technology simulates a user's physical presence in a synthetic, computer-generated environment. Typically, a VR device includes a VR headset with a display screen and headphones to present realistic images and sounds for the computer-generated environment. With VR technology, users can look around a computer-generated environment and, in many cases, navigate and interact with features of the environment. In some VR systems, a user can communicate with one or more other remote users who are virtually present in the environment. Similarly, augmented reality ("AR") technology superimposes artificial, computer-generated content over camera input and/or mixes it with audio input.

[0002] While some VR systems generate a synthetic environment from scratch, many VR and AR systems use devices that capture details of a physical environment to create a three-dimensional ("3D") model of the actual physical environment. For example, the p...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T15/04, G06T17/00
CPC: G06T17/00, G06T15/04
Inventor: Du Ruofei, B. F. Cutler, Zhang Wenyu
Owner MICROSOFT TECH LICENSING LLC