
Hair point cloud neural rendering method with stable time domain

A technology in the fields of neural rendering and computer graphics. It addresses problems such as slow rendering speed, high overhead, and poor shading, with the effect of improving time-domain stability, reducing rendering time, and ensuring both rendering quality and rendering speed.

Pending Publication Date: 2022-05-27
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

Traditional hair is generally represented with patches or curves and rendered by rasterization or ray tracing, which can cause problems such as poor shading quality or slow rendering speed.
Moreover, rendering a single frame of high-quality hair with an offline renderer usually takes half an hour to an hour, which is very expensive.

Method used



Embodiment Construction

[0026] The time-domain stable hair point cloud neural rendering method of the present invention comprises the following steps:

[0027] Step 1: Use a ray-tracing algorithm to generate a hair rendering data set containing local-illumination and global-illumination results. Build a basic neural rendering network following the point-cloud-based neural rendering framework: project the point cloud to obtain feature maps, perform feature extraction and rendering on the feature maps with a U-Net network, and use a perceptual loss function as the constraint on the network's rendered results.
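The "point cloud projection to obtain feature maps" part of Step 1 can be illustrated with a minimal sketch: project each 3D point (carrying a learned feature vector) through a pinhole camera onto an image grid, keeping the nearest point per pixel with a z-buffer. The resulting feature map is what a U-Net would then consume. All names, shapes, and the z-buffer detail below are illustrative assumptions, not taken from the patent text.

```python
import numpy as np

def project_point_features(points, features, K, H, W):
    """Project camera-space 3D points with per-point features onto an
    H x W feature map, keeping the nearest point per pixel (z-buffer).
    points:   (N, 3) in camera coordinates, z > 0
    features: (N, C) per-point feature descriptors
    K:        (3, 3) pinhole intrinsics
    """
    C = features.shape[1]
    fmap = np.zeros((H, W, C), dtype=np.float32)
    zbuf = np.full((H, W), np.inf, dtype=np.float32)

    # Perspective projection: pixel ~ K @ (x/z, y/z, 1)
    z = points[:, 2]
    uv = points @ K.T                      # (N, 3) homogeneous pixel coords
    u = np.round(uv[:, 0] / z).astype(int)
    v = np.round(uv[:, 1] / z).astype(int)

    inside = (u >= 0) & (u < W) & (v >= 0) & (v < H) & (z > 0)
    for i in np.flatnonzero(inside):
        if z[i] < zbuf[v[i], u[i]]:        # keep the closest point only
            zbuf[v[i], u[i]] = z[i]
            fmap[v[i], u[i]] = features[i]
    return fmap

# Hypothetical 2-point cloud: both points project to the same pixel,
# so the nearer one (z = 1) should win the z-buffer test.
K = np.array([[100.0, 0.0, 32.0], [0.0, 100.0, 32.0], [0.0, 0.0, 1.0]])
pts = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 2.0]])
feats = np.array([[1.0, 0.0], [0.0, 1.0]])
fmap = project_point_features(pts, feats, K, 64, 64)
# fmap[32, 32] holds the nearer point's feature [1, 0]
```

In the actual method the per-point features would be optimized jointly with the U-Net under the perceptual loss; this sketch only shows the projection step.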

[0028] This step is the technical basis of the present invention and is divided into the following sub-steps.

[0029] 1) Use Blender to create hair models, each composed of hundreds of thousands of strands, and for each model generate 540 camera-surrounding local-illumination and global-illumination rendering results with corresponding transparency masks. The resul...



Abstract

The invention discloses a time-domain-stable hair point cloud neural rendering method. To solve the problem that existing point-cloud-based neural rendering networks cannot render high-quality hair with temporal stability, it provides a depth-peeling and temporal-stability-enhancement network. The input point cloud model is projected hierarchically to obtain feature information of different layers, and the per-layer results are fused to suit the semi-transparent nature of hair. The trained result is then fed into the temporal-stability-enhancement network, which uses re-projection of point clouds between adjacent frames to obtain the dependency between the current frame and previous frames and generates the final result of the current frame, thereby ensuring the temporal stability of the output. The method infers global-illumination results from local-illumination information, so rendering quality and rendering speed can be guaranteed at the same time. This overcomes the drawback that traditional methods using global-illumination algorithms such as ray tracing are very time-consuming when rendering hair.
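The re-projection idea in the abstract can be sketched as follows: project the same world-space point cloud through the previous and current camera poses, and record, for each point visible in both frames, which current-frame pixel corresponds to which previous-frame pixel. Such a correspondence map is one way to establish the cross-frame dependency the abstract describes. The function names, the 4x4 extrinsics convention, and the one-point-per-pixel simplification below are my assumptions, not details from the patent.

```python
import numpy as np

def reproject_pixels(points_w, K, pose_prev, pose_cur, H, W):
    """For each point visible in the current frame, find the pixel it
    occupied in the previous frame.
    points_w: (N, 3) world-space point cloud
    K:        (3, 3) pinhole intrinsics shared by both frames
    pose_*:   (4, 4) world-to-camera extrinsics of each frame
    Returns a dict {(v_cur, u_cur): (v_prev, u_prev)}.
    """
    def project(pose):
        hom = np.hstack([points_w, np.ones((len(points_w), 1))])
        cam = (hom @ pose.T)[:, :3]              # world -> camera space
        z = cam[:, 2]
        uv = cam @ K.T
        u = np.round(uv[:, 0] / z).astype(int)
        v = np.round(uv[:, 1] / z).astype(int)
        ok = (z > 0) & (u >= 0) & (u < W) & (v >= 0) & (v < H)
        return u, v, ok

    u0, v0, ok0 = project(pose_prev)
    u1, v1, ok1 = project(pose_cur)
    corr = {}
    for i in np.flatnonzero(ok0 & ok1):
        corr[(v1[i], u1[i])] = (v0[i], u0[i])    # current -> previous pixel
    return corr

# Hypothetical example: one point, camera shifted between frames.
K = np.array([[100.0, 0.0, 32.0], [0.0, 100.0, 32.0], [0.0, 0.0, 1.0]])
pts = np.array([[0.0, 0.0, 1.0]])
pose_prev = np.eye(4)
pose_cur = np.eye(4)
pose_cur[0, 3] = -0.1                            # camera moved along x
corr = reproject_pixels(pts, K, pose_prev, pose_cur, 64, 64)
# The point at current pixel (32, 22) came from previous pixel (32, 32)
```

A temporal-stability network could use such correspondences to warp features or results from previous frames into the current frame before fusion.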

Description

technical field [0001] The invention relates to the fields of computer graphics and neural rendering, in particular to a time-domain-stable hair point cloud neural rendering method. Background technique [0002] Hair plays a vital role in the photorealistic representation of computer-generated characters and animations. In the film industry, for example, in the movies "Godzilla vs. Kong" and "Zootopia", the abundant fur covering the animals' bodies gives audiences an immersive experience. Likewise in the gaming industry, the high-quality hair of many characters in "Sekiro" is very important for players to have a better gaming experience. [0003] Therefore, how to generate high-quality hair is a very important issue. Traditional hair is generally represented with patches or curves and rendered by rasterization or ray tracing, which may cause poor shading quality or slow rendering speed. And if you use an o...

Claims


Application Information

Patent Timeline
Patent Type & Authority: Applications (China)
IPC(8): G06T15/00 G06T15/06 G06T15/50 G06N3/04 G06N3/08
CPC: G06T15/005 G06T15/06 G06T15/506 G06N3/084 G06T2207/10028 G06T2207/20081 G06T2207/20084 G06N3/045
Inventor 潘曲利叶可扬任重
Owner ZHEJIANG UNIV