
Hair reconstruction method based on adaptive octree hair convolutional neural network

A hair reconstruction technology combining a convolutional neural network with an adaptive octree, applied in the field of 3D reconstruction, which addresses the problems of requiring large numbers of 3D convolution modules, wasted storage space, and increased computing overhead.

Active Publication Date: 2020-08-25
SOUTH CHINA UNIV OF TECH
Cites: 3 · Cited by: 9

AI Technical Summary

Problems solved by technology

However, generating high-resolution volume data requires a large number of 3D convolution modules, and the computational overhead grows exponentially as the resolution increases.
[0004] Due to the slender and particular structure of hair, the hair direction field contains large empty regions, that is, many elements whose direction vectors are zero vectors. These empty elements not only waste storage space but also add redundant computational overhead to the hair reconstruction network.
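The compression argument above can be made concrete with a small, self-contained sketch (not taken from the patent; the grid resolution, occupancy pattern, and subdivision rule are illustrative assumptions): an adaptive octree stops subdividing any cell whose region contains only zero direction vectors, so a mostly empty direction field collapses to far fewer cells than a dense voxel grid.

```python
# A minimal sketch (not the patent's implementation) of why an adaptive octree
# compresses a sparse hair direction field: cells that contain no non-zero
# direction vectors are kept as single leaves instead of being subdivided
# down to full resolution. All names and parameters here are illustrative.
import numpy as np

def count_octree_leaves(field, x0, y0, z0, size):
    """Count leaves of an adaptive octree over field[x0:x0+size, ...].

    A cell is not subdivided if it is empty (all zero vectors) or has
    reached unit size; otherwise it splits into 8 children.
    """
    block = field[x0:x0 + size, y0:y0 + size, z0:z0 + size]
    if size == 1 or not np.any(block):
        return 1                      # one leaf covers the whole (empty) cell
    half = size // 2
    return sum(
        count_octree_leaves(field, x0 + dx, y0 + dy, z0 + dz, half)
        for dx in (0, half) for dy in (0, half) for dz in (0, half)
    )

if __name__ == "__main__":
    res = 64                                          # dense grid resolution
    field = np.zeros((res, res, res, 3), dtype=np.float32)
    # Fake a thin, hair-like occupied region: a few vertical "strands".
    rng = np.random.default_rng(0)
    for _ in range(200):
        x, z = rng.integers(16, 48, size=2)
        field[x, 8:56, z] = [0.0, 1.0, 0.0]           # unit direction along +y
    print("dense grid cells :", res ** 3)
    print("octree leaves    :", count_octree_leaves(field, 0, 0, 0, res))
```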



Examples


Embodiment Construction

[0043] The present invention will be further described below in conjunction with specific examples.

[0044] As shown in Figures 1 to 3, the hair reconstruction method based on the adaptive octree hair convolutional neural network provided in this embodiment includes the following steps:

[0045] 1) Prepare the training data. The publicly available USC-HairSalon hair model database is used as the data basis and further processed. The database contains 343 hair models; each model consists of 10,000 hair strands, each strand consists of 100 points, and all models have been aligned to the same head template. First, the strand directions of each 3D model in the database are rasterized to form a hair direction map. Specifically, the direction of each hair strand is mapped to a color, and the colors are attached to the triangles of the hair mesh. At the same time, prepare an upper-body mesh for the hair mesh, and send i...
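The exact color encoding for the direction map is not spelled out in this excerpt; a common convention for such orientation maps, shown below as a hypothetical sketch, is to normalize each strand tangent and map it from [-1, 1]^3 to an RGB color in [0, 1]^3.

```python
# A minimal sketch of a direction-to-color mapping of the kind described in
# step 1); the (d + 1) / 2 encoding is a common convention, not necessarily
# the patent's. Zero vectors are rendered black to mark empty directions.
import numpy as np

def direction_to_rgb(d):
    """Map a 3D direction vector to an RGB triple in [0, 1]."""
    d = np.asarray(d, dtype=np.float32)
    n = np.linalg.norm(d)
    if n < 1e-8:
        return np.zeros(3, dtype=np.float32)   # empty / undefined direction
    return (d / n + 1.0) * 0.5

def strand_segment_directions(strand):
    """Per-point tangent directions of a strand given as an (N, 3) array."""
    strand = np.asarray(strand, dtype=np.float32)
    diffs = np.diff(strand, axis=0)
    # Repeat the last segment direction for the final point.
    return np.vstack([diffs, diffs[-1:]])

if __name__ == "__main__":
    # A toy strand of 100 points falling along -y with a slight drift in x.
    t = np.linspace(0.0, 1.0, 100)[:, None]
    strand = np.hstack([0.1 * t, 1.0 - t, np.zeros_like(t)])
    colors = np.array([direction_to_rgb(d)
                       for d in strand_segment_directions(strand)])
    print(colors[:3])
```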



Abstract

The invention discloses a hair reconstruction method based on an adaptive octree hair convolutional neural network. The method comprises the following steps: 1) constructing a training set from a public hair database, converting each three-dimensional hair model in the database into a hair direction field structured as an adaptive octree, and generating a corresponding hair direction map, hair region map, and body region map; 2) designing an adaptive octree hair convolutional neural network to reconstruct the hair direction field; 3) preprocessing the input picture; and 4) post-processing the output direction field, converting the direction field generated by the adaptive octree hair convolutional neural network into a hair model represented by hair strands, thereby completing hair reconstruction. The method exploits the compact structure of the octree to greatly compress the storage space of the hair direction field while reducing the memory overhead and running time of the hair reconstruction network. The invention can effectively identify and recover the three-dimensional hair structure in the input picture, ensuring that the contour appearance and texture trend are consistent with those of the input hairstyle.
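Step 4) of the abstract, turning the generated direction field back into strand geometry, can be illustrated with a minimal strand-tracing sketch; the nearest-cell lookup, fixed step size, and zero-vector stopping rule below are assumptions for illustration, not the patent's actual post-processing.

```python
# A minimal sketch of the kind of post-processing hinted at in step 4):
# growing a hair strand by stepping through a 3D direction field stored on a
# regular grid. The lookup, step size, and stopping rule are illustrative.
import numpy as np

def trace_strand(field, start, step=0.5, max_points=100):
    """Grow one strand from `start` by following the direction field.

    field : (R, R, R, 3) array of direction vectors (zero = empty cell).
    start : starting position in voxel coordinates.
    """
    res = field.shape[0]
    pos = np.asarray(start, dtype=np.float32)
    strand = [pos.copy()]
    for _ in range(max_points - 1):
        idx = np.clip(np.round(pos).astype(int), 0, res - 1)
        d = field[idx[0], idx[1], idx[2]]
        if np.linalg.norm(d) < 1e-6:          # left the occupied hair volume
            break
        pos = pos + step * d / np.linalg.norm(d)
        strand.append(pos.copy())
    return np.array(strand)

if __name__ == "__main__":
    res = 32
    field = np.zeros((res, res, res, 3), dtype=np.float32)
    field[10:22, 4:28, 10:22] = [0.0, 1.0, 0.0]   # a block pointing along +y
    strand = trace_strand(field, start=(16, 5, 16))
    print(f"traced {len(strand)} points, from {strand[0]} to {strand[-1]}")
```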

Description

Technical field

[0001] The invention relates to the technical field of three-dimensional reconstruction, and in particular to a hair reconstruction method based on an adaptive octree hair convolutional neural network.

Background technique

[0002] Obtaining a 3D mesh representation of the human body surface has long been one of the important technologies in human-computer interaction, virtual reality, 3D games, animation, and movie special effects production. As hair is an important part of the human body, the importance of hair reconstruction technology is self-evident. In applications such as virtual reality, recovering realistic 3D hair models greatly enhances the immersion of virtual environments and improves user comfort; likewise, in 3D games and animated film production, high-precision hair reconstruction technology greatly enhances the realism of 3D characters. At the same time, hair reconstruction also has importan...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T17/00, G06N3/04, G06N3/08
CPC: G06T17/005, G06N3/08, G06N3/045, Y02D10/00
Inventor: 叶泽豪, 李桂清
Owner: SOUTH CHINA UNIV OF TECH