
Virtual hair style modeling method of images and videos

A hairstyle modeling method, applied in the field of virtual character modeling and image and video editing, which addresses the problem that existing approaches cannot generate a physically plausible three-dimensional hairstyle model.

Active Publication Date: 2014-02-26
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

However, this method only fits the hairstyle region of the original image with a large number of spatial hair strands; it cannot generate a physically plausible 3D hairstyle model, and the result is therefore difficult to use directly in relevant industrial applications.



Examples


Embodiment

[0095] The inventors implemented an embodiment of the present invention on a machine equipped with an Intel Xeon E5620 central processing unit and an NVIDIA GTX 680 graphics processor. All experimental results shown in the accompanying drawings were obtained using the parameter values listed in the detailed description. Each generated hair model consists of approximately 200,000 to 400,000 hair strand curves. Each strand is represented as a polyline connecting multiple vertices, with vertex colors sampled from the original image; during rendering, the strands are expanded into screen-aligned polygon strips by a geometry shader and drawn in real time.
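As context for paragraph [0095], the following is a minimal sketch (not the patent's actual code) of how per-vertex strand colors could be sampled from the original image before rendering; the function names, the camera_project callback, and the array layout are assumptions made purely for illustration.

import numpy as np

def sample_strand_colors(strand_vertices, image, camera_project):
    """Assign each strand vertex the color of the image pixel it projects to.

    strand_vertices : (N, 3) array of 3D vertex positions for one hair strand
    image           : (H, W, 3) array holding the original portrait
    camera_project  : callable mapping a 3D point to (x, y) pixel coordinates
    """
    h, w = image.shape[:2]
    colors = np.empty((len(strand_vertices), 3), dtype=image.dtype)
    for i, v in enumerate(strand_vertices):
        x, y = camera_project(v)
        # Clamp the projected position to the image bounds before sampling.
        xi = int(np.clip(round(x), 0, w - 1))
        yi = int(np.clip(round(y), 0, h - 1))
        colors[i] = image[yi, xi]
    return colors

In a renderer following the description above, the resulting per-vertex colors would be uploaded together with the strand polylines, and the geometry-shader stage would expand each segment into a screen-aligned quad strip.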

[0096] The inventors invited several users to test a prototype system of the method. The results show that, for an ordinary portrait, a user generally needs to draw only about 2 to 8 simple interaction strokes. For an image of approximately 500×800 pixels, the...



Abstract

The invention discloses a virtual hairstyle modeling method for images and videos. The method first uses a digital device to capture data of a target person and obtains the hairstyle region by segmentation. It then resolves the direction ambiguity of the image's hairstyle orientation field and produces a static hairstyle model that is uniformly distributed and conforms to the hairstyle region of the original image. The motion of the hairstyle in a video is computed by tracking the head model's movement and estimating non-rigid deformation, generating a dynamic hairstyle model for each moment of the motion that naturally fits the real movement of the hair in the video. The method performs virtual three-dimensional reconstruction of the hairstyle of a person in a single image or a video sequence, and can be widely applied to the creation of virtual characters in related fields such as digital media, as well as to hairstyle editing applications for images and videos, such as portrait 3D reconstruction, hairstyle motion simulation, and interactive hairstyle editing.
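The static-modeling stage described in the abstract relies on a per-pixel orientation field of the hair region, whose direction ambiguity is then resolved. As a hedged illustration only, the sketch below shows one common way such an orientation field can be estimated with a bank of oriented Gabor filters; this is a standard technique in image-based hair modeling, not necessarily the patent's exact formulation, and all names and parameter values are illustrative.

import numpy as np
from scipy.ndimage import convolve

def gabor_kernel(theta, ksize=15, sigma=3.0, wavelength=6.0):
    """Real-valued Gabor kernel oriented at angle theta (radians)."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * yr / wavelength)

def orientation_field(gray, num_angles=32):
    """Per-pixel orientation in [0, pi), chosen as the angle of maximal filter response."""
    angles = np.linspace(0, np.pi, num_angles, endpoint=False)
    responses = np.stack([np.abs(convolve(gray, gabor_kernel(t))) for t in angles])
    best = np.argmax(responses, axis=0)   # index of the strongest orientation per pixel
    return angles[best]                   # orientation field; direction still ambiguous

Usage would be something like theta = orientation_field(gray_hair_region), where gray_hair_region is a float (H, W) grayscale crop of the segmented hair area. The returned field is defined only up to 180 degrees, which is exactly the ambiguity the method's subsequent step resolves.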

Description

Technical field

[0001] The invention relates to the field of virtual character modeling and image and video editing, and in particular to a method for modeling virtual character hairstyles and editing hairstyles in images and videos.

Background technique

[0002] The relevant technical background of the present invention is briefly described as follows:

[0003] 1. Hairstyle modeling of virtual characters

[0004] Although many software packages already assist creators with manual virtual hair modeling in industrial practice, they are often extremely complicated and time-consuming, requiring skilled expertise and elaborate manual operations, which greatly prolongs product creation cycles and cost overhead (WARD, K., BERTAILS, F., KIM, T.-Y., MARSCHNER, S. R., CANI, M.-P., AND LIN, M. C. 2007. A survey on hair modeling: styling, simulation, and rendering. IEEE Transactions on Visualization and Computer Graphics 13, 2, 213–234...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T17/00; G06T7/00
CPC: G06T19/00; G06T17/00; G06T2200/08; G06T2207/30196; G06T7/75; G06T7/251
Inventors: 翁彦琳, 柴蒙磊, 王律迪, 周昆
Owner: ZHEJIANG UNIV