
Guided hair extraction method based on motion similarity

A guided hair extraction method and motion-similarity technique, applied in the field of dynamic simulation and editing of hair. It addresses problems such as an unintuitive modeling process and the difficulty of directly obtaining hair motion results, and achieves fast editing speed and high-fidelity effects.

Inactive Publication Date: 2019-08-23
SHANDONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

These methods control the modeling result through boundary conditions, so the modeling process is not intuitive and it is difficult to obtain the corresponding hair motion result directly.
The main difficulty faced by current hair editors is initializing the hair model accurately from simple hair geometry.




Embodiment Construction

[0054] The present invention is described in further detail below in conjunction with the accompanying drawings and specific embodiments:

[0055] As shown in Figure 1, a guided hair extraction method based on motion similarity includes the following steps:

[0056] Step 1: Input hair motion data and build a graphical model to reflect the local motion similarity of hair strands.

[0057] This step consists of two processes:

[0058] One is to establish the basic topology of the graph, and the other is to define weights to measure motion similarity.

[0059] Among them, the specific process of establishing the basic topology structure of the graph is as follows:

[0060] I. Build a graph for all hairs at the level of hair particles rather than at the level of hair strands. Each node in the graph represents a hair particle, and the edges of the graph are initialized using the proximity between hair particles.

[0061] II. Let p_i(f) denote the position of the hair parti...
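The graph construction in Step 1 can be sketched in code. The patent does not publish its exact proximity threshold or weight formula, so the Gaussian trajectory-similarity weight and the `radius` and `sigma` parameters below are illustrative assumptions, not the claimed method:

```python
import numpy as np

def build_particle_graph(positions, radius=1.0, sigma=0.5):
    """Sketch of Step 1: a graph over hair particles (not strands).

    positions: array of shape (P, F, 3) -- P particles over F frames,
    so p_i(f) = positions[i, f] is particle i's position in frame f.
    Edges connect particles whose mean distance over all frames is
    below `radius` (the proximity initialization); the weight is a
    hypothetical Gaussian similarity of their frame-to-frame motion.
    """
    P, F, _ = positions.shape
    # per-frame displacements approximate each particle's motion trajectory
    motion = np.diff(positions, axis=1)          # shape (P, F-1, 3)
    edges = {}
    for i in range(P):
        for j in range(i + 1, P):
            # proximity test: mean inter-particle distance across frames
            dist = np.linalg.norm(positions[i] - positions[j], axis=1).mean()
            if dist < radius:
                # motion similarity: identical trajectories give weight 1
                diff = np.linalg.norm(motion[i] - motion[j], axis=1).mean()
                edges[(i, j)] = float(np.exp(-diff**2 / (2 * sigma**2)))
    return edges
```

For example, two neighboring particles that translate together receive an edge of weight 1, while a distant particle gets no edge at all.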



Abstract

The invention belongs to the technical field of dynamic simulation and editing of hair, and discloses a guided hair extraction method based on motion similarity. The method comprises the following steps: S1, input hair motion data and build a graph model that reflects the local motion similarity of hair strands; S2, partition the graph model into k disjoint sets by minimizing an energy function, grouping the hair strands; S3, extract the guide hair from each strand group using a strand selection algorithm that maximizes an energy function. By exploiting the motion similarity among hairs to extract guide hairs and build spatio-temporal neighborhood information, the method applies to dynamic spatio-temporal hair editing and offers advantages such as fast editing speed and high-fidelity results.
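Steps S2 and S3 can be illustrated with a stand-in partitioner. The patent minimizes a specific energy function to form the k groups and maximizes another to select guides; neither formula is reproduced here, so this sketch substitutes a small k-means over the rows of the similarity matrix for the grouping, and a greedy intra-group similarity mass for the guide selection:

```python
import numpy as np

def _farthest_point_init(X, k):
    """Deterministic, spread-out initialization for the k centers."""
    centers = [X[0]]
    for _ in range(1, k):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[int(np.argmax(d))])
    return np.array(centers, dtype=float)

def group_and_pick_guides(weights, k, iters=20):
    """Sketch of S2 + S3: partition particles into k disjoint groups,
    then pick one guide per group.

    weights: (P, P) symmetric motion-similarity matrix (S1 output).
    Particles with similar affinity patterns (similar matrix rows)
    land in the same group -- a stand-in for the patent's energy
    minimization, not the claimed algorithm.
    """
    X = np.asarray(weights, dtype=float)
    centers = _farthest_point_init(X, k)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # assign each particle to its nearest center
        labels = np.argmin(
            np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2), axis=1)
        for c in range(k):
            members = X[labels == c]
            if len(members):
                centers[c] = members.mean(axis=0)
    guides = []
    for c in range(k):
        idx = np.flatnonzero(labels == c)
        if idx.size:
            # guide = group member with the largest total similarity
            # to its own group (greedy proxy for the S3 maximization)
            intra = X[np.ix_(idx, idx)].sum(axis=1)
            guides.append(int(idx[np.argmax(intra)]))
    return labels, guides
```

On a block-diagonal similarity matrix (two bundles of particles that only move like their own bundle), the sketch recovers the two bundles and nominates one guide from each.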

Description

technical field

[0001] The invention belongs to the technical field of dynamic simulation and editing of hair (that is, editing hair motion data, extracting guide hairs, and applying them to hair spatio-temporal editing), and in particular relates to a method for extracting guide hairs based on motion similarity.

Background technique

[0002] Hair editing is one of the important research topics in computer graphics and virtual reality. At present, there are two main ways to implement hair editing: one is the geometry-based hair model editing method, which generates new hairstyles by editing the appearance of the hair model; in the other, the model itself is edited.

[0003] The principles and defects of the above two hair editing methods are analyzed as follows:

[0004] 1. Geometry-based hair model editing method

[0005] Editing and generating hairstyles from sketches is an early and common hair model editing method. Editing hair models by setting different parameter ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T19/20
CPC: G06T19/20
Inventor: 包永堂, 崔宾阁, 梁永全, 李哲
Owner SHANDONG UNIV OF SCI & TECH