
A Hairstyle Replacement Method Based on Generative Adversarial Network Model

A hairstyle replacement method and technology, applied to biological neural network models, neural learning methods, graphics and image conversion, etc., which addresses the problem that existing methods do not involve hairstyle features.

Active Publication Date: 2021-06-04
FUDAN UNIV

AI Technical Summary

Problems solved by technology

[0006] The model part of the present invention is based on the image GAN transfer networks of the above-mentioned papers for extracting facial features, but each of those papers extracts only one specific facial feature and does not involve hairstyle features.



Examples


Embodiment Construction

[0026] Step 1. Collection of hairstyle pictures and labeling of attribute categories;
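A minimal sketch of how the collected pictures and attribute labels from step 1 could be organized, assuming a simple CSV label file (rows of image filename and integer hairstyle category) read into a PyTorch Dataset; the file layout, the name "labels.csv", and the 64-category encoding are illustrative assumptions, not part of the patent text.

```python
# Sketch: labeled hairstyle dataset (assumed CSV rows: "filename,hairstyle_id").
import csv
from pathlib import Path

from PIL import Image
from torch.utils.data import Dataset


class HairstyleDataset(Dataset):
    """Loads (image, hairstyle attribute id) pairs from an image folder and a label CSV."""

    def __init__(self, image_dir, label_csv="labels.csv", transform=None):
        self.image_dir = Path(image_dir)
        self.transform = transform
        with open(label_csv, newline="") as f:
            # Each row: relative image path, integer hairstyle category (e.g. 0..63).
            self.samples = [(row[0], int(row[1])) for row in csv.reader(f)]

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        name, label = self.samples[idx]
        image = Image.open(self.image_dir / name).convert("RGB")
        if self.transform is not None:
            image = self.transform(image)
        return image, label
```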

[0027] Step 2. Face detection: crop out the face region of each picture so that it can be processed by the deep learning neural network in the next step;
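A minimal sketch of step 2, assuming an OpenCV Haar-cascade face detector; the patent does not name a specific detector, so the detector choice, crop margin, and output size below are assumptions.

```python
# Sketch: detect the largest face in an image and return a square crop around it.
import cv2


def crop_face(image_path, out_size=128, margin=0.2):
    """Detect the largest face, expand the box by a margin, and return a resized crop."""
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    image = cv2.imread(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None  # no face found
    x, y, w, h = max(faces, key=lambda box: box[2] * box[3])  # largest detection
    pad = int(margin * max(w, h))
    x0, y0 = max(x - pad, 0), max(y - pad, 0)
    x1, y1 = min(x + w + pad, image.shape[1]), min(y + h + pad, image.shape[0])
    crop = image[y0:y1, x0:x1]
    return cv2.resize(crop, (out_size, out_size))
```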

[0028] Step 3. Build the deep neural network; Figure 2 shows the structure of the designed deep neural network;
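Figure 2 is not reproduced here, so the following is only a hedged PyTorch sketch of the kind of encoder / decoder / discriminator / recognition structure that steps 3 and 4 describe; all layer sizes, the latent dimension, the 128x128 input resolution, and the 64 attribute classes are illustrative assumptions.

```python
# Sketch: encoder, attribute-conditioned decoder, discriminator, and attribute recognizer.
import torch
import torch.nn as nn


def conv_block(cin, cout):
    return nn.Sequential(nn.Conv2d(cin, cout, 4, 2, 1), nn.BatchNorm2d(cout), nn.ReLU(True))


def deconv_block(cin, cout):
    return nn.Sequential(nn.ConvTranspose2d(cin, cout, 4, 2, 1), nn.BatchNorm2d(cout), nn.ReLU(True))


class Encoder(nn.Module):  # image -> latent vector
    def __init__(self, z_dim=128):
        super().__init__()
        self.net = nn.Sequential(conv_block(3, 64), conv_block(64, 128),
                                 conv_block(128, 256), conv_block(256, 512))
        self.fc = nn.Linear(512 * 8 * 8, z_dim)  # assumes 128x128 inputs

    def forward(self, x):
        return self.fc(self.net(x).flatten(1))


class Decoder(nn.Module):  # latent vector + hairstyle attribute -> image
    def __init__(self, z_dim=128, n_attr=64):
        super().__init__()
        self.fc = nn.Linear(z_dim + n_attr, 512 * 8 * 8)
        self.net = nn.Sequential(deconv_block(512, 256), deconv_block(256, 128),
                                 deconv_block(128, 64),
                                 nn.ConvTranspose2d(64, 3, 4, 2, 1), nn.Tanh())

    def forward(self, z, attr_onehot):
        h = self.fc(torch.cat([z, attr_onehot], dim=1)).view(-1, 512, 8, 8)
        return self.net(h)


class Discriminator(nn.Module):  # image -> real/fake score, plus its feature map
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(conv_block(3, 64), conv_block(64, 128),
                                      conv_block(128, 256), conv_block(256, 512))
        self.head = nn.Linear(512 * 8 * 8, 1)

    def forward(self, x):
        f = self.features(x)
        return self.head(f.flatten(1)), f  # feature map reused for the reconstruction target


class Recognizer(nn.Module):  # image -> hairstyle attribute logits
    def __init__(self, n_attr=64):
        super().__init__()
        self.features = nn.Sequential(conv_block(3, 64), conv_block(64, 128),
                                      conv_block(128, 256), conv_block(256, 512))
        self.head = nn.Linear(512 * 8 * 8, n_attr)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))
```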

[0029] Step 4. Train the deep neural network. After the face pictures and the corresponding attribute information have been prepared, the deep network is trained. The input training image is encoded by the neural network to obtain a latent vector. From this latent vector and the specified hairstyle attributes, the decoding network produces a generated picture. The generated picture is fed to the discriminator network and the recognition network to learn the attributes corresponding to it, and the feature map of the discriminator network is used as the reconstruction...
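A hedged sketch of one training iteration following the description in step 4 (encode the input, decode with the specified hairstyle attributes, score the generated picture with the discriminator and the recognition network, and use the discriminator feature map as the reconstruction target), using the hypothetical modules from the previous sketch; the loss weights, optimizers, and exact adversarial formulation are assumptions and are not specified in the text.

```python
# Sketch: one training step for the generator (encoder + decoder) and the discriminator.
import torch
import torch.nn.functional as F


def train_step(enc, dec, disc, recog, opt_g, opt_d, images, target_attrs, n_attr=64):
    onehot = F.one_hot(target_attrs, n_attr).float()

    # Generator side: encode, then decode with the desired hairstyle attribute.
    z = enc(images)
    fake = dec(z, onehot)
    fake_score, fake_feat = disc(fake)
    _, real_feat = disc(images)

    adv_loss = F.binary_cross_entropy_with_logits(fake_score, torch.ones_like(fake_score))
    attr_loss = F.cross_entropy(recog(fake), target_attrs)   # recognition network guides attributes
    rec_loss = F.l1_loss(fake_feat, real_feat.detach())      # discriminator feature map as reconstruction target
    g_loss = adv_loss + attr_loss + 10.0 * rec_loss          # 10.0 is an assumed weight

    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()

    # Discriminator side: real vs. generated pictures.
    # (The recognition network is assumed to be pretrained or updated elsewhere.)
    real_score, _ = disc(images)
    fake_score, _ = disc(fake.detach())
    d_loss = (F.binary_cross_entropy_with_logits(real_score, torch.ones_like(real_score))
              + F.binary_cross_entropy_with_logits(fake_score, torch.zeros_like(fake_score)))

    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()
    return g_loss.item(), d_loss.item()
```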



Abstract

The invention belongs to the technical field of computer image processing, and in particular relates to a hairstyle replacement method based on a generative adversarial network model. Automatically changing hairstyles has many practical applications in classification and image editing. The invention addresses this requirement by first providing a new large-scale hairstyle dataset, HAIRSTYLE30k, which contains 64 different types of hairstyles across 30K images, and by providing a model, H‑GAN, that automatically generates and modifies hairstyles to achieve automatic hairstyle replacement. The invention improves on the basic generative adversarial network model and can efficiently learn new datasets: it not only performs well on the base dataset but also generalizes well to new datasets.

Description

Technical field

[0001] The invention belongs to the technical field of computer image processing, and in particular relates to a hairstyle replacement method based on a generative adversarial network model.

Background technique

[0002] Hairstyles can express one's personality, confidence, and attitude, and are therefore an important aspect of personal appearance. With the rapid development of multimedia technology, people urgently need a method that can automatically identify and change hairstyles, and computer vision technology makes this demand feasible. A computer vision model can automatically identify, analyze, and modify the hairstyle in a picture of a person, for which there is great practical demand: customers can try different hairstyles through the model according to their preferences, compare the effects, and then go to the barber to achieve the most satisfactory hairstyle.

[0003] At present, t...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T3/00; G06N3/08; G06K9/00
CPC: G06N3/084; G06T3/0012; G06V40/161
Inventors: 付彦伟, 尹伟东, 马一清, 姜育刚, 薛向阳
Owner: FUDAN UNIV