
Generative adversarial network model-based hairstyle changing method

A hairstyle changing method, applied to biological neural network models, neural learning methods, graphics and image conversion, etc.; it addresses the problem that existing approaches do not handle hairstyle features.

Active Publication Date: 2017-12-29
FUDAN UNIV

AI Technical Summary

Problems solved by technology

[0006] The model part of the present invention builds on the image-GAN transfer networks of the above-mentioned papers to extract facial features; however, each of those papers extracts only a specific facial feature and does not involve hairstyle features.



Examples


Embodiment Construction

[0026] Step 1. Collection of hairstyle pictures and labeling of attribute categories;
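
As an illustration of Step 1, the sketch below pairs each collected picture with its hairstyle attribute label, assuming a simple CSV of filename/attribute-id pairs. The patent does not prescribe any storage format, so the file layout and names here are hypothetical.

# Illustrative sketch of Step 1: pairing collected pictures with their
# hairstyle attribute labels.  The CSV layout ("filename,attribute_id")
# is an assumption, not specified by the patent.
import csv

def load_labels(csv_path):
    """Return a list of (image_filename, attribute_id) pairs."""
    samples = []
    with open(csv_path, newline="") as f:
        for filename, attribute_id in csv.reader(f):
            samples.append((filename, int(attribute_id)))
    return samples

# e.g. samples = load_labels("hairstyle_labels.csv")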

[0027] Step 2. Face detection: crop out the face region from each picture so that it can be processed by the deep learning neural network in the next step;
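
A minimal sketch of Step 2 is given below, using OpenCV's bundled Haar-cascade frontal-face detector. The detector choice, crop margin, and output size are assumptions for illustration; the patent does not name a specific face-detection algorithm.

# Sketch of Step 2: detect the face and crop an enlarged square region so
# the hair around the face is kept.  Detector, margin and size are assumed.
import cv2

def crop_face(image_path, out_size=128, margin=0.3):
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None                                   # no face; skip picture
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest detection
    pad_w, pad_h = int(w * margin), int(h * margin)     # keep hair region
    y0, y1 = max(0, y - pad_h), min(img.shape[0], y + h + pad_h)
    x0, x1 = max(0, x - pad_w), min(img.shape[1], x + w + pad_w)
    return cv2.resize(img[y0:y1, x0:x1], (out_size, out_size))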

[0028] Step 3. Build the deep neural network; Figure 2 shows the structure of the designed deep neural network;

[0029] Step 4. Train the deep neural network. After the face pictures and the corresponding attribute information have been prepared, the deep network is trained. The input training image is encoded by the neural network to obtain a latent vector. From this latent vector and the specified hairstyle attribute, the decoding network produces a generated picture. The generated picture is then fed to the discriminant network and the recognition network so that its corresponding attributes are learned, and the feature map of the discriminant network is used as the reconstruction...
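
The sketch below illustrates the training flow described in paragraph [0029] as a single generator-side loss computation in PyTorch: the encoder produces a latent vector, the decoder generates an image from that vector plus the specified hairstyle attribute, and the generated image is scored by a discriminant network and a recognition network, with the discriminator's feature map serving as the reconstruction target. All network sizes, loss weights, and the choice of L1 for the feature-map reconstruction are assumptions; only the overall data flow follows the text.

# Hedged sketch of the generator-side training step described in [0029].
# Network sizes and losses are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

N_ATTR = 64           # hairstyle categories in HAIRSTYLE30k
Z_DIM, IMG = 128, 64  # latent size and image resolution (assumptions)

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, 2, 1), nn.ReLU(),
            nn.Conv2d(64, 128, 4, 2, 1), nn.ReLU(),
            nn.Flatten(), nn.Linear(128 * (IMG // 4) ** 2, Z_DIM))

    def forward(self, x):          # image -> latent vector
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(Z_DIM + N_ATTR, 128 * (IMG // 4) ** 2)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, 2, 1), nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, 2, 1), nn.Tanh())

    def forward(self, z, attr):    # latent + hairstyle attribute -> image
        h = self.fc(torch.cat([z, attr], dim=1))
        return self.net(h.view(-1, 128, IMG // 4, IMG // 4))

class Discriminator(nn.Module):
    """Returns a real/fake score and an intermediate feature map."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 64, 4, 2, 1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 128, 4, 2, 1), nn.LeakyReLU(0.2))
        self.head = nn.Linear(128 * (IMG // 4) ** 2, 1)

    def forward(self, x):
        f = self.features(x)
        return self.head(f.flatten(1)), f

class Recognizer(nn.Module):
    """Predicts the hairstyle attribute category of an image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, 2, 1), nn.ReLU(),
            nn.Conv2d(64, 128, 4, 2, 1), nn.ReLU(),
            nn.Flatten(), nn.Linear(128 * (IMG // 4) ** 2, N_ATTR))

    def forward(self, x):
        return self.net(x)

E, G, D, R = Encoder(), Decoder(), Discriminator(), Recognizer()
bce, ce, l1 = nn.BCEWithLogitsLoss(), nn.CrossEntropyLoss(), nn.L1Loss()

def generator_losses(real_img, target_attr_idx):
    """Encode, decode with the target attribute, then score the result with
    the discriminant and recognition networks."""
    attr = F.one_hot(target_attr_idx, N_ATTR).float()
    z = E(real_img)                      # latent vector of the input face
    fake = G(z, attr)                    # picture with the requested hairstyle
    score_fake, feat_fake = D(fake)
    _, feat_real = D(real_img)
    adv = bce(score_fake, torch.ones_like(score_fake))  # fool the discriminator
    attr_loss = ce(R(fake), target_attr_idx)            # learn the attribute
    recon = l1(feat_fake, feat_real)     # feature-map reconstruction loss
    return adv + attr_loss + recon

# Dummy usage: a batch of four 64x64 RGB pictures with random target styles.
imgs = torch.randn(4, 3, IMG, IMG)
attrs = torch.randint(0, N_ATTR, (4,))
generator_losses(imgs, attrs).backward()

In a full training loop the discriminant and recognition networks would also be updated with their own losses on real and generated pictures; only the generator-side terms named in the text are sketched here.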



Abstract

The present invention belongs to the technical field of computer image processing and relates to a generative adversarial network model-based hairstyle changing method. Automatic hairstyle changing has many practical applications in classification and image editing. The method provides a new large hairstyle data set, HAIRSTYLE30k, containing 30K images that cover 64 different hairstyle types, together with a hairstyle automatic generation and modification model, H-GAN, to realize automatic hairstyle change. The method improves on the basic generative adversarial network model so that new data sets can be learned efficiently; it not only performs well on a basic data set but also generalizes well to new data sets.

Description

Technical field

[0001] The invention belongs to the technical field of computer image processing, and in particular relates to a hairstyle changing method based on a generative adversarial network model.

Background technique

[0002] A hairstyle can express one's personality, confidence and attitude, and is therefore an important aspect of personal appearance. With the rapid development of multimedia technology today, people urgently need a method that can automatically identify and change hairstyles, and computer vision technology makes this demand realizable. Through a computer vision model it is possible to automatically identify, analyze and modify the hairstyle in pictures of people, for which there is great practical demand. Customers can try different hairstyles through the computer model according to their preferences, compare the effects, and then go to the barber to achieve the most satisfactory hairstyle.

[0003] At present, t...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (IPC-8): G06T3/00, G06N3/08, G06K9/00
CPC: G06N3/084, G06V40/161, G06T3/04
Inventors: 付彦伟, 尹伟东, 马一清, 姜育刚, 薛向阳
Owner: FUDAN UNIV