
CNN-based user interactive image local costume style migration method

An interactive clothing-image technology, applied in image enhancement, image analysis, image data processing and related fields. It addresses the difficulty of blending clothing with new styles, the lack of semantic and depth information about the content image, and the ineffectiveness of existing methods, achieving effective style transfer.

Pending Publication Date: 2020-10-13
BEIJING TECHNOLOGY AND BUSINESS UNIVERSITY
Cites: 0 · Cited by: 5

AI Technical Summary

Problems solved by technology

In early computer vision, image stylization was generally treated as an extension of texture synthesis: new images were generated through texture modeling, but the quality of images produced this way is not high.

Some studies have also tried using generative adversarial networks (GANs) for style transfer. Although good transfer results have been achieved, GAN-based methods are unstable: their generation is too unconstrained, and the solution space must be well constrained before they can stably produce reasonable results. Moreover, a generative adversarial network is a data-driven method that presupposes a large amount of data; when the amount of data cannot be satisfied, the method is difficult to apply effectively.

In recent years, research on image style transfer has mainly focused on mapping content and style to features with a convolutional neural network and iteratively generating a new image by continuously reducing the content-feature and style-feature losses, with good results. However, content details are not well preserved during this process, and the semantic and depth information contained in the content image is not exploited.

If such a method is applied directly to clothing style transfer, the resolution of the generated clothing image is very low, the shape of the clothing is deformed while the original clothing colors are retained, and the style is transferred irregularly onto the background instead of the local clothing region, so the clothing is difficult to blend with the new style.
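The CNN approach described above iteratively updates a generated image to minimize a content loss and a style loss computed on feature maps. The patent excerpt does not specify the network or loss weights; as a minimal illustrative sketch, assuming raw feature arrays are already available, the standard content loss and Gram-matrix style loss look like this:

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a (channels, height, width) feature map.
    Inner products between channels capture style (texture) statistics."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.T / (c * h * w)

def content_loss(gen_feat, content_feat):
    # Squared error between generated-image and content-image features.
    return 0.5 * np.sum((gen_feat - content_feat) ** 2)

def style_loss(gen_feat, style_feat):
    # Squared error between Gram matrices of generated and style features.
    return np.sum((gram_matrix(gen_feat) - gram_matrix(style_feat)) ** 2)
```

In an actual transfer, these losses are evaluated on feature maps from several CNN layers and minimized by gradient descent on the pixels of the generated image.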




Embodiment Construction

[0068] The technical solutions in the embodiments of the present invention will be clearly and completely described below in conjunction with the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only part of the embodiments of the present invention, not all of them. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without making creative efforts belong to the protection scope of the present invention.

[0069] The present invention provides a CNN-based user interactive image local clothing style transfer method, as shown in Figure 1, comprising the following steps:

[0070] Step 1: Take a clothing image as the content image and another picture as the style image, and input both into a CNN for feature mapping to obtain content features and style features;
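The feature mapping of Step 1 is in practice produced by a pretrained deep CNN (the specific network is not named in this excerpt; VGG is common in style-transfer work). As a simplified, self-contained illustration of what a single convolutional feature-mapping layer computes:

```python
import numpy as np

def conv2d_feature_map(image, kernels):
    """One convolutional feature-mapping layer: valid cross-correlation
    followed by ReLU. image: (H, W); kernels: (K, kh, kw).
    Returns a (K, H-kh+1, W-kw+1) stack of feature maps."""
    k, kh, kw = kernels.shape
    h, w = image.shape
    out = np.zeros((k, h - kh + 1, w - kw + 1))
    for i in range(h - kh + 1):
        for j in range(w - kw + 1):
            patch = image[i:i + kh, j:j + kw]
            out[:, i, j] = np.tensordot(kernels, patch, axes=([1, 2], [0, 1]))
    return np.maximum(out, 0.0)  # ReLU non-linearity
```

A deep network stacks many such layers; Step 1 runs both images through the CNN and reads off intermediate activations as content and style features.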

[0071] Step 2: Use the GrabCut algorithm for interactive image segmentation: frame and extract the local clothing with a user-drawn rectangle, generating a local clothing contour map;
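GrabCut (available in OpenCV as `cv2.grabCut`) is initialized from the user-drawn rectangle: pixels outside the rectangle are labeled sure background and pixels inside probable foreground, then iteratively refined. A minimal sketch of this rectangle initialization, with label values following OpenCV's conventions:

```python
import numpy as np

# GrabCut label values (OpenCV convention): 0 = sure background,
# 1 = sure foreground, 2 = probable background, 3 = probable foreground.
GC_BGD, GC_PR_FGD = 0, 3

def init_mask_from_rect(shape, rect):
    """Initial GrabCut mask from a user-drawn rectangle (x, y, w, h):
    everything outside the rectangle is sure background, everything
    inside is probable foreground, to be refined by GrabCut iterations."""
    h, w = shape
    x, y, rw, rh = rect
    mask = np.full((h, w), GC_BGD, dtype=np.uint8)
    mask[y:y + rh, x:x + rw] = GC_PR_FGD
    return mask
```

With OpenCV, the refinement itself is roughly `cv2.grabCut(img, mask, rect, bgd, fgd, 5, cv2.GC_INIT_WITH_RECT)`; the resulting foreground mask yields the local clothing contour map.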


Abstract

The invention discloses a CNN-based user interactive image local costume style migration method. The method comprises the steps of (1) inputting a content image and a style image into a CNN for feature mapping to obtain content features and style features; (2) interactively segmenting the content image with the GrabCut algorithm, framing and extracting the local clothing with a rectangle, and generating a local clothing contour map; (3) converting the contour map into a binary map and applying a distance transform to generate a distance-transform matrix; (4) increasing the contrast between the inside and outside of the local clothing contour with a power operation to form contour features; (5) computing the content loss, style loss, and contour loss of a random noise image from these features; and (6) combining the three losses and adding a regularization term to smooth and denoise the boundary region. By adopting user interaction and introducing a contour loss, the method preserves the shape of the clothing, limits the style-transfer region, and efficiently achieves style transfer of local clothing.
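Steps (3)–(5) of the abstract — distance transform of the binary contour map, a power operation, and a contour loss — can be sketched as follows. The BFS distance shown here is a simplified stand-in for an exact Euclidean distance transform, and the contour-loss form is a hypothetical distance-weighted penalty, since the excerpt does not give the exact formula:

```python
import numpy as np
from collections import deque

def distance_transform(binary):
    """4-neighbour BFS distance (in pixels) from every pixel to the nearest
    contour pixel (binary == 1). A simplified stand-in for an exact
    Euclidean distance transform."""
    h, w = binary.shape
    dist = np.full((h, w), np.inf)
    q = deque()
    for i in range(h):
        for j in range(w):
            if binary[i, j]:
                dist[i, j] = 0.0
                q.append((i, j))
    while q:
        i, j = q.popleft()
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < h and 0 <= nj < w and dist[ni, nj] > dist[i, j] + 1:
                dist[ni, nj] = dist[i, j] + 1
                q.append((ni, nj))
    return dist

def contour_feature(binary, p=2.0):
    # Step (4): a power operation widens the gap between pixels near the
    # clothing contour and pixels far from it.
    return distance_transform(binary) ** p

def contour_loss(generated, reference, contour_feat):
    # Hypothetical distance-weighted squared error: deviations far from
    # the contour are penalised more, discouraging style leakage.
    return np.sum(contour_feat * (generated - reference) ** 2)
```

In production code an exact transform such as `scipy.ndimage.distance_transform_edt` would replace the BFS; the combined objective of step (6) adds this contour loss to the content and style losses plus a smoothing regularizer.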

Description

Technical field

[0001] The invention relates to the technical field of image processing and recognition, and in particular to a CNN-based user interactive image local clothing style transfer method.

Background technique

[0002] Image style transfer refers to extracting the style from one image and applying it to another image. In early computer vision, image stylization was generally treated as an extension of texture synthesis, with new images generated through texture modeling, but the image quality produced by this approach is not high. Some studies have also tried generative adversarial networks (GANs) for style transfer. Although good transfer results have been achieved, GAN-based methods are unstable: their generation is too unconstrained, and the solution space must be well constrained before they can stably produce reasonable results. Moreover, as a data-driven method, a generative adversarial network requires a large amount of data; when the amount of data cannot be satisfied, the method is difficult to apply effectively.


Application Information

IPC (8): G06T3/00, G06T7/11, G06T7/194, G06T7/13, G06T11/00, G06N3/04
CPC: G06T7/11, G06T7/194, G06T7/13, G06T11/001, G06T2207/10004, G06T2207/20076, G06T2207/20081, G06T2207/20084, G06T2207/20192, G06N3/045, G06T3/04, Y02P90/30
Inventor: 熊海涛, 王涵颍, 蔡圆媛
Owner BEIJING TECHNOLOGY AND BUSINESS UNIVERSITY