Two-dimensional virtual fitting method based on neural network

A neural-network-based virtual fitting technology in the fields of computer vision and image processing. It addresses problems of prior methods such as loss of clothing texture, complicated procedures, and low applicability, and achieves the effects of improved efficiency and low cost.

Active Publication Date: 2020-02-28
SUN YAT SEN UNIV


Problems solved by technology

A drawback of one prior invention is that its 3D modeling is difficult, the hardware cost is high, 3D scanning is required, applicability is low, and the procedure is relatively complicated.
A drawback of another prior invention is that its combination of a neural network with thin-plate spline transformation can easily cause loss of clothing texture and distortion of clothing shape.


Image

  • Two-dimensional virtual fitting method based on neural network


Embodiment Construction

[0026] The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only a part of the embodiments of the present invention, but not all of the embodiments. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative efforts shall fall within the protection scope of the present invention.

[0027] Figure 1 is the overall flowchart of the two-dimensional virtual fitting method of the embodiment of the present invention. As shown in Figure 1, the method includes:

[0028] S1: select and input the original person image and the target clothing image from the clothing data set, and process them to a uniform size;

[0029] S2: further process the original person image that has been resized to a uniform size to g...
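Step S1 above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the 192×256 target resolution is an assumption (a size commonly used by 2D virtual try-on datasets; the excerpt does not state one), and the function and file names are hypothetical.

```python
from PIL import Image

# Hypothetical target resolution; the patent excerpt does not specify one.
TARGET_SIZE = (192, 256)  # (width, height)

def preprocess_pair(person_path: str, clothing_path: str, size=TARGET_SIZE):
    """Load an original person image and a target clothing image and
    resize both to a uniform size, as in step S1."""
    person = Image.open(person_path).convert("RGB").resize(size, Image.BILINEAR)
    clothing = Image.open(clothing_path).convert("RGB").resize(size, Image.BILINEAR)
    return person, clothing
```

Resizing both inputs to one fixed resolution lets every downstream network in the pipeline operate on tensors of a single, known shape.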


Abstract

The invention discloses a two-dimensional virtual fitting method based on a neural network. The method comprises the steps of inputting an original person image and a target clothing image; extracting a human body contour map, a human body joint-point map, and a human body parsing map; generating a target human body parsing map through an encoder-decoder network; using a convolutional neural network to generate a deformed clothing image; using the encoder-decoder network to generate a coarse result image and a clothing mask image; and recombining the coarse result image and the deformed clothing image through the clothing mask to generate the final effect image. The method applies a deep-learning algorithm to two-dimensional pictures; compared with expensive three-dimensional acquisition hardware and computationally heavy three-dimensional calculation, it has low cost and high efficiency. The target human body parsing map generated by the encoder-decoder structure guides the neural network to preserve the features of each part of the person in the original picture to the maximum extent, and the convolutional neural network performs clothing deformation on the target clothing image so that its texture information can be preserved to the maximum extent.
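The final recombination step described in the abstract (merging the coarse result image and the deformed clothing image through the clothing mask) corresponds, in the formulation commonly used by 2D try-on pipelines, to a per-pixel convex combination. The sketch below assumes that formulation; the array names are illustrative and not taken from the patent.

```python
import numpy as np

def compose_final(coarse: np.ndarray, warped_clothing: np.ndarray,
                  mask: np.ndarray) -> np.ndarray:
    """Recombine the coarse result image and the deformed (warped)
    clothing image through the clothing mask:

        final = mask * warped_clothing + (1 - mask) * coarse

    `mask` holds values in [0, 1], with 1 where the warped clothing
    should appear. Expected shapes: (H, W, 3) for the two images and
    (H, W, 1) for the mask, which broadcasts across the color channels."""
    return mask * warped_clothing + (1.0 - mask) * coarse
```

Because the mask is soft rather than binary, pixels near clothing boundaries blend the warped clothing texture with the coarse synthesis, which is what lets the pipeline keep the target clothing's texture while leaving the rest of the person image to the generator.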

Description

Technical field

[0001] The invention relates to the fields of computer vision and image processing, and in particular to a neural-network-based two-dimensional virtual fitting method.

Background technique

[0002] With the development of Internet technology, online shopping is becoming more and more popular. Compared with shopping in physical stores, online shopping offers a wide variety of products and convenient purchasing. Among all commodity categories, clothing occupies an important proportion. But online clothing shopping also has drawbacks. Unlike physical stores, where consumers can try on clothes and check their effect in real time, online clothing shopping cannot provide renderings of the consumers themselves, so consumers cannot intuitively see in real time how clothing affects their own image. Virtual fitting technology can solve this problem by providing consumers with clothing renderings.

[0003] Virtual...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T3/00; G06T5/50; G06T7/11; G06T7/40; G06Q30/06; G06N3/04; G06N3/08
CPC: G06Q30/0643; G06T3/0012; G06T7/40; G06T7/11; G06T5/50; G06N3/08; G06N3/045; Y02P90/30
Inventors: 苏卓, 孙峰, 周凡
Owner: SUN YAT SEN UNIV