Virtual try-on method and device based on posture guidance

A virtual try-on technology based on posture guidance, applied in the field of virtual try-on, which addresses problems such as the spatial mismatch between input and output caused by pose transformation, the inability to guarantee that the clothes reasonably cover the target body while the person's characteristics outside the try-on area are preserved, and the stiff coverage of the collar part on the human body, achieving the effect of maintaining the clothing's color and texture characteristics.

Active Publication Date: 2019-09-06
SHANDONG UNIV

AI Technical Summary

Problems solved by technology

[0006] 1) Pose transformation causes a spatial mismatch between the input and output. It cannot be guaranteed that, while the pose is changed, the clothes are reasonably fitted onto the target body and, at the same time, the person's characteristics in regions other than the try-on area are preserved.
[0007] 2) Without an additional body shape input, the body shape of the target human body is unknown, so the clothes worn on the human body cannot be deformed according to the target human body.
[0008] 3) The collar part is rigidly pasted onto the human body, making the resulting try-on effect unnatural and stiff.



Examples


Embodiment 1

[0037] As shown in Figure 1, the virtual try-on method based on posture guidance in this embodiment includes:

[0038] S101: Extract features of the posture key point information of the target human body and features of the original body shape, and use them to predict the body shape of the target human body.

[0039] In a specific implementation, the process of predicting the body shape of the target human body in step S101 includes:

[0040] S1011: Use the target body pose key point information and the original body shape information to construct a target body shape prediction network:

[0041] Ŝ_B = G_shape(S_A, P_B; Θ_p)

[0042] where Ŝ_B represents the target body shape aligned with the target human body pose key points P_B, S_A represents a mask of the original body shape, and Θ_p represents the network parameters.

[0043] In this embodiment, an encoder-decoder structure is used to construct the target human body shape prediction network, taking the cascade (channel-wise concatenation) of S_A and P_B as input. Specifically, a...
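The following is a minimal, illustrative sketch of such an encoder-decoder shape prediction network. The layer counts, channel widths, the number of pose key points (18), and the class name ShapePredictionNet are assumptions for illustration and are not specified by this embodiment.

```python
# Illustrative sketch of the target body shape prediction network (assumed
# details): an encoder-decoder that takes the original body-shape mask S_A
# cascaded with target-pose keypoint heatmaps P_B and predicts the target
# body-shape mask.
import torch
import torch.nn as nn

class ShapePredictionNet(nn.Module):
    def __init__(self, num_keypoints=18):
        super().__init__()
        in_ch = 1 + num_keypoints  # S_A mask (1 channel) + P_B heatmaps
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, 64, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(128, 256, 4, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(256, 128, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, 1, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, s_a, p_b):
        # Cascade S_A and P_B along the channel dimension, then encode and
        # decode to obtain the predicted target body-shape mask.
        x = torch.cat([s_a, p_b], dim=1)
        return self.decoder(self.encoder(x))

# Example: s_a of shape (N, 1, 256, 192) and p_b of shape (N, 18, 256, 192)
# yield a predicted mask of shape (N, 1, 256, 192).
```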

Embodiment 2

[0097] A virtual try-on device based on posture guidance in this embodiment includes:

[0098] (1) A target human body shape prediction module, which is used to extract features of the posture key point information of the target human body and features of the original body shape, and to use them to predict the body shape of the target human body;

[0099] Specifically, in the target human body shape prediction module, an encoder-decoder structure is used to construct the target human body shape prediction network.

[0100] (2) A clothing and target human body matching module, which is used to extract feature information of the clothing with a first convolutional neural network, to extract cascaded feature information of the predicted target body shape and the target human body's pose key points with a second convolutional neural network, to calculate the matching score between the clothing and the target human body, and to obtain the deformed cl...
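As an illustration of the matching module described above, the sketch below uses two small convolutional feature extractors, a correlation layer that produces matching scores between clothing and body locations, and a simple affine warp driven by those scores. The specific layers, the affine parameterization (a thin-plate-spline warp could be used instead), and names such as ClothBodyMatcher are assumptions, not the patent's exact design.

```python
# Illustrative sketch of the clothing / target-body matching step (assumed
# details): a first CNN encodes the clothing image, a second CNN encodes the
# cascade of the predicted target body shape and the target pose heatmaps,
# a correlation layer yields matching scores, and the scores drive a warp
# that deforms the clothing.
import torch
import torch.nn as nn
import torch.nn.functional as F

def feature_extractor(in_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(128, 256, 3, stride=2, padding=1), nn.ReLU(inplace=True),
    )

class ClothBodyMatcher(nn.Module):
    def __init__(self, num_keypoints=18):
        super().__init__()
        self.cloth_cnn = feature_extractor(3)                 # first CNN: clothing image
        self.body_cnn = feature_extractor(1 + num_keypoints)  # second CNN: S_B + P_B cascade
        self.regressor = nn.Sequential(                       # matching scores -> warp params
            nn.LazyConv2d(64, 3, padding=1), nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 6),
        )

    def forward(self, cloth, body_repr):
        # cloth and body_repr are assumed to share the same spatial size.
        fc = F.normalize(self.cloth_cnn(cloth), dim=1)
        fb = F.normalize(self.body_cnn(body_repr), dim=1)
        n, c, h, w = fc.shape
        # Matching score between every body location and every clothing location.
        score = torch.bmm(fb.view(n, c, h * w).transpose(1, 2), fc.view(n, c, h * w))
        theta = self.regressor(score.view(n, h * w, h, w)).view(n, 2, 3)
        # Deform the clothing based on the regressed parameters.
        grid = F.affine_grid(theta, cloth.size(), align_corners=False)
        warped_cloth = F.grid_sample(cloth, grid, align_corners=False)
        return warped_cloth, score
```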

Embodiment 3

[0108] This embodiment provides a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, it implements the steps of the posture-guided virtual try-on method shown in Figure 1.

[0109] This embodiment addresses the online virtual try-on task based on 2D pictures. To generate a more realistic try-on result, this embodiment uses a first convolutional neural network to extract feature information of the clothing, and a second convolutional neural network to extract cascaded feature information of the predicted target body shape and the target body's pose key points; a matching score between the clothing and the target body is calculated, and the deformed clothing is obtained based on the score. The deformed clothing feature information and the cascaded feature information of the original body shape and the target body's pose key points are then input into the bidirectional generative adversarial network based on the attention mechani...
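A minimal sketch of the attention-based synthesis step is given below: a generator takes the deformed clothing together with the cascade of the original body shape mask and the target pose heatmaps, predicts a coarse rendering plus an attention mask, and composes the two so that the deformed clothing's color and texture are preserved. The adversarial discriminators and the bidirectional (cycle) branch of the GAN are omitted here, and all layer sizes and names such as AttentionTryOnGenerator are assumptions.

```python
# Illustrative sketch of the attention-based synthesis (assumed details): the
# generator consumes the deformed clothing plus the cascade of the original
# body shape mask S_A and target pose heatmaps P_B, and outputs a try-on image.
import torch
import torch.nn as nn

class AttentionTryOnGenerator(nn.Module):
    def __init__(self, num_keypoints=18):
        super().__init__()
        in_ch = 3 + 1 + num_keypoints  # warped cloth + S_A mask + P_B heatmaps
        self.backbone = nn.Sequential(
            nn.Conv2d(in_ch, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.render_head = nn.Conv2d(64, 3, 3, padding=1)  # coarse person rendering
        self.attn_head = nn.Conv2d(64, 1, 3, padding=1)    # attention (composition) mask

    def forward(self, warped_cloth, s_a, p_b):
        feat = self.backbone(torch.cat([warped_cloth, s_a, p_b], dim=1))
        coarse = torch.tanh(self.render_head(feat))
        attn = torch.sigmoid(self.attn_head(feat))
        # Attention-weighted composition keeps the deformed clothing's color
        # and texture in the clothing region of the synthesized portrait.
        return attn * warped_cloth + (1.0 - attn) * coarse
```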



Abstract

The invention provides a virtual try-on method and device based on posture guidance. The virtual try-on method based on posture guidance comprises the following steps: extracting and utilizing features of the posture key point information of a target human body and features of the original human body shape to predict the body shape of the target human body; utilizing a first convolutional neural network to extract feature information of the clothes, utilizing a second convolutional neural network to extract cascaded feature information of the predicted target body shape and the target human body's posture key points, calculating a matching score between the clothes and the target human body, and obtaining the deformed clothes based on the score; and inputting the deformed clothing feature information and the cascaded feature information of the original human body shape and the target human body's posture key points into a bidirectional generative adversarial network based on an attention mechanism, finally outputting a try-on synthesized portrait that keeps the same data distribution as the original human body shape.

Description

Technical Field

[0001] The present disclosure belongs to the field of virtual try-on, and in particular relates to a virtual try-on method and device based on posture guidance.

Background Technique

[0002] The statements in this section merely provide background information related to the present disclosure and do not necessarily constitute prior art.

[0003] In recent years, with the development of multimedia technology, research on the online virtual try-on task has received more and more attention. Traditional online virtual try-on systems are mainly based on computer graphics and perform 3D modeling of the human body or clothes, but 3D modeling usually requires expensive scanner equipment, which is not feasible for many ordinary people.

[0004] Clothes on online shopping platforms are usually flat and cannot match a person's body shape. Reasonable geometric bending of the clothes enables the clothes to be adaptively matched with the body shape of the con...


Application Information

Patent Type & Authority: Application (China)
IPC (IPC8): G06T11/00; G06K9/62; G06N3/04
CPC: G06T11/001; G06N3/045; G06F18/22
Inventor: 刘东岳, 宋雪萌, 郑娜, 陈召峥, 聂礼强, 关惟俐
Owner: SHANDONG UNIV