
Virtual fitting method, system and device and storage medium

A virtual fitting and human-model technology, applied to neural learning methods, image data processing, biological neural network models, etc. It addresses problems such as poor clothing matching and realism and over-reliance on a fixed distance between clothes and skin, achieving good accuracy and authenticity, accurate linkage, and an improved motion-matching effect.

Pending Publication Date: 2022-03-01
北京陌陌信息技术有限公司 (Beijing Momo Information Technology Co., Ltd.)
Cites: 0 · Cited by: 7

AI Technical Summary

Problems solved by technology

However, this skinning method focuses too heavily on maintaining a fixed distance between the clothes and the skin. While this gives it an advantage in fitting speed, it has a large disadvantage in clothing matching and realism, making it suitable only for garments that need to be processed quickly and simply, moving rigidly with the skin mesh.
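As a hedged illustration (not the patent's own implementation), the fixed-distance skinning criticized above can be sketched as binding each garment vertex to its nearest skin vertex and preserving the rest-pose offset forever. The function names and toy geometry here are illustrative assumptions:

```python
import numpy as np

def bind_garment(garment_rest, skin_rest):
    """For each garment vertex, record the nearest skin vertex and the
    fixed rest-pose offset to it (the 'fixed distance' binding)."""
    # pairwise distances between every garment vertex and every skin vertex
    d = np.linalg.norm(garment_rest[:, None, :] - skin_rest[None, :, :], axis=-1)
    nearest = d.argmin(axis=1)                   # index of the closest skin vertex
    offsets = garment_rest - skin_rest[nearest]  # constant offset per garment vertex
    return nearest, offsets

def skin_garment(skin_posed, nearest, offsets):
    """Move each garment vertex rigidly with its bound skin vertex.
    Fast, but the cloth never drapes, swings, or stretches."""
    return skin_posed[nearest] + offsets

# toy example: 2 garment vertices bound over a 3-vertex 'skin'
skin_rest = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
garment_rest = np.array([[0.1, 0.2, 0.0], [1.9, 0.2, 0.0]])
nearest, offsets = bind_garment(garment_rest, skin_rest)

skin_posed = skin_rest + np.array([0.0, 1.0, 0.0])  # translate the whole skin
posed_garment = skin_garment(skin_posed, nearest, offsets)
```

Because the offsets never change, the garment simply copies whatever rigid motion its bound skin vertices undergo, which is exactly the realism limitation the patent targets.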




Embodiment Construction

[0056] The characteristics and exemplary embodiments of various aspects of the present invention are described in detail below. To make the purpose, technical solutions, and advantages of the present invention clearer, the invention is further described in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are intended only to explain the present invention, not to limit it. It will be apparent to one skilled in the art that the present invention may be practiced without some of these specific details. The following description of the embodiments is provided only to aid understanding of the present invention by way of example.

[0057] It should be noted that, in this document, relational terms such as "first" and "second" are used only to distinguish one entity or operation from another, and do ...



Abstract

The invention discloses a virtual fitting method comprising the following steps: making a three-dimensional model of a garment from a two-dimensional image of the garment; fitting the three-dimensional garment model to a three-dimensional standard human body model; obtaining a two-dimensional image of a target human body; inputting the obtained three-dimensional human body parameters into the standard human body model for fitting; and matching the garment model to the resulting three-dimensional human body model, whose posture is kept essentially consistent with that of the target human body. The invention provides a method for obtaining accurate three-dimensional human body model parameters by analyzing a single full-body photo with a deep neural network, so that body modeling can be performed quickly from one ordinary photo. Meanwhile, the three-dimensional garment model is produced in advance from the two-dimensional garment picture; the user does not need to participate in this background work and only needs to select the garment style to try on virtually. The system automatically matches the corresponding garment model, and realism and fidelity are maintained through a series of methods.
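The steps in the abstract can be sketched as a pipeline of stubs. Every function here is a hypothetical placeholder (the patent does not disclose its network architecture or model formats); the stubs return fixed stand-in values purely to show the data flow:

```python
def estimate_body_params(photo_pixels):
    """Stand-in for the deep neural network that regresses 3D body
    parameters from a single full-body photo (hypothetical)."""
    return {"height": 1.70, "pose": "standing"}

def fit_standard_body(params):
    """Deform the standard 3D human body model with the estimated
    parameters (stand-in)."""
    return {"model": "standard-body", **params}

def drape_garment(garment_3d, body_model):
    """Match the pre-built 3D garment model onto the fitted body model,
    keeping the target person's pose (stand-in)."""
    return {"garment": garment_3d,
            "on": body_model["model"],
            "pose": body_model["pose"]}

def virtual_fitting(photo_pixels, garment_3d):
    """End-to-end flow from the abstract: photo -> body parameters
    -> fitted body model -> garment matched in the same pose."""
    params = estimate_body_params(photo_pixels)
    body = fit_standard_body(params)
    return drape_garment(garment_3d, body)

result = virtual_fitting(photo_pixels=None, garment_3d="t-shirt")
```

The key design point the abstract emphasizes is the split of work: garment modeling happens offline from 2D pictures, so at fitting time the user supplies only one photo and a style choice.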

Description

Technical field

[0001] The invention belongs to the field of virtual dressing and fitting, in particular human body modeling for virtual dressing, clothing modeling, and the fitting of clothing models to human body models; especially a matching method, system, device, and storage medium for a custom human body model generated by extracting relevant information from photos based on machine learning, together with a three-dimensional clothing model.

Background technique

[0002] With the development of Internet technology, online shopping is becoming more and more popular. Compared with shopping in physical stores, online shopping offers a wide variety of products and greater convenience. However, some problems of online purchasing are not easy to solve, the most important being that the goods cannot be inspected in person before buying. Of all product categories, this issue is most prominent for apparel. Compared with...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T17/20; G06Q30/06; G06N3/08
CPC: G06T17/20; G06Q30/0643; G06N3/08
Inventor: 郑天祥, 闫浩男, 周润楠, 张胜凯, 杨超杰, 张涛
Owner: 北京陌陌信息技术有限公司 (Beijing Momo Information Technology Co., Ltd.)