Illumination matching virtual fitting method and device and storage medium

A technology combining virtual fitting and light-intensity matching, applied to neural learning methods, 3D image processing, and image data processing. It addresses problems such as poor control over clothing-model reconstruction time, overly complex calculation methods and principles, and unsuitability for virtual dressing applications, while achieving a high degree of realism.

Pending Publication Date: 2022-03-18
北京陌陌信息技术有限公司 (Beijing Momo Information Technology Co., Ltd.)

AI Technical Summary

Problems solved by technology

However, because the prior method focuses too heavily on high-fidelity restoration of every image frame, the calculation methods and principles it uses are comparatively complicated, and the related computations must run continuously throughout the whole process. As a result, the time required for clothing-model reconstruction is poorly controlled, simulation takes a long time, and the method is unsuitable for virtual fitting applications in Internet scenarios.

Method used




Detailed Description of Embodiments

[0040] The features and exemplary embodiments of various aspects of the present invention are described in detail below. To make the purpose, technical solutions, and advantages of the present invention clearer, the invention is further described in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are intended only to explain the present invention, not to limit it. It will be apparent to those skilled in the art that the present invention may be practiced without some of these specific details. The following description of the embodiments is provided only to give a better understanding of the present invention by showing examples of it.

[0041] It should be noted that, herein, relational terms such as first and second are used only to distinguish one entity or operation from another entity or operation, and d...



Abstract

The invention discloses a virtual fitting method based on illumination matching. The method comprises the following steps: acquiring a two-dimensional image of a target human body; obtaining a semantic segmentation map of the target human-body image; obtaining a two-dimensional image of the garment; matching the illumination intensity and illumination angle of the prefabricated garment to those of the target human-body image; and selecting the matching garment texture map to build a three-dimensional model of the garment. The method preserves the realism and fidelity of the three-dimensional garment model: the model is obtained by processing the two-dimensional garment picture in advance, so the user does not need to take part in any behind-the-scenes work, and the system automatically matches the garment model with the corresponding illumination intensity and illumination angle. The method suits the simplicity and speed expected in the Internet era: the user uploads a single photo to complete all of the virtual clothes-changing work required of them.
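The intensity-matching step described in the abstract can be sketched in miniature as follows. This is an illustrative assumption, not the patent's disclosed implementation: the helper names, the Rec. 601 luminance formula as a brightness proxy, and the idea of tagging pre-rendered textures with a baked-in intensity are all hypothetical.

```python
import numpy as np

def estimate_illumination_intensity(image):
    """Mean luminance of an RGB image with values in [0, 1],
    using Rec. 601 weights as a simple brightness proxy."""
    r, g, b = image[..., 0], image[..., 1], image[..., 2]
    return float(np.mean(0.299 * r + 0.587 * g + 0.114 * b))

def match_texture(body_image, texture_bank):
    """Pick the prefabricated garment texture whose prebaked
    illumination intensity is closest to the intensity estimated
    from the target body photo."""
    target = estimate_illumination_intensity(body_image)
    return min(texture_bank, key=lambda name: abs(texture_bank[name] - target))

# Hypothetical bank of pre-rendered textures tagged with baked-in intensity.
bank = {"dim": 0.2, "neutral": 0.5, "bright": 0.8}
photo = np.full((4, 4, 3), 0.75)   # stand-in for a brightly lit target photo
print(match_texture(photo, bank))  # -> bright
```

A real system would of course estimate intensity per garment region (using the semantic segmentation map the abstract mentions) rather than over the whole photo; the global mean is used here only to keep the sketch short.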

Description

Technical Field

[0001] The invention belongs to the field of virtual dressing and fitting. It relates in particular to the human-body modeling, clothing modeling, and fitting of clothing models to human-body models used in virtual dressing, and especially to calculating and extracting the relevant lighting information from photos of the target human body, and to a virtual fitting method, device, and storage medium that perform matching according to that illumination information.

Background

[0002] With the development of Internet technology, online shopping is becoming more and more popular. Compared with shopping in physical stores, online shopping offers a wide variety of products and convenient purchasing. However, some problems are not easy to solve when buying goods online; the most important is that the goods cannot be inspected on the spot. Of all the product categories, this issue is most prominent for...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T17/20; G06T15/50; G06T15/04; G06Q30/06; G06N3/08; G06N3/04
CPC: G06T17/20; G06T15/506; G06T15/04; G06Q30/0643; G06N3/08; G06N3/045
Inventors: 周润楠, 杨超杰, 张涛, 郑天祥, 张胜凯, 周一凡, 周博生
Owner: 北京陌陌信息技术有限公司 (Beijing Momo Information Technology Co., Ltd.)