
Virtual dressing image generation method and system

A virtual dressing image generation technology, applied in the field of virtual dressing images. It addresses problems such as the long production cycle, high computing cost, and remaining technical barriers of 3D clothing modeling, and achieves an optimized fitting image, a small amount of computation, and reduced storage space and network transfer time.

Pending Publication Date: 2018-01-19
Applicant: 深圳市云之梦科技有限公司

AI Technical Summary

Problems solved by technology

However, the production cycle of 3D clothing modeling is long, highly realistic 3D physical simulation and cloth material rendering are computationally expensive, and many technical obstacles remain.


Image

  • Virtual dressing image generation method and system

Examples


Embodiment 1

[0061] The embodiment of the present invention provides off-line preprocessing, including:

[0062] Standard human material processing;

[0063] Pre-defined grid design and production;

[0064] Determination of the conversion relationship between body shape semantic parameters and grid control parameters;

[0065] Clothing image material processing;
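The offline steps above culminate in a fixed conversion from body-shape semantic parameters to grid control parameters. A minimal sketch of what such a conversion could look like, assuming a simple linear model with made-up dimensions and coefficients (the patent does not specify the form of the relationship):

```python
import numpy as np

# Assumed setup: 4 normalized semantic parameters (height, bust, waist, hips)
# are mapped to 3 grid control parameters (torso width, hip width, leg length).
# The matrix values below are purely illustrative.
CONVERSION = np.array([
    [0.1, 0.7, 0.2, 0.0],   # torso width driven mostly by bust
    [0.0, 0.1, 0.2, 0.7],   # hip width driven mostly by hips
    [0.9, 0.0, 0.0, 0.1],   # leg length driven mostly by height
])

def semantic_to_control(semantic_params):
    """Convert body-shape semantic parameters to grid control parameters."""
    s = np.asarray(semantic_params, dtype=float)
    return CONVERSION @ s
```

Because the conversion is determined once offline, the online stage only needs this cheap matrix product per user, which matches the "small calculations" advantage claimed in the summary.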

[0066] In a specific embodiment, a standard human body shape and skin are provided, and the standard human body shape is consistent with the contour projected under the camera parameters used to photograph the clothing on the model.

[0067] In a specific embodiment, the standard human body in this embodiment is rendered with a perspective close to that of the human eye: a camera lens with a focal length of 35 mm, 50 mm, or 85 mm is used, and the camera is placed 2 to 4 meters from the model for shooting.

[0068] In a specific embodiment, the human body skin in this implementation is processed according to the skin of a live model, and the resu...

Example

[0099] In a specific embodiment, Figure 8 shows the deformed mesh under the control parameters.

[0100] In a specific embodiment, the dressed image is rendered: based on the grid data and textures obtained above, the deformed dressed image is generated through GPU rendering or CPU pixel calculation.
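As a rough illustration of the "CPU pixel calculation" path, the sketch below backward-warps a texture through a coarse deformation grid: each grid node carries a displacement, the displacement field is bilinearly interpolated to every pixel, and pixels are resampled from the source. The function names and the nearest-neighbour sampling are assumptions, not the patent's implementation.

```python
import numpy as np

def interpolate_displacement(disp_grid, out_h, out_w):
    """Bilinearly upsample a coarse (gh, gw, 2) displacement grid to (out_h, out_w, 2)."""
    gh, gw, _ = disp_grid.shape
    ys = np.linspace(0, gh - 1, out_h)
    xs = np.linspace(0, gw - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, gh - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, gw - 1)
    wy = (ys - y0)[:, None, None]; wx = (xs - x0)[None, :, None]
    a = disp_grid[y0][:, x0]; b = disp_grid[y0][:, x1]
    c = disp_grid[y1][:, x0]; d = disp_grid[y1][:, x1]
    return (a * (1 - wy) * (1 - wx) + b * (1 - wy) * wx
            + c * wy * (1 - wx) + d * wy * wx)

def render_deformed(texture, disp_grid):
    """Backward-warp `texture` (H, W, C) by the interpolated displacement field."""
    h, w = texture.shape[:2]
    disp = interpolate_displacement(disp_grid, h, w)
    yy, xx = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    # Nearest-neighbour backward mapping; GPU rendering would do the same
    # lookup per fragment with hardware filtering.
    src_y = np.clip(np.round(yy - disp[..., 0]).astype(int), 0, h - 1)
    src_x = np.clip(np.round(xx - disp[..., 1]).astype(int), 0, w - 1)
    return texture[src_y, src_x]
```

With a zero displacement grid this reproduces the input exactly; a constant displacement shifts the whole image, and a non-uniform grid produces the body-shape deformation.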

[0101] In a specific embodiment, after the dressed image is generated, the user can readjust the body shape parameters; in that case the texture is unchanged and only the deformation grid needs to be recalculated. The user can also change the clothing; in that case only the texture needs to be replaced.
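The two update paths in [0101] can be made concrete with a small session object that caches both assets; the class and counter below are illustrative assumptions, kept only to show that each path recomputes exactly one of the two:

```python
class DressingSession:
    """Caches grid and texture so each user action updates only one of them."""

    def __init__(self, clothing_texture, semantic_params):
        self.grid_recomputes = 0          # counter just to make the reuse visible
        self.clothing_texture = clothing_texture
        self.grid = self._compute_grid(semantic_params)

    def _compute_grid(self, semantic_params):
        self.grid_recomputes += 1
        return ("grid-for", tuple(semantic_params))   # placeholder grid

    def set_body_shape(self, semantic_params):
        # Texture untouched; only the deformation grid is recalculated.
        self.grid = self._compute_grid(semantic_params)

    def change_clothing(self, clothing_texture):
        # Grid untouched; only the clothing texture is replaced.
        self.clothing_texture = clothing_texture
```

This separation is what yields the "reduced storage space and network transfer time" effect: changing clothes transfers a new texture but no new geometry, and resizing transfers no new texture at all.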

[0102] In a specific embodiment, Figures 9, 10, 11, and 12 show comparisons of two different garments before and after deformation.

Embodiment 2

[0104] The embodiment of the present invention provides an off-line preprocessing unit, including:

[0105] Standard human body material processing module;

[0106] Predefined grid design and production modules;

[0107] A module for determining the conversion relationship between body shape semantic parameters and grid control parameters;

[0108] Garment image material processing module;
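A structural sketch of how the four modules above could compose into the offline preprocessing unit of Embodiment 2. All names and the lambda stand-ins are assumptions; only the four-module decomposition comes from the text.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class OfflinePreprocessingUnit:
    process_body_material: Callable    # standard human body material processing
    build_predefined_grid: Callable    # predefined grid design and production
    fit_conversion: Callable           # body-shape semantic params -> grid control params
    process_garment_images: Callable   # clothing image material processing

    def run(self, body_raw, garment_raw):
        body = self.process_body_material(body_raw)
        grid = self.build_predefined_grid(body)
        conv = self.fit_conversion(grid)
        garments = self.process_garment_images(garment_raw)
        return {"body": body, "grid": grid, "conversion": conv, "garments": garments}

# Toy wiring with string placeholders standing in for real processing.
unit = OfflinePreprocessingUnit(
    process_body_material=lambda b: f"std({b})",
    build_predefined_grid=lambda b: f"grid({b})",
    fit_conversion=lambda g: f"conv({g})",
    process_garment_images=lambda gs: [f"tex({x})" for x in gs],
)
result = unit.run("model-photo", ["dress", "coat"])
```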

[0109] In a specific embodiment, this embodiment provides a standard human body shape and skin module, and the standard human body shape is consistent with the contour projected by the parameters of the model camera used to shoot clothing.

[0110] In a specific embodiment, the standard human body in this embodiment is rendered with a perspective close to that of the human eye: a camera lens with a focal length of 35 mm, 50 mm, or 85 mm is used, and the camera is placed 2 to 4 meters from the model for shooting.

[0111] In a specific embodiment, the human body skin module in this implementation is obtained according to t...



Abstract

The invention provides a virtual dressing image generation method and system. The method comprises offline preprocessing and online operation. Offline preprocessing comprises the steps of standard body material processing, predefined grid design and production, determination of a conversion relation from body-shape semantic parameters to grid control parameters, and clothing image material processing. Online operation comprises the steps of acquiring user data, calculating the grid control parameters, reading a body image and a clothing image, calculating a deformed grid, and rendering a dressing image. Through the generation method and system, quick body-shape change and clothing change are realized; synchronous deformation of the body image and the clothing image is realized; by modifying the predefined grid model, the fitting image is conveniently controlled and optimized; and one set of parametric grids is applicable to clothes of different shapes and textures. Besides, the method and system are suitable for clothing, shoes, accessories, and images not shot realistically.
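The online stage of the abstract can be sketched as a plain five-step pipeline. Every helper body below is a stand-in (assumption); only the ordering of the steps comes from the text.

```python
def compute_control_params(user_semantic_params):
    # Stand-in for the offline-fitted conversion relation.
    return [2 * p for p in user_semantic_params]

def deform_grid(predefined_grid, control_params):
    # Placeholder deformation: shift each grid node by the first control parameter.
    dx = control_params[0]
    return [(x + dx, y) for (x, y) in predefined_grid]

def render(body_image, clothing_image, grid):
    # Placeholder compositing; real code would warp both images by `grid`
    # so body and clothing deform synchronously.
    return {"layers": [body_image, clothing_image], "grid": grid}

def online_dressing(user_semantic_params, body_image, clothing_image, predefined_grid):
    control = compute_control_params(user_semantic_params)   # acquire + convert
    grid = deform_grid(predefined_grid, control)             # deformed grid
    return render(body_image, clothing_image, grid)          # dressing image
```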

Description

Technical field

[0001] The invention relates to the field of computer graphics, and in particular to a method and system for virtual dressing images.

Background

[0002] With the development of information-processing technologies such as computer graphics, many schemes that realize virtual try-on have been developed. With a virtual try-on system, users do not need to actually put on clothes; they only need to provide their images to the system to see the try-on effect. Such systems have a wide range of applications. For example, designers can use a virtual try-on system to assist clothing design, and with the development of network technology, for ordinary users these systems are especially suitable for online interactive settings such as online shopping and virtual communities.

[0003] The existing virtual try-on can be mainly divided into two technical realization paths: two-dimensional and three...

Claims


Application Information

IPC(8): G06T19/20; G06Q30/06; G06T15/04
Inventor 高学星
Owner 深圳市云之梦科技有限公司