
Stereoscopic fitting method based on Kinect feature parameter extraction

A technology based on feature parameter extraction, applied to image data processing, data processing applications, and collaborative operation devices; it addresses the problems of limited garment stock in physical stores, wasted consumer time, and reduced clothing-purchase efficiency.

Active Publication Date: 2017-07-04
NINGBO UNIV

AI Technical Summary

Problems solved by technology

[0003] However, there are also major disadvantages when consumers buy clothing in physical stores: because the number of consumers is large while the number of garments a physical store can display is limited, once multiple consumers want to try on or buy the same garment, each consumer must queue to try it on one by one. This wastes a great deal of the consumers' time, and some consumers leave the store rather than queue, which harms the relationship between clothing stores and consumers. This is another major reason why online clothing shopping is favored by consumers.


Embodiment Construction

[0067] The present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments.

[0068] As shown in figure 2, the three-dimensional fitting method based on somatosensory feature parameter extraction in this embodiment is used with a three-dimensional fitting system composed of a mobile Kinect camera, an RFID scanner, at least one garment carrying an RFID tag, a garment model parameter database, a human body model parameter database, a central processing unit, a touch screen, and a fitting background generator. The Kinect camera belongs to the prior art and is not described in detail here; the three-dimensional fitting system is shown in figure 1. The three-dimensional fitting method comprises the following steps 1 to 8:

[0069] Step 1: Scan the RFID tags on each garment with an RFID scanner to obtain the garment parameters corresponding to each garment, and save the acquired garment parameters into the garment mod...
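Step 1 above can be sketched in Python. The patent names the components (RFID scanner, garment model parameter database) but defines no programming interface, so every class, field, and tag value below is a hypothetical stand-in for illustration only:

```python
# Illustrative sketch of Step 1: read the parameters from each garment's
# RFID tag and store them in the garment model parameter database.
# All names and data here are hypothetical; the patent specifies hardware
# components, not code.

class GarmentModelParameterDatabase:
    """In-memory stand-in for the garment model parameter database."""
    def __init__(self):
        self._records = {}

    def save(self, tag_id, parameters):
        self._records[tag_id] = parameters

    def get(self, tag_id):
        return self._records[tag_id]

def scan_and_store(tags, database):
    """Stand-in for the scanner loop: persist each tag's garment parameters."""
    for tag_id, parameters in tags.items():
        database.save(tag_id, parameters)
    return len(tags)

# Usage with made-up garment parameters:
db = GarmentModelParameterDatabase()
tags = {
    "tag-001": {"size": "M", "chest_cm": 96, "style": "shirt"},
    "tag-002": {"size": "L", "chest_cm": 104, "style": "jacket"},
}
count = scan_and_store(tags, db)
print(count)                      # 2
print(db.get("tag-001")["size"])  # M
```

In a real deployment the dictionary of tags would be replaced by reads from the physical RFID scanner, and the database would be a persistent store rather than an in-memory map.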



Abstract

The invention relates to a stereoscopic fitting method based on Kinect feature parameter extraction. A stereoscopic fitting system is described that includes a mobile Kinect camera, an RFID scanner, at least one garment with an RFID tag, a garment model parameter database, a human body model parameter database, a central processor, a touch display, and a fitting background generator. The method includes the following steps: after a garment fitting model is constructed and an original dynamic human body model is generated, the fitting background generator generates a fitting background; the human body model parameter database generates an actual dynamic human body model matched with a real image of the fitter's face, and a virtual garment fitting model is obtained from the garment model parameter database; the virtual garment fitting model is adaptively sleeved onto the corresponding actual dynamic human body model; and the touch display sets the maximum deformation area and the deformation velocity of the virtual garment fitting model as it is displayed, so that the customer can find a satisfactory garment accordingly.
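The sequence of steps in the abstract can be summarized as a pipeline sketch. Each function below is a hypothetical placeholder for a subsystem the patent names (fitting background generator, human body model parameter database, garment model parameter database, touch display); the parameter names and values are illustrative assumptions, not part of the patent:

```python
# Hedged sketch of the fitting pipeline described in the abstract.
# The patent describes hardware and databases, not code, so every name
# and value below is an illustrative assumption.

def generate_fitting_background(scene="store"):
    # Fitting background generator: produce a backdrop for the try-on view.
    return {"scene": scene}

def build_actual_dynamic_model(face_image, body_params):
    # Human body model parameter database: attach the fitter's real face
    # image to the dynamic human body model.
    return {"face": face_image, "body": body_params}

def build_virtual_garment_model(garment_params):
    # Garment model parameter database: construct the virtual garment model.
    return dict(garment_params)

def fit(garment_model, body_model, max_deform_area, deform_velocity):
    # Adaptively sleeve the garment onto the body model; the touch display
    # sets the maximum deformation area and the deformation velocity.
    return {
        "garment": garment_model,
        "body": body_model,
        "max_deform_area": max_deform_area,
        "deform_velocity": deform_velocity,
    }

background = generate_fitting_background()
body = build_actual_dynamic_model("face.png", {"height_cm": 172})
garment = build_virtual_garment_model({"size": "M", "style": "shirt"})
result = fit(garment, body, max_deform_area=0.15, deform_velocity=0.5)
print(result["max_deform_area"])  # 0.15
```

The point of the sketch is the ordering: background generation, body-model construction, garment-model construction, and finally the adaptive fit with display-controlled deformation limits.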

Description

Technical field

[0001] The invention relates to the field of three-dimensional fitting, in particular to a three-dimensional fitting method based on extraction of somatosensory feature parameters.

Background technique

[0002] When buying clothes in a traditional physical clothing store, people try them on to see whether the clothes match their body shape and whether they are satisfied with the wearing effect. During major holidays, especially in clothing stores in large shopping malls, large numbers of consumers enter to try on clothes and buy garments they are satisfied with.

[0003] However, there are also major disadvantages when consumers buy clothing in physical stores: because the number of consumers is large while the number of garments displayed in a physical store is limited, once multiple consumers want to try on or buy the same garment, every c...

Claims


Application Information

IPC(8): G06Q30/06; G06K17/00; G06T17/00
CPC: G06K17/0029; G06Q30/0643; G06T17/00
Inventors: Zheng Ziwei (郑紫微), Zhao Ting (赵婷), Luo Xulong (骆绪龙), Guo Jianguang (郭建广)
Owner NINGBO UNIV