
Stereoscopic fitting method based on somatosensory feature parameter extraction

A technology based on somatosensory feature parameter extraction, applied in data processing applications, image data processing, collaborative devices, etc.; it can solve the problems of wasted consumer time, reduced clothing-purchase efficiency, and the limited number of garments available for try-on in physical stores.

Active Publication Date: 2020-12-11
NINGBO UNIV
Cites: 5 · Cited by: 0

AI Technical Summary

Problems solved by technology

[0003] However, buying clothing in a physical store also has significant drawbacks: because the number of consumers is huge while the number of garments a physical store can display is limited, whenever several consumers want to try on or buy the same garment, each of them has to queue and try it on one by one. This wastes a great deal of the consumers' time, and some consumers walk out of the store rather than wait in line, which damages the relationship between clothing stores and consumers. This is another major reason why shopping for clothing online is favored by consumers.

Method used



Examples


Embodiment Construction

[0067] The present invention will be further described in detail below with reference to the accompanying drawings.

[0068] As shown in figure 2, the stereoscopic fitting method based on somatosensory feature parameter extraction of this embodiment is carried out by a stereoscopic fitting system composed of at least one garment carrying an RFID tag, a clothing model parameter database, a human body model parameter database, a movable Kinect camera, an RFID scanner, a central processor, a touch display and a fitting background generator; the Kinect camera belongs to the prior art and is not described in detail here. The stereoscopic fitting system is shown in figure 1. The stereoscopic fitting method comprises steps 1 through 8:
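Before the individual steps, the following is a minimal sketch of how the components just listed could be represented in software. Every class and field name here is a hypothetical illustration chosen for clarity, not taken from the patent.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Garment:
    """A garment carrying an RFID tag (hypothetical representation)."""
    rfid_tag: str
    parameters: Dict[str, float] = field(default_factory=dict)  # e.g. chest, waist, length


@dataclass
class ClothingModelParameterDatabase:
    """Stores clothing parameters keyed by RFID tag."""
    records: Dict[str, Dict[str, float]] = field(default_factory=dict)


@dataclass
class HumanBodyModelParameterDatabase:
    """Stores body-model parameters per fitting person."""
    records: Dict[str, Dict[str, float]] = field(default_factory=dict)


@dataclass
class StereoscopicFittingSystem:
    """Groups the parts named in the embodiment: tagged garments and the two
    parameter databases, plus flags standing in for the movable Kinect camera
    and RFID scanner; the central processor, touch display and fitting
    background generator are hardware/software modules outside this sketch."""
    garments: List[Garment] = field(default_factory=list)
    clothing_db: ClothingModelParameterDatabase = field(default_factory=ClothingModelParameterDatabase)
    body_db: HumanBodyModelParameterDatabase = field(default_factory=HumanBodyModelParameterDatabase)
    kinect_available: bool = False
    rfid_scanner_available: bool = False
```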

[0069] Step 1: the RFID scanner scans the RFID tag on the garment to obtain the clothing parameters corresponding to that garment; the acquired clothing parameters are saved to the clothing model parameter database, and the clothing parameters are then genera...
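As a concrete illustration of step 1 as far as it is described above, the sketch below reads a tag, looks up the garment's parameters and writes them into a plain-dict stand-in for the clothing model parameter database. The function and argument names are assumptions for illustration only.

```python
from typing import Callable, Dict


def step1_scan_and_store(
    read_rfid_tag: Callable[[], str],
    garment_catalog: Dict[str, Dict[str, float]],
    clothing_model_db: Dict[str, Dict[str, float]],
) -> Dict[str, float]:
    """Scan the RFID tag on a garment, obtain the clothing parameters that
    correspond to it, and save them to the clothing model parameter database
    (all names are hypothetical; the database is a plain dict here)."""
    tag_id = read_rfid_tag()                      # the RFID scanner reads the tag
    parameters = garment_catalog[tag_id]          # clothing parameters for this garment
    clothing_model_db[tag_id] = dict(parameters)  # persist them into the database
    return parameters


if __name__ == "__main__":
    # Stubbed scanner and catalog, purely for demonstration.
    catalog = {"TAG-001": {"chest_cm": 96.0, "waist_cm": 80.0, "length_cm": 70.0}}
    db: Dict[str, Dict[str, float]] = {}
    step1_scan_and_store(lambda: "TAG-001", catalog, db)
    print(db)
```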



Abstract

The invention relates to a stereoscopic fitting method based on Kinect somatosensory feature parameter extraction. A stereoscopic fitting system is provided that includes a movable Kinect camera, an RFID scanner, at least one garment with an RFID tag, a clothing model parameter database, a human body model parameter database, a central processor, a touch display and a fitting background generator. The method comprises the following steps: after a garment fitting model is constructed and an original dynamic human body model is generated, the fitting background generator generates a fitting background; the human body model parameter database generates an actual dynamic human body model matched with an image of the fitting person's real face, and a virtual garment fitting model is obtained from the clothing model parameter database; the clothing model parameter database adaptively sleeves the virtual garment fitting model onto the corresponding actual dynamic human body model; and the touch display sets the maximum area and the speed of the deformation that the virtual garment fitting model undergoes while it is displayed, so that the customer can choose satisfactory garments accordingly.
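The abstract outlines a sequential flow: build the garment fitting model, generate the body models, attach the fitting person's real face image, adaptively sleeve the virtual garment onto the body model, and bound the garment's deformation area and speed on the touch display. Below is a minimal sketch of that flow as plain functions; the function names, data shapes and numeric values are assumptions, not the patent's implementation.

```python
from typing import Any, Dict


def build_garment_fitting_model(clothing_params: Dict[str, float]) -> Dict[str, Any]:
    """Construct a virtual garment fitting model from stored clothing parameters."""
    return {"kind": "garment", "params": clothing_params}


def build_actual_body_model(body_params: Dict[str, float], face_image: bytes) -> Dict[str, Any]:
    """Generate the actual dynamic human body model with the person's real face image attached."""
    return {"kind": "body", "params": body_params, "face": face_image}


def sleeve_garment_onto_body(
    garment: Dict[str, Any],
    body: Dict[str, Any],
    max_deform_area: float,
    deform_speed: float,
) -> Dict[str, Any]:
    """Adaptively fit the virtual garment model onto the body model, recording
    the limits on how far and how fast the garment may deform on the display."""
    return {
        "garment": garment,
        "body": body,
        "max_deform_area": max_deform_area,
        "deform_speed": deform_speed,
    }


# End-to-end example with placeholder data:
fitted = sleeve_garment_onto_body(
    build_garment_fitting_model({"chest_cm": 96.0, "waist_cm": 80.0}),
    build_actual_body_model({"height_cm": 172.0, "shoulder_cm": 44.0}, face_image=b""),
    max_deform_area=0.05,  # illustrative upper bound on the deformed area
    deform_speed=0.2,      # illustrative deformation speed shown on the touch display
)
print(fitted["max_deform_area"], fitted["deform_speed"])
```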

Description

Technical field

[0001] The present invention relates to stereoscopic fitting, and more particularly to a stereoscopic fitting method based on somatosensory feature parameter extraction.

Background technique

[0002] When buying clothing in a traditional physical store, people try garments on to find out whether the garment they intend to buy fits their size and looks satisfactory when worn. During major holidays in particular, large numbers of consumers enter the clothing stores in shopping malls to try on clothes so as to buy garments whose wearing effect satisfies them.

[0003] However, buying clothing in a physical store also has significant drawbacks: because the number of consumers is huge while the number of garments a physical store can display is limited, whenever several consumers want to try on or buy the same garment at the same time, every consumer has to line up one by one ...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06Q30/06; G06T17/00; G06K17/00
CPC: G06K17/0029; G06Q30/0643; G06T17/00
Inventors: 郑紫微, 赵婷, 骆绪龙, 郭建广
Owner: NINGBO UNIV