
Human body model obtaining method and network virtual fitting system based on depth camera

A depth-camera and model-acquisition technology, applied in the field of three-dimensional models, which addresses the problems that the prior art cannot accurately establish a three-dimensional model of the human body and cannot achieve a realistic try-on effect.

Active Publication Date: 2014-08-27
SHENZHEN ORBBEC CO LTD
Cites 7 · Cited by 59

AI Technical Summary

Problems solved by technology

[0003] The technical problem to be solved by the present invention is to provide a depth-camera-based human body model acquisition method and a network virtual fitting system, so as to overcome the shortcomings of the prior art, which cannot accurately establish a three-dimensional model of the human body and cannot achieve a realistic try-on effect.

Method used



Examples


Embodiment Construction

[0061] In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention, not to limit it.

[0062] As shown in Figure 1, the method for obtaining a three-dimensional human body model based on a depth camera provided by the present invention includes the following steps:

[0063] Step S1: Generate marker points that cover the surface of the model human body and are used to determine its surface characteristics; at the same time, collect depth images of the model human body from multiple angles with a depth camera, so as to obtain a sequence of depth images that covers the surface of the model human body and includes the marker poin...
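The patent text publishes no source code, but the per-frame processing it implies (recovering 3D points from each collected depth image before reconstruction and splicing) is conventionally done by back-projecting depth pixels through a pinhole intrinsic model. The following is a minimal sketch under that assumption; the function name, intrinsic values, and depth scale are illustrative and not taken from the patent.

import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy, depth_scale=0.001):
    # Back-project an H x W depth image into an N x 3 point cloud in
    # camera coordinates using a pinhole model. fx, fy, cx, cy are the
    # depth camera intrinsics (assumed values below); depth_scale converts
    # raw units (here assumed millimetres) to metres. Zero depth is invalid.
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.astype(np.float64) * depth_scale
    valid = z > 0
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x[valid], y[valid], z[valid]], axis=-1)

# Illustrative use with a synthetic 480 x 640 frame and made-up intrinsics.
depth = np.full((480, 640), 1500, dtype=np.uint16)            # 1.5 m everywhere
cloud = depth_to_point_cloud(depth, fx=570.0, fy=570.0, cx=320.0, cy=240.0)
print(cloud.shape)                                            # (307200, 3)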



Abstract

The invention relates to a depth-camera-based human body model obtaining method and network virtual fitting system. The method comprises the following steps: step 1, generating marker points covering the surface of a model human body, the marker points being used to determine the surface characteristics of the model human body, and at the same time collecting depth images of the model human body from multiple angles with the depth camera, thereby obtaining a depth image sequence that covers the surface of the model human body and includes the marker points; step 2, performing target depth information point-cloud mesh reconstruction on each frame of the depth image sequence; step 3, splicing the reconstructed depth images into a three-dimensional model of the model human body according to the marker points in the reconstructed depth images. Compared with the prior art, the method and system can obtain an accurate three-dimensional human body model and an accurate three-dimensional garment model, realize virtual fitting on the basis of these models, and thereby achieve a realistic try-on effect.
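The abstract does not spell out how step 3 uses the marker points to splice the per-frame reconstructions; a common realisation is rigid registration from matched marker correspondences. The sketch below assumes the markers have already been matched between two views and estimates the rotation and translation with the SVD-based Kabsch method; the function name and the example marker coordinates are illustrative, not from the patent.

import numpy as np

def rigid_transform_from_markers(src, dst):
    # Least-squares rigid transform (R, t) mapping marker set src (N x 3)
    # onto the corresponding markers dst (N x 3), via the Kabsch/SVD method.
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Example: three markers seen from a second viewpoint rotated 90 degrees about
# the vertical axis and shifted slightly; applying (R, t) brings the second
# view back into the reference frame for splicing.
theta = np.pi / 2
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
markers_ref  = np.array([[0.0, 0.0, 1.0], [0.1, 0.0, 1.0], [0.0, 0.2, 1.1]])
markers_view = markers_ref @ Rz.T + np.array([0.05, -0.02, 0.0])
R, t = rigid_transform_from_markers(markers_view, markers_ref)
aligned = markers_view @ R.T + t                  # approximately markers_ref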

Description

Technical field

[0001] The invention relates to the technical field of three-dimensional models, and in particular to a depth-camera-based method for acquiring a human body model and a network virtual fitting system.

Background technique

[0002] In the clothing industry, a virtual fitting system allows customers to browse the try-on effects of different garments in the shortest possible time; it can also help fashion designers accumulate more fitting-effect material and shorten the clothing design cycle, which is of great significance. However, existing virtual fitting systems lack a realistic try-on effect, and customers cannot determine the right clothing size for their own figure. The human body models contain errors, and the clothing models are not three-dimensional: they lack realistic detail and do not account for real effects such as fabric and folds. The methods used in some virtual fitting systems to obtain 3D models of the human bod...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T17/00, G06T19/00, H04N13/00, H04N13/02
CPC: G06T17/00, G06T2210/16, G06T2207/10016, G06T7/33, G06T3/4038, G06T2200/32, G06T2207/10028, G06T2207/30196, H04N13/257, H04N13/282, G06T7/75, G06T19/20, G06T2200/08, G06T2219/2012
Inventor: 肖振中, 许宏淮, 刘龙, 黄源浩
Owner: SHENZHEN ORBBEC CO LTD