
Dressed human body three-dimensional bare body model calculation method through single Kinect

A three-dimensional bare-body and human-body modeling technology, applied in computing, image data processing, and instruments, addressing the problem that existing methods cannot be fast, economical, and accurate at the same time.

Publication Status: Inactive
Publication Date: 2015-07-22
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

[0003] To address the problem that the existing personalized three-dimensional bare-body model calculation methods described above cannot be simultaneously fast, economical, and accurate, the present invention provides a method for calculating a three-dimensional bare-body model of a dressed human body using a single Kinect.

Method used




Embodiment Construction

[0072] The present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0073] Embodiments of the present invention are as follows:

[0074] Step 1: Set up the experimental equipment as shown in Figure 1. The dressed subject stands about 2 meters from the Kinect and rotates through one full circle while the motion is recorded at about 25 frames per second, yielding roughly 200 frames of RGB-D images and skeleton poses. The depth image, which is a function of pixel coordinates, is transformed into the three-dimensional camera coordinate system (Figure 3(a)), and the bounding-box method is used to separate the human body depth image from the background (Figure 3(b)).
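For illustration only, the sketch below shows the two generic operations this step relies on: back-projecting a depth image into camera-space 3D points and cropping the body with an axis-aligned bounding box. The intrinsic parameters (fx, fy, cx, cy) and the box limits are assumed placeholder values, not figures taken from the patent.

```python
import numpy as np

def depth_to_points(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5):
    """Back-project a depth image (in metres) to an N x 3 point cloud in camera space."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]                        # discard invalid zero-depth pixels

def crop_body(points, box_min=(-1.0, -1.2, 1.0), box_max=(1.0, 1.0, 3.0)):
    """Keep only points inside an axis-aligned box around the subject (~2 m from the camera)."""
    lo, hi = np.asarray(box_min), np.asarray(box_max)
    mask = np.all((points >= lo) & (points <= hi), axis=1)
    return points[mask]
```

Applied to each of the roughly 200 recorded frames, this would yield the per-frame body point clouds used in the later steps.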

[0075] Step 2: Use the implicit-surface method to denoise the human body depth image frame by frame; the denoising results are shown in Figure 4. Since th...
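The implicit-surface denoising itself is not detailed in the visible text. As a generic stand-in, the sketch below removes statistical outliers from each frame's point cloud; the neighbourhood size k and the std_ratio threshold are illustrative assumptions, and this is not the patent's actual denoising formulation.

```python
import numpy as np
from scipy.spatial import cKDTree

def remove_outliers(points, k=16, std_ratio=2.0):
    """Drop points whose mean distance to their k nearest neighbours is unusually large."""
    tree = cKDTree(points)
    dists, _ = tree.query(points, k=k + 1)            # first neighbour is the point itself
    mean_d = dists[:, 1:].mean(axis=1)
    threshold = mean_d.mean() + std_ratio * mean_d.std()
    return points[mean_d < threshold]
```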



Abstract

The invention discloses a method for calculating a three-dimensional bare-body model of a dressed human body using a single Kinect. The method includes calibrating the Kinect camera, capturing RGB-D images and skeleton information of the dressed human body, extracting the human body depth images, and denoising them; dividing the human body into multiple rigid parts, obtaining partial human body depth images by segmentation, obtaining three-dimensional human body shapes by registration and fusion, and obtaining a complete three-dimensional human body shape by matching and merging; acquiring three-dimensional bare-body sample models to construct a matrix, reducing its dimensionality, and obtaining a low-dimensional three-dimensional bare-body sample matrix and a mapping matrix; selecting a template model and aligning it with the initial-frame pose; fitting the template model to the three-dimensional human body shape through corresponding points, projecting it into the low-dimensional space, and multiplying by the mapping matrix to obtain a three-dimensional bare-body model; and iterating until a final three-dimensional bare-body model is obtained. The method solves the problem of over-fitting to the clothing of the dressed human body, so that a personalized three-dimensional bare-body model can be calculated for a dressed human body rapidly, conveniently, and economically with a single Kinect device.
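To make the dimensionality-reduction step of the abstract concrete, here is a minimal PCA-style sketch of building a low-dimensional bare-body shape space from sample models and of projecting a fitted mesh into that space via the mapping matrix. The SVD-based construction, function names, and component count are assumptions for illustration; the patent's exact matrices may be built differently.

```python
import numpy as np

def build_shape_space(samples, n_components=20):
    """samples: (n_models, n_vertices * 3) matrix of flattened bare-body sample meshes."""
    mean = samples.mean(axis=0)
    centered = samples - mean
    # Rows of vt span the sample space; keep the leading components as the mapping matrix.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    mapping = vt[:n_components]                   # (n_components, n_vertices * 3)
    low_dim_samples = centered @ mapping.T        # low-dimensional sample matrix
    return mean, mapping, low_dim_samples

def constrain_to_shape_space(fitted_mesh, mean, mapping):
    """Project a fitted mesh into the low-dimensional space and rebuild it via the mapping matrix."""
    coeffs = (fitted_mesh - mean) @ mapping.T     # projection to the low-dimensional space
    return mean + coeffs @ mapping                # multiply by the mapping matrix to reconstruct
```

In the iterative fitting described above, a projection of this kind would presumably be applied after each correspondence-based fit, keeping the estimate inside the space of plausible bare bodies and thereby avoiding over-fitting to the clothing.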

Description

Technical Field

[0001] The invention relates to a method for calculating a human body model, in particular to a method for calculating a three-dimensional bare-body model of a dressed human body using a single Kinect, which can conveniently compute a true three-dimensional bare-body model for a dressed human body.

Background Technique

[0002] Personalized 3D human body models play a vital role in the field of virtual reality. Film animation, 3D game design, virtual try-on, and 3D clothing design are all closely related to personalized 3D human body models. There is therefore an urgent need to construct personalized 3D human body models quickly, economically, and accurately. Existing 3D human body model construction methods can be roughly divided into the following categories: geometric interaction methods, 3D scanning methods, depth image registration methods, and methods using multiple Kinects to obtain depth images. Among them, the geometric interaction method is divided into t...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/00
CPC: G06T2207/10016; G06T2207/10028; G06T2207/30196
Inventors: 陈光, 李基拓, 曾继平, 王贝, 陆国栋
Owner: ZHEJIANG UNIV