
Dressing human body shape estimation method based on deep neural network

A technology combining deep neural networks and human body modeling, applied in the field of neural networks, achieving accurate detection results and expanded application scenarios

Active Publication Date: 2020-02-04
PLEX VR DIGITAL TECH CO LTD
Cites: 4 · Cited by: 6

AI Technical Summary

Problems solved by technology

However, these methods can still only handle fitted or tight clothing, and cannot obtain an accurate human body shape in the case of loose clothing.



Examples


Embodiment Construction

[0020] The following will clearly and completely describe the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without making creative efforts belong to the protection scope of the present invention.

[0021] As an embodiment of the present invention, a method for estimating the body shape of a clothed human body based on a deep neural network includes the following steps:

[0022] 1. Use a high-definition ring camera array composed of 80 4K-resolution cameras to capture static subjects wearing different types of clothing, as shown in Figure 1. The clothing types include loose, close-fitting, and tight-fitting (short-sleeved ...
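The capture step above (truncated in this excerpt) maps naturally onto a simple data-organization layer. Below is a minimal Python sketch, assuming a ring array of 80 cameras and the three clothing categories named in the embodiment; all identifiers (CaptureFrame, organize_session, the directory layout) are hypothetical illustrations, not part of the patent.

# Hypothetical sketch of organizing one multi-view capture session.
# Camera count and clothing categories follow the embodiment; names and
# file layout are illustrative assumptions.
from dataclasses import dataclass
from pathlib import Path
from typing import List

NUM_CAMERAS = 80          # ring array of 4K cameras described in the embodiment
CLOTHING_TYPES = ("loose", "close-fitting", "tight-fitting")

@dataclass
class CaptureFrame:
    subject_id: str
    clothing_type: str     # one of CLOTHING_TYPES
    camera_id: int         # 0 .. NUM_CAMERAS - 1
    image_path: Path

def organize_session(root: Path, subject_id: str, clothing_type: str) -> List[CaptureFrame]:
    """Collect the 80 synchronized views of one static, clothed subject."""
    assert clothing_type in CLOTHING_TYPES
    frames = []
    for cam in range(NUM_CAMERAS):
        frames.append(CaptureFrame(
            subject_id=subject_id,
            clothing_type=clothing_type,
            camera_id=cam,
            image_path=root / subject_id / clothing_type / f"cam_{cam:02d}.png",
        ))
    return frames

A record structure like this keeps each frame tagged with its clothing category, which is the label the later training stage would need.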



Abstract

The invention discloses a method for estimating the body shape of a clothed human body based on a deep neural network. The method comprises the steps of constructing a three-dimensional clothing mesh model database covering a rich variety of clothing types, and training, on this database, a deep neural network model that predicts how well a human body mesh fits the clothing. The method solves the problem of estimating the user's real body shape under complex clothing; it comprehensively considers clothing types, mesh geometric structure, and human body poses, and provides accurate detection results for complex body motions and clothing types. The method is no longer limited to mesh-sequence input: a three-dimensional model of a single human body can also be used as input, which expands the application scenarios of body shape estimation.
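The abstract describes a network that jointly considers clothing type, mesh geometry, and body pose to recover the underlying body shape. The following is a minimal PyTorch sketch of such an architecture, under the assumption that the output is a low-dimensional vector of shape coefficients (e.g. SMPL-style betas); the encoders, dimensions, and the class name ClothedShapeEstimator are illustrative assumptions, not the patent's actual model.

# Minimal sketch: regress body shape coefficients from a clothed mesh,
# a clothing-type label, and a pose vector. All dimensions are assumed.
import torch
import torch.nn as nn

class ClothedShapeEstimator(nn.Module):
    def __init__(self, num_vertices=6890, num_clothing_types=3,
                 pose_dim=72, shape_dim=10):
        super().__init__()
        # Encode the clothed mesh from its flattened vertex coordinates
        # (a PointNet- or graph-based encoder would be a natural alternative).
        self.mesh_encoder = nn.Sequential(
            nn.Linear(num_vertices * 3, 1024), nn.ReLU(),
            nn.Linear(1024, 256), nn.ReLU(),
        )
        self.clothing_embed = nn.Embedding(num_clothing_types, 16)
        self.pose_encoder = nn.Sequential(nn.Linear(pose_dim, 64), nn.ReLU())
        # Fuse all cues and regress the shape coefficients.
        self.head = nn.Sequential(
            nn.Linear(256 + 16 + 64, 128), nn.ReLU(),
            nn.Linear(128, shape_dim),
        )

    def forward(self, vertices, clothing_type, pose):
        # vertices: (B, num_vertices, 3); clothing_type: (B,); pose: (B, pose_dim)
        m = self.mesh_encoder(vertices.flatten(1))
        c = self.clothing_embed(clothing_type)
        p = self.pose_encoder(pose)
        return self.head(torch.cat([m, c, p], dim=1))

# Usage with random tensors, just to show the expected shapes.
model = ClothedShapeEstimator()
betas = model(torch.randn(2, 6890, 3), torch.tensor([0, 2]), torch.randn(2, 72))
print(betas.shape)  # torch.Size([2, 10])

Feeding the clothing-type embedding alongside the mesh encoding is one simple way to let the network condition its fit prediction on how loose the garment is, which is the core idea the abstract emphasizes.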

Description

Technical Field

[0001] The invention relates to a neural network method, and in particular to a method for estimating the body shape of a clothed human body based on a deep neural network.

Background

[0002] With the popularity of 3D scanners and the emergence of mobile 3D scanning sensors based on structured light and ToF (Time of Flight), the ways of acquiring 3D images with depth information have expanded, and 3D human body models have become increasingly common. More typically, high-quality 3D human models can be obtained with Kinect-style depth cameras and ring color camera systems (domes). However, almost all existing methods perform 3D reconstruction without considering human clothing, or, more precisely, without considering clothing fit. In practice, there are obvious differences between the geometry of the human body and the geometry of the clothes worn on it. Referring to the sizing specifications of clothing manufacturers, clothing can be ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T17/20; G06N3/04
CPC: G06T17/20; G06T2200/08; G06N3/045
Inventors: 陈欣, 庞安琪, 张哿, 王培豪, 张迎梁
Owner: PLEX VR DIGITAL TECH CO LTD