Three-dimensional human body virtualization reconstruction method and device

A technology relating to virtualization of the human body, applied in the field of computer vision, which can solve problems such as the inability to restore real-scene human body information and the lack of external texture information for the clothed human body, and achieves a robust effect

Active Publication Date: 2021-09-21
NAT INNOVATION INST OF DEFENSE TECH PLA ACAD OF MILITARY SCI

AI Technical Summary

Problems solved by technology

Although parametric models can capture the measurements and motion of the human body, they can only generate an unclothed body: the 3D surface information of clothes, hair and other accessories is ignored entirely, and the external texture information of the clothed human body is missing, so the human body information present in the real scene cannot be restored.

Method used



Examples


Embodiment 1

[0033] Figure 1 is a flowchart of a three-dimensional human body virtualization reconstruction method provided by an embodiment of the present invention.

[0034] As shown in Figure 1, the method of the embodiment of the present invention mainly includes the following steps:

[0035] S1: Use a camera to take a standard T-pose picture of the human body, and input the T-pose picture into the first neural network model to obtain a three-dimensional human body shape model; the first neural network model is trained in advance on a large number of real human body posture images.
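Purely as an illustration of step S1, the sketch below passes a single T-pose photo through a pre-trained image-to-mesh network to obtain body-shape vertices. The ShapeRegressor class, the 224x224 preprocessing, the 6890-vertex output and the file name tpose.jpg are assumptions made for this sketch, not the patent's actual first neural network model.

```python
# Minimal sketch of step S1, assuming a PyTorch image-to-mesh regressor.
import torch
import torch.nn as nn
from torchvision import transforms
from PIL import Image

class ShapeRegressor(nn.Module):
    """Placeholder for the pre-trained 'first neural network model'."""
    def __init__(self, num_vertices: int = 6890):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(32, num_vertices * 3)  # regress (x, y, z) per vertex

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.head(self.backbone(x))
        return out.view(x.shape[0], -1, 3)

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

model = ShapeRegressor().eval()  # in practice, load the pre-trained weights here
image = preprocess(Image.open("tpose.jpg").convert("RGB")).unsqueeze(0)
with torch.no_grad():
    vertices = model(image)      # (1, 6890, 3) body-shape mesh vertices
```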

[0036] In this embodiment, four color cameras are placed at arbitrary positions around the edge of the scene so that they capture four color images of the human body from different viewpoints. In other embodiments, a different number of camera devices may be provided depending on the viewing angles required. The photographic equipment includes cameras, video cameras ...
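As a hedged sketch of the multi-view capture described above, the snippet below grabs one color frame from each of four cameras with OpenCV. The camera indices 0-3 and the grab_multiview_frames helper are illustrative assumptions; the patent does not specify a capture API or a synchronization scheme.

```python
# Sketch of capturing one frame per camera from four color cameras.
import cv2

CAMERA_IDS = [0, 1, 2, 3]  # four color cameras placed at the edges of the scene

def grab_multiview_frames(camera_ids=CAMERA_IDS):
    """Return one color frame per camera, keyed by camera index."""
    captures = {cid: cv2.VideoCapture(cid) for cid in camera_ids}
    frames = {}
    try:
        for cid, cap in captures.items():
            ok, frame = cap.read()
            if ok:
                frames[cid] = frame  # BGR image of the body from this viewpoint
    finally:
        for cap in captures.values():
            cap.release()
    return frames

views = grab_multiview_frames()
print(f"captured {len(views)} views")
```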

Embodiment 2

[0075] Furthermore, as an implementation of the methods shown in the above embodiments, another embodiment of the present invention provides a three-dimensional human body virtualization reconstruction device. This device embodiment corresponds to the foregoing method embodiment; for ease of reading, it does not repeat every detail of the foregoing method embodiment, but it should be understood that the device of this embodiment can implement everything described in that method embodiment. The device of this embodiment comprises the following modules:

[0076] 1. Module for obtaining the three-dimensional human body appearance model: uses camera equipment to take a standard T-pose picture of the human body and inputs the T-pose picture into the first neural network model to obtain the three-dimensional human body appearance model; the first neural network model is trained in advance on a large number of real human body posture images.
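A minimal sketch of how this module could be organized in code, reusing the hypothetical ShapeRegressor and preprocessing from the step-S1 sketch above; the AppearanceModelModule class name and its reconstruct method are assumptions for illustration, not part of the patent.

```python
# Illustrative wrapper around the pre-trained first neural network model.
import torch
from PIL import Image
from torchvision import transforms

class AppearanceModelModule:
    """Module for obtaining the 3D human body appearance model (step S1)."""
    def __init__(self, model: torch.nn.Module):
        self.model = model.eval()
        self.preprocess = transforms.Compose([
            transforms.Resize((224, 224)),
            transforms.ToTensor(),
        ])

    def reconstruct(self, tpose_image_path: str) -> torch.Tensor:
        """Map a standard T-pose photo to 3D body-shape vertices."""
        image = Image.open(tpose_image_path).convert("RGB")
        batch = self.preprocess(image).unsqueeze(0)
        with torch.no_grad():
            return self.model(batch)  # (1, N, 3) mesh vertices
```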



Abstract

The invention discloses a three-dimensional human body virtualization reconstruction method and device, belonging to the technical field of computer vision. In the method, a commonly used parametric human body model (such as STAR) is optimized and bound to a three-dimensional human body shape model that a pre-trained neural network constructs from a human T-pose photo. Meanwhile, the multi-view three-dimensional posture of the human body is obtained in real time through multiple cameras, which is more robust in severely occluded scenes and estimates the three-dimensional joint coordinates of the human body more accurately than a single camera, so the obtained posture parameters are more complete and accurate; when these posture parameters are used to drive the bound three-dimensional human body model, a real-time, accurate, multi-view animation effect is achieved.
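To make the multi-view pose claim concrete, the sketch below recovers one 3D joint from 2D detections in several calibrated views using standard linear (DLT) triangulation; the projection matrices and 2D points are synthetic placeholders, and the patent's actual multi-camera pose network is not reproduced here.

```python
# Triangulating a 3D joint from multiple calibrated camera views (DLT).
import numpy as np

def triangulate_joint(proj_mats, points_2d):
    """DLT triangulation of one joint seen in several views.

    proj_mats: list of 3x4 camera projection matrices.
    points_2d: list of (u, v) pixel coordinates, one per view.
    Returns the 3D joint position as a length-3 array.
    """
    rows = []
    for P, (u, v) in zip(proj_mats, points_2d):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.stack(rows)
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # dehomogenize

# Example with two synthetic views (identity camera and a translated camera).
P0 = np.hstack([np.eye(3), np.zeros((3, 1))])
P1 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.2, 0.1, 3.0, 1.0])
uv = [(P @ X_true)[:2] / (P @ X_true)[2] for P in (P0, P1)]
print(triangulate_joint([P0, P1], uv))  # approximately [0.2, 0.1, 3.0]
```

With more than two views the same least-squares system simply gains extra rows, which is what makes the multi-camera estimate more robust to occlusion than a single-view one.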

Description

Technical field

[0001] The invention relates to the technical field of computer vision, and in particular to a method and device for virtual reconstruction of a three-dimensional human body.

Background technique

[0002] In computer vision, 3D human body reconstruction refers to the process of reconstructing 3D human body information from single-view or multi-view 2D images, and it has broad application prospects in virtual reality. AR (Augmented Reality) technology "seamlessly" integrates real-world and virtual-world information through means such as real-time video display, 3D modeling, real-time tracking and registration, and scene fusion, realizing remote visual interaction. Applying human body reconstruction technology to AR remote interaction therefore makes it possible to reproduce real human subjects in virtual 3D scenes.

[0003] When performing 3D virtual reconstruction of real people, their 3D body shape and posture are the most important ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T17/00; G06T15/04; G06T7/73; G06T19/00; G06N3/04; G06N3/08
CPC: G06T17/00; G06T15/04; G06T7/73; G06T19/006; G06N3/08; G06T2207/30196; G06T2207/20081; G06N3/045
Inventor: 谢良, 韩松洁, 张敬, 印二威, 闫慧炯, 罗治国, 张亚坤, 艾勇保, 闫野
Owner: NAT INNOVATION INST OF DEFENSE TECH PLA ACAD OF MILITARY SCI