
A three-dimensional human body virtual reconstruction method and device

A virtual reconstruction and human body modeling technology, applied in the field of computer vision, which solves problems such as the inability to restore real-scene human body information and the lack of external texture information for the clothed human body, and achieves a robust effect.

Active Publication Date: 2022-03-11
NAT INNOVATION INST OF DEFENSE TECH PLA ACAD OF MILITARY SCI

AI Technical Summary

Problems solved by technology

Although parametric models can capture the measurements and motion of the human body, they can only generate a naked body: the 3D surface information of clothes, hair and other accessories is completely ignored, and the external texture information of the clothed human body is lacking, so the human body information of the real scene cannot be restored.



Examples


Embodiment 1

[0034] Figure 1 is a flowchart of a three-dimensional human body virtual reconstruction method provided by an embodiment of the present invention.

[0035] As shown in Figure 1, the method of the embodiment of the present invention mainly includes the following steps:

[0036] S1: Use a camera to take a standard T-pose picture of the human body, and input the T-pose picture into the first neural network model to obtain a three-dimensional human body shape model; the first neural network model is trained in advance with a large number of real human body posture images.
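
As a minimal sketch of what step S1 could look like in code, the block below assumes a PyTorch convolutional regressor that maps a single T-pose photo to the vertices of a template body mesh. The architecture, the 6890-vertex template size, and the input resolution are illustrative assumptions, not the patent's actual first neural network model.

```python
# Sketch only: a toy "first neural network" that regresses a 3D body shape
# mesh from one T-pose image. Architecture and mesh size are assumptions.
import torch
import torch.nn as nn

NUM_VERTICES = 6890  # assumed template mesh resolution (SMPL/STAR-like)

class ShapeRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        # simple convolutional encoder: T-pose RGB image -> feature vector
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # decoder head: feature vector -> (x, y, z) for every mesh vertex
        self.head = nn.Linear(64, NUM_VERTICES * 3)

    def forward(self, image):
        feat = self.encoder(image)
        return self.head(feat).view(-1, NUM_VERTICES, 3)

# usage: one T-pose picture in, one set of 3D body mesh vertices out
model = ShapeRegressor()             # in practice, weights pre-trained on real pose images
t_pose_image = torch.rand(1, 3, 512, 512)
vertices = model(t_pose_image)       # shape: (1, 6890, 3)
```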

[0037] In this embodiment, four color cameras are placed at arbitrary positions around the edge of the scene so that they capture four color images of the human body from different perspectives. In other embodiments, a different number of camera devices may be provided according to the required viewing angles. The photographic equipment includes cameras, video cameras ...
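
The multi-camera setup suggests a standard way to recover 3D joint coordinates once the cameras are calibrated: each camera's 2D detection of a joint contributes two linear constraints, and direct linear transformation (DLT) triangulation solves for the 3D point. The sketch below is a generic illustration under that assumption; the projection matrices and pixel coordinates are synthetic, and the patent's own pose-estimation pipeline is not reproduced here.

```python
# Hedged sketch: triangulating one human joint from several calibrated views.
import numpy as np

def triangulate_joint(proj_mats, pixels):
    """proj_mats: list of 3x4 camera projection matrices.
    pixels: list of (u, v) detections of the same joint, one per camera."""
    rows = []
    for P, (u, v) in zip(proj_mats, pixels):
        # each view contributes two linear constraints on the homogeneous 3D point
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.stack(rows)
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # dehomogenize to (x, y, z)

# example with four synthetic cameras observing a point at (0.1, 1.2, 2.0)
X_true = np.array([0.1, 1.2, 2.0, 1.0])
Ps = [np.hstack([np.eye(3), -np.array([[i], [0.0], [0.0]])]) for i in range(4)]
uvs = []
for P in Ps:
    x = P @ X_true
    uvs.append((x[0] / x[2], x[1] / x[2]))
print(triangulate_joint(Ps, uvs))  # approximately [0.1, 1.2, 2.0]
```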

Embodiment 2

[0078] Furthermore, as an implementation of the methods shown in the above embodiments, another embodiment of the present invention provides a three-dimensional human body virtual reconstruction device. This device embodiment corresponds to the foregoing method embodiment; for ease of reading, it does not repeat the details of the foregoing method embodiment one by one, but it should be understood that the device in this embodiment can implement everything in the foregoing method embodiment. The device of this embodiment includes the following modules:

[0079] 1. Human body three-dimensional appearance model acquisition module: uses camera equipment to take a standard T-pose picture of the human body and inputs the T-pose picture into the first neural network model to obtain the human body three-dimensional appearance model; the first neural network model uses a large number of real human body posture images fo...



Abstract

The invention discloses a three-dimensional human body virtual reconstruction method and device, belonging to the technical field of computer vision. The method optimizes a currently commonly used parametric model of the human body (such as STAR) so that it can be bound to the three-dimensional human body shape model constructed by a pre-trained neural network from T-pose photos of the human body. At the same time, the method uses multiple cameras to obtain the multi-view 3D posture of the human body in real time, which is more robust to severely occluded scenes; compared with a single camera, it estimates the 3D joint point coordinates of the human body in the scene more accurately, making the obtained posture parameters more complete and accurate. When these parameters drive the bound 3D human body model, real-time, accurate, multi-view animation effects can be achieved.
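
As a rough illustration of "driving the bound 3D model with pose parameters", the sketch below applies plain linear blend skinning: per-joint rotations are blended by per-vertex skinning weights and applied to the rest-pose mesh. It is a toy version under stated assumptions: the kinematic chain and STAR's pose-corrective terms are omitted, and all array shapes are illustrative.

```python
# Minimal linear blend skinning (LBS) sketch; not the patent's binding scheme.
import numpy as np

def rodrigues(axis_angle):
    """Convert a 3-vector axis-angle rotation into a 3x3 rotation matrix."""
    theta = np.linalg.norm(axis_angle)
    if theta < 1e-8:
        return np.eye(3)
    k = axis_angle / theta
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def lbs(vertices, joints, weights, pose):
    """vertices: (V,3) rest-pose mesh, joints: (J,3) rest joint locations,
    weights: (V,J) skinning weights, pose: (J,3) per-joint axis-angle rotations."""
    V, J = weights.shape
    # build a 4x4 transform per joint (no kinematic chain in this toy version)
    transforms = np.zeros((J, 4, 4))
    for j in range(J):
        R = rodrigues(pose[j])
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = joints[j] - R @ joints[j]  # rotate about the joint location
        transforms[j] = T
    # blend the joint transforms per vertex and apply to homogeneous coordinates
    homo = np.hstack([vertices, np.ones((V, 1))])
    blended = np.einsum('vj,jab->vab', weights, transforms)
    posed = np.einsum('vab,vb->va', blended, homo)
    return posed[:, :3]
```

In a full pipeline of the kind the abstract describes, the per-joint rotations would come from the multi-view pose estimate, while the rest-pose vertices, joints and skinning weights would come from the parametric model bound to the reconstructed body shape.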

Description

Technical field

[0001] The invention relates to the technical field of computer vision, and in particular to a method and device for virtual reconstruction of a three-dimensional human body.

Background technique

[0002] In computer vision, 3D human body reconstruction refers to the process of reconstructing 3D human body information from single-view or multi-view 2D images, and it has broad application prospects in virtual reality. AR (Augmented Reality) technology "seamlessly" integrates real-world and virtual-world information through means such as real-time video display, 3D modeling, real-time tracking and registration, and scene fusion, and realizes remote visual interaction. Applying human body reconstruction technology to AR remote interaction therefore makes it possible to reproduce real human bodies in virtual 3D scenes.

[0003] When performing 3D virtual reconstruction of real people, their 3D body shape and posture are the most im...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T17/00; G06T15/04; G06T7/73; G06T19/00; G06N3/04; G06N3/08
CPC: G06T17/00; G06T15/04; G06T7/73; G06T19/006; G06N3/08; G06T2207/30196; G06T2207/20081; G06N3/045
Inventor: 谢良韩松洁张敬印二威闫慧炯罗治国张亚坤艾勇保闫野
Owner: NAT INNOVATION INST OF DEFENSE TECH PLA ACAD OF MILITARY SCI