
Single video content-driven clothing 3D animation generation method

A content-driven 3D animation technology for use in animation production, image data processing, instruments, etc. It addresses the difficulty of generating realistic clothing animation, and achieves simple operation, improved fidelity, and a wide application range.

Active Publication Date: 2017-10-20
BEIHANG UNIV

AI Technical Summary

Problems solved by technology

[0005] Cloth animation generation methods based on multi-video 3D reconstruction can produce cloth animations close to real cloth dynamics, but they often require a complex data-acquisition system, and the structure and motion of the captured clothing are relatively simple. Data-driven estimation of cloth simulation parameters can describe the internal mechanical properties of the cloth more accurately and improve the fidelity of the cloth animation to a certain extent; in the real world, however, clothing always deforms under complex external conditions, and it is difficult for such methods to generate animation similar to real clothing movement.
Silhouette-based 3D clothing model construction methods can build a 3D clothing model from a single image, but they can only generate 3D clothing models corresponding to a limited number of human body poses.



Embodiment Construction

[0019] The present invention is further described below in conjunction with the accompanying drawings, so that those of ordinary skill in the art can implement it with reference to this specification.

[0020] See Figure 1 for the overall design framework of the method of the present invention. From the input video images, the method first extracts the clothing boundary in the video and estimates the human body model sequence corresponding to the video content. It then iteratively generates, frame by frame, a 3D clothing animation similar to the video content, where each frame's iteration comprises two steps: generating the initial shape of the 3D clothing animation, and correcting that shape.
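The per-frame loop described above can be sketched as follows. This is a deliberately simplified toy, not the patent's implementation: a garment is a flat list of 2D points, "simulation" is a rigid offset following the body, and "correction" blends each vertex toward a matching video-boundary point. All function names and the blending weight are illustrative assumptions.

```python
# Toy sketch of the frame-by-frame pipeline: per frame, an initial-shape
# step driven by the body motion, followed by a boundary-correction step.

def simulate_initial_shape(garment, body_offset):
    # Stand-in for cloth simulation: move the garment with the body.
    return [(x + body_offset[0], y + body_offset[1]) for x, y in garment]

def correct_shape_to_boundary(garment, boundary, weight=0.5):
    # Pull each vertex partway toward its corresponding boundary point.
    return [((1 - weight) * x + weight * bx, (1 - weight) * y + weight * by)
            for (x, y), (bx, by) in zip(garment, boundary)]

def generate_garment_animation(initial_garment, body_offsets, boundaries):
    garment, animation = initial_garment, []
    for offset, boundary in zip(body_offsets, boundaries):
        garment = simulate_initial_shape(garment, offset)       # step A
        garment = correct_shape_to_boundary(garment, boundary)  # step B
        animation.append(garment)  # each frame seeds the next iteration
    return animation
```

Note how each frame's result becomes the starting shape of the next frame, matching the iterative structure the method describes.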

[0021] Step 1: video clothing boundary extraction and generation of the unclothed human-body 3D model sequence. To extract the video clothing boundary, first use mature segmentation tools to separate the clothing pixel region from the video image, and then extract the clothing boundary...
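As a minimal illustration of the boundary-extraction idea: given a binary garment segmentation mask (as a real segmentation tool would produce), the boundary is the set of garment pixels with at least one background 4-neighbor. Production systems would use proper contour tracing; this sketch only shows the concept, and the function name is an assumption.

```python
# Collect boundary pixels of a binary mask: garment pixels (value 1)
# touching the background or the image border via a 4-neighborhood.

def garment_boundary(mask):
    h, w = len(mask), len(mask[0])
    boundary = []
    for y in range(h):
        for x in range(w):
            if not mask[y][x]:
                continue  # background pixel, skip
            neighbors = [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
            # A garment pixel is on the boundary if any 4-neighbor is
            # outside the image or belongs to the background.
            if any(ny < 0 or ny >= h or nx < 0 or nx >= w or not mask[ny][nx]
                   for ny, nx in neighbors):
                boundary.append((x, y))
    return boundary
```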



Abstract

The invention discloses a 3D clothing animation generation method driven by single-video content. Given a single video of a dressed human body in motion and a corresponding initial 3D clothing model, the method extracts the clothing boundary sequence from the video content and estimates an unclothed human-body 3D model sequence corresponding to the video. On this basis, it iterates frame by frame from the initial 3D clothing model, guided by the clothing boundary sequence and the unclothed human-body 3D model sequence: for each frame, starting from the initial 3D clothing model or the previous frame's generation result, a 3D clothing animation of the current frame is generated, driven by the unclothed human-body 3D model sequence, using a cloth animation generation method based on a non-stretching deformation constraint model. Then, according to the difference between the generated current-frame animation and the clothing boundary of the corresponding video frame, the current frame's 3D clothing animation is deformed, producing a current-frame animation effect similar to the video content.
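The "non-stretching deformation constraint" in the abstract can be illustrated with a generic position-based edge-length projection, a standard technique in cloth simulation: after any deformation, each mesh edge is pulled back toward its rest length by moving both endpoints symmetrically along the edge direction. This is a common-knowledge sketch, not the patent's exact formulation.

```python
# Position-based projection of one edge-length constraint: move both
# endpoints symmetrically so the edge returns to its rest length.
import math

def project_edge(p0, p1, rest_len):
    dx, dy, dz = (p1[i] - p0[i] for i in range(3))
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    if length == 0:
        return p0, p1  # degenerate edge: direction undefined, leave as-is
    # Each endpoint absorbs half of the stretch (length - rest_len).
    corr = 0.5 * (length - rest_len) / length
    p0_new = tuple(p0[i] + corr * d for i, d in enumerate((dx, dy, dz)))
    p1_new = tuple(p1[i] - corr * d for i, d in enumerate((dx, dy, dz)))
    return p0_new, p1_new
```

In a full solver this projection would be applied to every edge of the garment mesh and iterated until the stretch falls below a tolerance, which is what keeps the cloth from elongating unrealistically between frames.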

Description

Technical field

[0001] The invention relates to the field of computer animation generation, and in particular to a 3D clothing animation generation method driven by single-video content.

Background technique

[0002] Virtual reality (VR) takes computer technology as its core and combines related science and technology to generate a digital environment highly similar to a real environment in sight, hearing, and touch. Users interact with and influence objects in this digital environment, producing the feeling and experience of being in the corresponding real environment. With the continuous development of social productivity and of science and technology, the demand for virtual reality technology across industries has grown increasingly strong; research attention has risen accordingly, the technology has made great progress, and it has gradually become a new field of science...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T13/20
Inventors: 陈小武 (Chen Xiaowu), 周彬 (Zhou Bin), 赵沁平 (Zhao Qinping), 郭侃 (Guo Kan), 李发明 (Li Faming), 卢飞翔 (Lu Feixiang)
Owner: BEIHANG UNIV