
Web platform for interactive design, synthesis and delivery of 3D character motion data

A technology for the interactive design and delivery of character motion data, applied in the field of video animation. It addresses the problems that animating a 3D character manually or via motion capture is time consuming and cumbersome, that motion capture requires complex equipment and actors, and that repeating a capture increases cost.

Status: Inactive
Publication Date: 2010-10-14
MIXAMO
Cites: 5 | Cited by: 38

AI Technical Summary

Benefits of technology

[0006]Systems and methods in accordance with embodiments of the invention are described for animating 3D characters using synthetic motion data generated by motion models. These models use pre-defined character motion data to generate new motion data, leveraging analogies between the pre-defined motions and combining one or more of them. The motion data is generated in response to a high level description of a desired sequence of motion provided by an animator. In a number of embodiments, the motion data is generated on a shared server system that utilizes the ability of motion models to generate synthetic motion data across a continuum, enabling multiple animators to effectively reuse the same set of previously recorded motion capture data to produce a wide variety of desired animation sequences. In several embodiments, an animator can upload a custom model of a 3D character to the server, and the synthetic motion data generated by the generative model is retargeted to animate the custom 3D character. In many embodiments, the synthetic motion data is streamed to a rendering engine located on the animator's local computer. In this way, the processing overhead associated with rendering animations of 3D characters using synthetic motion generated by the shared server can be distributed across a number of local computers.
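
As a rough illustration of the architecture described above, the sketch below separates a motion-synthesis step (standing in for the shared server) from a local rendering step that consumes frames as they are streamed. All names (MotionRequest, MotionFrame, synthesize_motion, render_locally) and the placeholder Avar values are hypothetical assumptions; the patent does not specify this code.

"""Minimal sketch of the server/client split described in [0006].
All identifiers are illustrative assumptions, not the patent's implementation."""

from dataclasses import dataclass, field
from typing import Iterator


@dataclass
class MotionRequest:
    """High-level description of a desired motion sequence, as an animator might supply it."""
    action: str                                         # e.g. "run"
    duration_s: float                                   # length of the desired clip
    style_weights: dict = field(default_factory=dict)   # e.g. {"tired": 0.3}


@dataclass
class MotionFrame:
    """One frame of synthetic motion data: a value per animation variable (Avar)."""
    time_s: float
    avar_values: dict


def synthesize_motion(request: MotionRequest, fps: int = 30) -> Iterator[MotionFrame]:
    """Stand-in for the shared server: turns a high-level description into a
    stream of synthetic motion frames, emitted one at a time so they can be
    sent to the client as they are produced."""
    n_frames = int(request.duration_s * fps)
    for i in range(n_frames):
        t = i / fps
        # A real motion model would combine pre-defined motion data here;
        # we just emit placeholder values keyed by the requested action.
        yield MotionFrame(time_s=t, avar_values={"hip_rotation": 0.0, "knee_angle": 0.0})


def render_locally(frames: Iterator[MotionFrame]) -> None:
    """Stand-in for the animator's local rendering engine, which consumes the
    streamed frames and carries the rendering overhead on the local computer."""
    for frame in frames:
        pass  # hand each frame to the local renderer as it arrives


if __name__ == "__main__":
    request = MotionRequest(action="run", duration_s=2.0, style_weights={"tired": 0.3})
    render_locally(synthesize_motion(request))

The point of the split is the one the paragraph ends on: the server only produces motion data, while the rendering cost stays on each animator's machine.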

Problems solved by technology

Animating a 3D character manually or using motion capture can be time consuming and cumbersome.
As discussed above, the manual definition of a character's motion can involve a laborious process of defining and modifying hundreds of motion curves until a desired motion sequence is obtained.
Motion capture requires the use of complex equipment and actors.
In the event that the captured motion is not exactly as desired, the animator is faced with the choice of repeating the motion capture process, which increases cost, or attempting to manually edit the motion curves until the desired motion is obtained, which is difficult.
The inability of animators to rapidly and inexpensively obtain complex motion data for a 3D character can represent a bottleneck for the generation of 3D animations.




Embodiment Construction

[0039]Turning now to the drawings, animation systems and methods for real time interactive generation of synthetic motion data for the animation of 3D characters are illustrated. The term synthetic motion data describes motion data that is generated by a machine. Synthetic motion data is distinct from manually generated motion data, where a human animator defines the motion curve of each Avar, and from actual motion data obtained via motion capture. Animation systems in accordance with many embodiments of the invention are configured to obtain a high level description of a desired motion sequence from an animator and use the high level description to generate synthetic motion data corresponding to the desired motion sequence. Instead of directly editing the motion data, the animator can edit the high level description until synthetic motion data is generated that meets the animator's needs.
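
A minimal sketch of that workflow follows, assuming a toy high-level description and two pre-defined clips reduced to a single Avar curve each; the blending scheme is an illustrative stand-in, not the patent's motion model.

"""Sketch of the edit-the-description workflow in [0039]: the animator changes
the high-level description and regenerates motion, instead of editing curves."""

from dataclasses import dataclass


@dataclass
class HighLevelDescription:
    gait: str          # e.g. "walk" or "run"
    speed: float       # 0.0 (slow walk) .. 1.0 (full run)
    energy: float      # stylistic parameter, 0.0 .. 1.0


# Two pre-defined motion clips, reduced here to a single Avar curve each
# (value of "knee_angle" at frames 0..4). Real clips would cover every Avar.
WALK_CLIP = [10.0, 25.0, 40.0, 25.0, 10.0]
RUN_CLIP = [20.0, 55.0, 80.0, 55.0, 20.0]


def generate_synthetic_motion(desc: HighLevelDescription) -> list[float]:
    """Combine the pre-defined clips according to the high-level description.
    Editing `desc` and re-running this function replaces hand-editing curves."""
    w = max(0.0, min(1.0, desc.speed))
    return [(1.0 - w) * walk + w * run for walk, run in zip(WALK_CLIP, RUN_CLIP)]


if __name__ == "__main__":
    first_try = generate_synthetic_motion(HighLevelDescription("walk", speed=0.2, energy=0.5))
    # Not what the animator wanted: edit the description, not the curves, and regenerate.
    second_try = generate_synthetic_motion(HighLevelDescription("run", speed=0.8, energy=0.5))
    print(first_try)
    print(second_try)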

[0040]In a number of embodiments, the animation system distributes processing between a user's comp...


Abstract

Systems and methods are described for animating 3D characters using synthetic motion data generated by motion models in response to a high level description of a desired sequence of motion provided by an animator. In a number of embodiments, the synthetic motion data is streamed to a user device that includes a rendering engine, and the user device renders an animation of a 3D character using the streamed synthetic motion data. In several embodiments, an animator can upload a custom model of a 3D character, or a custom 3D character is generated by the server system in response to a high level description of a desired 3D character provided by the user, and the synthetic motion data generated by the generative model is retargeted to animate the custom 3D character.
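
The retargeting step mentioned in the abstract can be pictured with the sketch below, which assumes a simple joint-name mapping and bone-length scaling; the skeleton structures and the mapping are hypothetical, and the patent's actual retargeting method is not reproduced here.

"""Sketch of retargeting synthetic motion data onto a custom 3D character."""

from dataclasses import dataclass


@dataclass
class Skeleton:
    """A character rig reduced to joint names and bone lengths."""
    bone_lengths: dict  # joint name -> bone length


# Map each joint of the source rig (used to generate the synthetic motion)
# onto the corresponding joint of the custom character's rig.
JOINT_MAP = {"hip": "pelvis", "knee": "knee_l", "ankle": "foot_l"}


def retarget_frame(frame: dict, source: Skeleton, target: Skeleton) -> dict:
    """Carry joint rotations over unchanged and scale translations by the
    ratio of bone lengths, a common simplification of motion retargeting."""
    out = {}
    for src_joint, value in frame.items():
        tgt_joint = JOINT_MAP.get(src_joint, src_joint)
        scale = target.bone_lengths.get(tgt_joint, 1.0) / source.bone_lengths.get(src_joint, 1.0)
        out[tgt_joint] = {"rotation": value["rotation"], "translation": value["translation"] * scale}
    return out


if __name__ == "__main__":
    source = Skeleton(bone_lengths={"hip": 1.0, "knee": 0.5, "ankle": 0.4})
    custom = Skeleton(bone_lengths={"pelvis": 1.2, "knee_l": 0.6, "foot_l": 0.45})
    frame = {"hip": {"rotation": 12.0, "translation": 0.02},
             "knee": {"rotation": 40.0, "translation": 0.0},
             "ankle": {"rotation": 5.0, "translation": 0.0}}
    print(retarget_frame(frame, source, custom))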

Description

RELATED APPLICATION

[0001]The current application is a continuation-in-part of U.S. patent application Ser. No. 12/370,269, filed Feb. 12, 2009, and claims priority to U.S. Provisional Application No. 61/166,117, filed Apr. 2, 2009, the disclosure of which is incorporated herein by reference.

FIELD OF THE INVENTION

[0002]The present invention relates generally to the generation of video animation and more specifically to the generation of animation using character motion data.

BACKGROUND

[0003]Three dimensional (3D) character animation has seen significant growth in terms of use and diffusion in the entertainment industry in the last decade. In most 3D computer animation systems, an animator defines a set of animation variables, or Avars, that form a simplified representation of a 3D character's anatomy. The Avars are often organized in a hierarchical model and, therefore, the collection of Avars for a 3D character can be referred to as its hierarchical model. Motion of the 3D character can...
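
Since the background leans on the notions of Avars and a hierarchical model, the following sketch shows one common way such a model is represented: a joint tree whose nodes carry named animation variables, with a pose applied by assigning each Avar a value. The rig and all names are invented for illustration and are not taken from the patent.

"""Sketch of a hierarchical model: joints carrying animation variables (Avars)."""

from dataclasses import dataclass, field


@dataclass
class Joint:
    """One node of the hierarchical model: its Avars plus child joints."""
    name: str
    avars: dict = field(default_factory=dict)     # e.g. {"rotate_x": 0.0}
    children: list = field(default_factory=list)  # child Joint objects


def build_simple_rig() -> Joint:
    """A tiny rig: hips at the root, one leg chain below it."""
    ankle = Joint("ankle", {"rotate_x": 0.0})
    knee = Joint("knee", {"rotate_x": 0.0}, [ankle])
    hips = Joint("hips", {"rotate_x": 0.0, "translate_y": 0.0}, [knee])
    return hips


def set_pose(root: Joint, avar_values: dict) -> None:
    """Apply one frame of motion data by assigning each named Avar its value.
    Hundreds of such assignments per frame, keyed by hand over time, are what
    make manual animation laborious and what synthetic motion data replaces."""
    stack = [root]
    while stack:
        joint = stack.pop()
        for avar in joint.avars:
            key = f"{joint.name}.{avar}"
            if key in avar_values:
                joint.avars[avar] = avar_values[key]
        stack.extend(joint.children)


if __name__ == "__main__":
    rig = build_simple_rig()
    set_pose(rig, {"hips.rotate_x": 5.0, "knee.rotate_x": 35.0, "ankle.rotate_x": -10.0})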


Application Information

Patent Type & Authority: Applications (United States)
IPC(8): G06T15/70
CPC: G06T13/40; G06T13/20; G06T2200/16
Inventors: DE AGUIAR, EDILSON; GAMBARETTO, EMILIANO; CORAZZA, STEFANO
Owner: MIXAMO