
Generation method of human body action editing model, storage medium and electronic device

A human body motion modeling technology, applied in the field of human body action editing model generation, which solves problems such as animation failure and the inability to guarantee smooth, natural motion animation output, and achieves the effects of increasing the reuse value of motion capture data and reducing the amount of human intervention.

Pending Publication Date: 2019-12-03
SHENZHEN UNIV

AI Technical Summary

Problems solved by technology

Errors at any stage of manual data preprocessing can easily cause the final animation to fail, so a smooth and natural motion animation output cannot be guaranteed.



Examples


Example 1

[0082] Example 1: As shown in Figure 9a, the initial action sequence shows the subject moving a 0 kg box from a table to a cabinet. As shown in Figure 9b, when the target object attribute is set to 25 kg, the human action editing model outputs a generated motion sequence of the subject moving the 25 kg box from the table to the cabinet. From the generated motion sequence in Figure 9b, it can be seen that the subject cannot move the box to the cabinet.

Example 2

[0083] Example 2: As shown in Figure 10a, the initial action sequence shows the subject picking up a full cup (the volume of water equals the volume of the cup) and drinking. As shown in Figure 10b, when the target object attribute sets the volume of water in the cup to 0, the human action editing model outputs a generated motion sequence of the subject picking up the empty cup and drinking. From the generated motion sequence in Figure 10b, it can be seen that the subject needs to raise his arm when drinking.

[0084] Based on the above method for generating a human action editing model, the present invention further provides a method for synthesizing human motion based on object attributes, using the human action editing model described in the above embodiment. The method includes:

[0085] Acquiring target object attributes and an initial action sequence, the initial action sequence...
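The synthesis step described above amounts to feeding a new target object attribute, together with the initial action sequence, through the trained editing model. The sketch below illustrates only the shape of that step: the excerpt is truncated and the trained network is not disclosed, so `edit_model` is a purely hypothetical stand-in, as are the sequence shapes.

```python
import numpy as np

def edit_model(frame, attr):
    # Hypothetical stand-in for the trained human action editing model:
    # here it simply damps the motion as the object attribute (e.g. weight) grows.
    return frame / (1.0 + attr)

def synthesize(initial_seq, target_attr):
    # Acquire the target object attribute and the initial action sequence,
    # then produce the edited (generated) action sequence frame by frame.
    return np.stack([edit_model(f, target_attr) for f in initial_seq])

initial_seq = np.ones((4, 3))                        # 4 frames, 3 pose dims (hypothetical)
edited = synthesize(initial_seq, target_attr=1.0)    # e.g. a heavier object
```

In the patent's setting the editing model would be the neural network trained as described in the abstract, applied to whole sequences rather than a toy per-frame function.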



Abstract

The invention discloses a method for generating a human body action editing model, a storage medium, and an electronic device. The method comprises the following steps: taking an initial action sequence, target object attributes, and a target action sequence as training samples; inputting the initial action sequence and the target object attributes into a preset neural network model and obtaining the generated action sequence it outputs; and training the preset neural network model according to the target action sequence and the generated action sequence to obtain a trained human body action editing model. By using person-object interaction sequences as training samples, human motion style transfer is achieved using object attributes alone, which reduces the amount of human intervention during data preprocessing. Applying different attributes of the same object generates correspondingly matched human-object interaction motions, improving the reuse value of motion capture data.
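The three steps in the abstract (assemble a training sample, run the forward pass on the initial sequence plus object attribute, then train against the target sequence) can be sketched as follows. This is a minimal illustration only: the patent does not disclose the network architecture, so a single linear layer trained by gradient descent on an MSE loss stands in for the "preset neural network model", and all shapes, names, and data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
T, D, A = 16, 8, 1            # frames, pose dims, attribute dims (hypothetical)

# Training sample: initial action sequence, target object attribute,
# and the target (edited) action sequence.
initial_seq = rng.normal(size=(T, D))
attribute   = np.ones((T, A))          # e.g. a normalized object weight
target_seq  = initial_seq + 0.1        # stand-in for a captured edited motion

# "Preset neural network model": here just one linear layer acting on
# the concatenation [pose frame | object attribute].
W = np.zeros((D + A, D))

def forward(seq, attr):
    x = np.concatenate([seq, attr], axis=1)   # (T, D + A)
    return x @ W                              # generated action sequence

# Train against the target sequence with plain gradient descent on MSE.
lr = 0.1
x = np.concatenate([initial_seq, attribute], axis=1)
for _ in range(2000):
    err = x @ W - target_seq                  # generated minus target
    W -= lr * (x.T @ err) / T                 # gradient step

loss = float(np.mean((forward(initial_seq, attribute) - target_seq) ** 2))
```

The toy target here is linearly reachable from the inputs, so the loss drives toward zero; the patent's actual model would be a deep sequence network trained over many captured person-object interaction clips.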

Description

Technical field

[0001] The invention relates to the technical field of computer graphics, and in particular to a method for generating a human body action editing model, a storage medium, and electronic equipment.

Background technique

[0002] At present, motion capture technology uses software and hardware systems to record real human movement, obtaining smooth, natural, and detailed character motions and transferring the recorded movement to a virtual character to generate realistic animation. With the growing demand for 3D animated characters in film, television, games, and entertainment, motion capture has become a hot topic in computer graphics research. However, motion capture targets a specific character in a specific environment; when the scene or the creative purpose changes, new motion must be recaptured or the original data readjusted frame by frame. This not only wastes a lot ...


Application Information

IPC(8): G06T13/40, G06N3/08, G06T17/00
CPC: G06T13/40, G06N3/08, G06T17/00, G06N3/045, G06T19/20
Inventors: 黄惠 (Huang Hui), 郑倩 (Zheng Qian), 潘翰廷 (Pan Hanting)
Owner SHENZHEN UNIV