
Method and system for planning motion of virtual human

A motion planning technology for virtual humans, applied in 3D image processing, image data processing, instruments, and the like; it addresses problems of prior methods such as an inefficient optimization process, failure to converge, and the inability to use different types of motion for model training.

Active Publication Date: 2010-09-22
INST OF COMPUTING TECH CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

However, this reinforcement-learning motion synthesis method based on linear programming has the following disadvantages: (1) it rests on the premise that the value function can be expressed as a linear combination of a set of basis functions; if the basis functions are not chosen properly the method fails to converge, and as the number of basis functions grows the optimization process becomes inefficient. (2) Different types of motion cannot be used together for model training, since they do not share the same constraint frames.

Method used



Examples


Detailed Description of the Embodiments

[0052] The present invention will be described in further detail below in conjunction with the accompanying drawings.

[0053] The process flow of the virtual human motion planning method of the present invention is shown in Figure 1.

[0054] In step S100, the motion data of the virtual human is obtained through a motion capture device.

[0055] Character motion data samples are collected with optical or electromagnetic motion capture devices currently on the market, such as the VICON 8 capture device produced by VICON. The motion capture technology involved in this step is existing technology; for related equipment and techniques, see http://www.vicon.com/.

[0056] Denote the collected motion data sequence as {Motion_i}, i = 1, …, |A|, where |A| is the number of motion segments in the sequence. Each motion segment consists of a set of poses, written {Pose_j^i}, where Pose_j^i denotes the pose of frame j in the i-th motion segment…
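
As a concrete illustration of this data layout, the Python sketch below stores each motion segment as an ordered list of pose frames. It is only an assumed representation: the field names root_position and joint_rotations are not taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Pose:
    """One frame Pose_j^i: root position plus per-joint rotations (illustrative fields)."""
    root_position: Tuple[float, float, float]                  # (x, y, z) of the skeleton root
    joint_rotations: List[Tuple[float, float, float, float]]   # one quaternion per joint

@dataclass
class MotionSegment:
    """One captured segment Motion_i: an ordered sequence of poses."""
    poses: List[Pose]

# The collected sequence {Motion_i}, i = 1..|A|, is then simply a list of segments;
# |A| corresponds to len(motion_data).
motion_data: List[MotionSegment] = []
```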


Abstract

The invention relates to a method and a system for planning the motion of a virtual human. The method comprises the following steps: first, obtaining motion data of the virtual human through motion capture equipment; second, establishing a reinforcement learning model in which states are obtained by sampling the environment and actions are obtained by sampling the motion data, each state-action pair is assigned a one-time reward-punishment value and a cumulative reward-punishment value, and the cumulative reward-punishment values are initialized; third, iteratively updating the cumulative reward-punishment value of each state-action pair using the one-time reward-punishment values and the cumulative reward-punishment values of the actions in the subsequent states; and fourth, for a given state of the virtual human and a set target state, selecting an action from the collected actions according to the cumulative reward-punishment values of the state-action pairs. With the invention, different types of motion segments can be selected as sample data, and time-consuming calculation is avoided during motion synthesis.
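
As a rough, non-authoritative illustration of the reinforcement-learning steps summarized above, the Python sketch below uses a tabular, Q-learning-style update: Q holds the cumulative reward-punishment values, R supplies the one-time (immediate) values, and step models the environment transition. The discount factor, learning rate, and update rule are assumptions; the patent text shown here does not specify them.

```python
import random
from collections import defaultdict

def plan(states, actions, R, step, goal, gamma=0.9, alpha=0.1, episodes=1000):
    """Hypothetical sketch: learn cumulative values for (state, action) pairs,
    then pick actions greedily.  states/actions stand for the sampled environment
    states and motion-data actions; R(s, a) is the one-time reward-punishment
    value; step(s, a) returns the next state; goal is the target state."""
    Q = defaultdict(float)                       # step 2: initialize cumulative values
    for _ in range(episodes):                    # step 3: iterative updating
        s = random.choice(states)
        for _ in range(100):
            a = random.choice(actions)
            s_next = step(s, a)
            best_next = max(Q[(s_next, b)] for b in actions)
            # blend the one-time value with the best cumulative value of the successor state
            Q[(s, a)] += alpha * (R(s, a) + gamma * best_next - Q[(s, a)])
            s = s_next
            if s == goal:
                break

    def select_action(s):                        # step 4: action selection for a given state
        return max(actions, key=lambda a: Q[(s, a)])

    return select_action
```

In an actual motion-planning setting, the states would encode the character's situation in the environment, the actions would be the captured motion segments, and R would reward progress toward the target state; those particulars are deliberately left abstract here.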

Description

Technical Field

[0001] The invention relates to virtual human synthesis technology, and in particular to a virtual human motion planning method and system.

Background Art

[0002] In recent years, the application of virtual human synthesis technology has become increasingly widespread. From film and television animation to advertising production, from online games to sports training, from virtual maintenance to safety rehearsal, many fields involve virtual human synthesis technology. However, as the basis of these applications, how to plan and synthesize virtual human motion with a certain degree of intelligence remains a technical problem to be solved.

[0003] Most traditional virtual character animation is manually edited and generated by experienced animators: the animator uses 3D tool software to set the static posture of the human body at key-frame moments and then generates the intermediate postures between the key frames. Due to the high degree…

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T15/70
Inventors: 李淳芃 (Li Chunpeng), 宗丹 (Zong Dan), 夏时洪 (Xia Shihong), 王兆其 (Wang Zhaoqi)
Owner: INST OF COMPUTING TECH CHINESE ACAD OF SCI