Method and device for driving virtual human in real time, electronic equipment and medium

A virtual-human driving technology, applied in the field of virtual human processing, which addresses the problems of a long computing process and poor real-time driving performance (a single action may take an hour or several hours to compute), with the effects of improving real-time performance, improving parallel computing capability, and reducing the amount of computation and data transfer.

Pending Publication Date: 2021-11-23
BEIJING SOGOU TECHNOLOGY DEVELOPMENT CO LTD

AI Technical Summary

Problems solved by technology

Because people are extremely familiar with real humans, it takes a great deal of time to make a 3D static model look truly realistic. Moreover, when the 3D static model is driven to perform actions, even a subtle expression requires remodeling. As a result, modeling involves a large amount of data and computation, and the computing process is long: a single action of the model may take an hour or several hours of computation to realize, so the real-time performance of driving is very poor.



Examples


Embodiment 1

[0090] Refer to Figure 2, which shows a flow chart of the steps of Embodiment 1 of a method for driving a virtual human in real time according to the present invention. The method may specifically include the following steps (an illustrative sketch follows the listed steps):

[0091] S201. Obtain data to be processed for driving the virtual human, where the data to be processed includes at least one of text data and voice data;

[0092] S202. Use an end-to-end model to process the data to be processed, and determine an acoustic feature sequence, a facial feature sequence and a limb feature sequence corresponding to the data to be processed;

[0093] S203. Input the acoustic feature sequence, the facial feature sequence and the limb feature sequence into the trained muscle model, and drive the virtual human through the muscle model;

[0094] Wherein, step S202 includes:

[0095] Step S2021, acquiring the text feature and duration feature of the data to be processed;

[0096] Step S2022. Determine the acoustic feature sequenc...
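Taken together, steps S201 to S203 (with substeps S2021 and S2022) describe a pipeline: extract text and duration features from the input, let the end-to-end model produce the three feature sequences, and feed those sequences to the trained muscle model. The following Python code is a minimal, hypothetical sketch of that flow; the names DrivingInput, EndToEndModel, MuscleModel and their methods are assumptions made for readability, not interfaces defined in this disclosure.

```python
# Hypothetical sketch of the Embodiment 1 pipeline (S201-S203).
# All class and method names are illustrative assumptions.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class DrivingInput:
    """Data to be processed: at least one of text data and voice data (S201)."""
    text: Optional[str] = None
    voice: Optional[bytes] = None


class EndToEndModel:
    """Assumed end-to-end model mapping the input to feature sequences (S202)."""

    def extract_text_and_duration(self, data: DrivingInput):
        # Step S2021: acquire the text feature and duration feature.
        text_feature = [0.0]    # placeholder, e.g. phoneme/character embeddings
        duration_feature = [1]  # placeholder, e.g. per-unit durations in frames
        return text_feature, duration_feature

    def predict_sequences(self, text_feature, duration_feature):
        # Step S2022: determine the acoustic, facial and limb feature sequences.
        acoustic_seq: List[List[float]] = [[0.0]]  # one feature vector per frame
        facial_seq: List[List[float]] = [[0.0]]
        limb_seq: List[List[float]] = [[0.0]]
        return acoustic_seq, facial_seq, limb_seq


class MuscleModel:
    """Assumed trained muscle model that drives the virtual human (S203)."""

    def drive(self, acoustic_seq, facial_seq, limb_seq) -> None:
        # Map each frame's feature vectors to muscle/blend-shape controls.
        for frame in zip(acoustic_seq, facial_seq, limb_seq):
            pass  # apply the frame to the virtual human's rig in real time


def drive_virtual_human(data: DrivingInput, e2e: EndToEndModel, muscle: MuscleModel) -> None:
    text_feature, duration_feature = e2e.extract_text_and_duration(data)            # S2021
    acoustic, facial, limb = e2e.predict_sequences(text_feature, duration_feature)  # S2022
    muscle.drive(acoustic, facial, limb)                                             # S203


drive_virtual_human(DrivingInput(text="hello"), EndToEndModel(), MuscleModel())
```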

Embodiment 2

[0117] Refer to Figure 4, which shows a flow chart of the steps of Embodiment 2 of a method for driving a virtual human in real time according to the present invention. The method may specifically include the following steps (an illustrative sketch follows the listed steps):

[0118] S401. Obtain data to be processed for driving the virtual human, where the data to be processed includes at least one of text data and voice data;

[0119] S402. Use an end-to-end model to process the data to be processed, and determine a fusion feature sequence corresponding to the data to be processed, wherein the fusion feature sequence is obtained by fusing the acoustic feature sequence, the facial feature sequence and the limb feature sequence corresponding to the data to be processed;

[0120] S403. Input the fusion feature sequence into the trained muscle model, and drive the virtual human through the muscle model;

[0121] Wherein, step S402 includes:

[0122] Step S4021, acquiring the text feature and duration feature of the data to be proces...
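Embodiment 2 differs from Embodiment 1 in that the end-to-end model outputs a single fusion feature sequence, so only one sequence is passed to the muscle model. The sketch below illustrates this variant under the same caveats as the previous one; the frame-wise concatenation used for fusion and the FusedMuscleModel class are assumptions, since the text only states that the fusion feature sequence is obtained by fusing the acoustic, facial and limb feature sequences.

```python
# Hypothetical sketch of the Embodiment 2 variant (S401-S403).
# The fusion-by-concatenation and the class name are illustrative assumptions.

from typing import List

Frame = List[float]  # one frame of feature values (assumed representation)


def fuse(acoustic: List[Frame], facial: List[Frame], limb: List[Frame]) -> List[Frame]:
    """S402: fuse the three feature sequences into one fusion feature sequence
    (frame-wise concatenation is assumed here)."""
    return [a + f + l for a, f, l in zip(acoustic, facial, limb)]


class FusedMuscleModel:
    """Assumed trained muscle model that accepts a single fused sequence (S403)."""

    def drive(self, fusion_seq: List[Frame]) -> None:
        for frame in fusion_seq:
            pass  # map the fused frame to muscle controls of the virtual human


# Usage with toy data: a single fused sequence crosses the model boundary
# instead of three separate ones, which is where the reduction in data
# transfer comes from.
fusion_seq = fuse(acoustic=[[0.1, 0.2]], facial=[[0.3]], limb=[[0.4, 0.5]])
FusedMuscleModel().drive(fusion_seq)
```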


Abstract

An embodiment of the invention discloses a method for driving a virtual human in real time. The method comprises: obtaining to-be-processed data for driving the virtual human, wherein the to-be-processed data comprises at least one of text data and voice data; processing the to-be-processed data with an end-to-end model, and determining an acoustic feature sequence, a facial feature sequence and a limb feature sequence corresponding to the to-be-processed data; and inputting the acoustic feature sequence, the facial feature sequence and the limb feature sequence into a trained muscle model, and driving the virtual human through the muscle model. In this way, the acoustic feature sequence, the facial feature sequence and the limb feature sequence can be obtained in a shorter time by the end-to-end model, and the obtained sequences are then input into the muscle model to directly drive the virtual human, so that the amount of computation and the amount of data transmission are greatly reduced, the computing efficiency is improved, and the real-time performance of driving the virtual human is greatly improved.

Description

Technical Field
[0001] The embodiments of this specification relate to the technical field of virtual human processing, and in particular to a method, device, electronic device and medium for driving a virtual human in real time.
Background
[0002] A digital human is a comprehensive rendering technology that uses a computer to simulate a real human; it is also known as a virtual human, hyper-realistic human, or photorealistic human. Because people are extremely familiar with real humans, it takes a great deal of time to make a 3D static model look truly realistic, and when the 3D static model is driven to perform actions, even a subtle expression requires remodeling. As a result, modeling involves a large amount of data and computation, and the computing process is long: a single action of the model may take an hour or several hours of computation to realize, so the real-time performance of driving is very poor. Co...


Application Information

IPC(8): G10L 21/10; G10L 25/30; G06N 3/04; G06N 3/08
CPC: G10L 21/10; G10L 25/30; G06N 3/08; G06N 3/045; G06N 3/04
Inventors: 樊博, 陈伟, 陈曦, 孟凡博, 刘恺, 张克宁, 段文君
Owner: BEIJING SOGOU TECHNOLOGY DEVELOPMENT CO LTD