Realistic virtual human multi-modal interaction implementation method based on UE4

An implementation method and virtual human technology, applied in the field of realistic virtual human multi-modal interaction, which can solve problems such as the lack of voice input, the lack of professional-domain or general-purpose question answering, and the inability to give answers and responses, and achieves an effect that is easy to popularize and easy for users to accept.

Pending Publication Date: 2020-09-29
长沙千博信息技术有限公司

AI Technical Summary

Problems solved by technology

[0003] However, most digital virtual humans are presented as cartoon characters and have only simple facial expressions or lip movements, so they cannot be applied in most fields.
Existing game engines, especially the next-generation engines represented by UE4, can render characters and scenes so realistically that the human eye can hardly tell real from fake, but they lack voice input and professional-domain or general-purpose question-answering systems, so they cannot give answers and responses that conform to human behavioral habits.

Embodiment Construction

[0037] Hereinafter, the present invention will be further described with reference to the accompanying drawings.

[0038] Before describing the specific implementation, some terminology needs to be explained:

[0039] UE4 is the abbreviation of Unreal Engine 4. UE4 is currently the world's most well-known and most widely licensed top-tier game engine.

[0040] BlendShape is a vertex-morphing animation technique that is generally used to produce facial expressions.
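In UE4, a BlendShape authored in a tool such as Maya is imported as a morph target on the skeletal mesh and can be driven at runtime. The following is only a minimal sketch of that idea using the engine's USkeletalMeshComponent::SetMorphTarget call; the morph target name "Smile" and the helper function are placeholders, not names taken from the patent.

```cpp
// Minimal sketch: driving one facial BlendShape (imported as a UE4 morph
// target) from C++. "Smile" is a placeholder for whatever BlendShapes the
// character model actually exports.
#include "CoreMinimal.h"
#include "Components/SkeletalMeshComponent.h"

void ApplyExpressionWeight(USkeletalMeshComponent* FaceMesh, float Weight)
{
    if (FaceMesh == nullptr)
    {
        return;
    }

    // BlendShape weights are normally kept in [0, 1]; the renderer then
    // interpolates the mesh vertices toward the deformed shape.
    const float ClampedWeight = FMath::Clamp(Weight, 0.0f, 1.0f);
    FaceMesh->SetMorphTarget(FName(TEXT("Smile")), ClampedWeight);
}
```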

[0041] Maya is a world-leading 3D digital animation and visual effects software application produced by Autodesk.

[0042] Substance is powerful software for producing 3D texture maps.

[0043] UMG is the abbreviation of Unreal Motion Graphics (the Unreal Motion Graphics UI Designer); UMG is the UI production module in the UE4 editor.
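As an illustration only (the function name and widget class are hypothetical, not taken from the patent), a UMG interface built for the virtual human could be instantiated and shown from C++ with the standard CreateWidget and AddToViewport calls:

```cpp
// Minimal sketch: instantiating a UMG widget (e.g. a question-and-answer
// panel) from C++ and overlaying it on the rendered 3D scene.
#include "CoreMinimal.h"
#include "Templates/SubclassOf.h"
#include "Blueprint/UserWidget.h"
#include "Engine/World.h"

UUserWidget* ShowDialoguePanel(UWorld* World, TSubclassOf<UUserWidget> PanelClass)
{
    if (!World || !PanelClass)
    {
        return nullptr;
    }

    // CreateWidget builds the widget tree defined in the UMG asset;
    // AddToViewport draws it on top of the scene.
    UUserWidget* Panel = CreateWidget<UUserWidget>(World, PanelClass);
    if (Panel != nullptr)
    {
        Panel->AddToViewport();
    }
    return Panel;
}
```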

[0044] NVBG is the abbreviation of Non-Verbal Behavior Generator.
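The patent excerpt does not give NVBG code; purely as a hedged illustration of what a non-verbal behavior generator does, the sketch below maps keywords in an answer text to a gesture tag that the animation modules could then play. All rule contents and names are hypothetical.

```cpp
// Illustrative only: a toy non-verbal behavior mapping. A real NVBG uses far
// richer linguistic rules or models than keyword lookup.
#include <map>
#include <string>

std::string SelectGestureTag(const std::string& AnswerText)
{
    // Hypothetical keyword-to-gesture rules.
    static const std::map<std::string, std::string> Rules = {
        {"hello",   "Gesture_Wave"},
        {"sorry",   "Gesture_Shrug"},
        {"welcome", "Gesture_OpenArms"},
    };

    for (const auto& Rule : Rules)
    {
        if (AnswerText.find(Rule.first) != std::string::npos)
        {
            return Rule.second;  // first matching keyword wins
        }
    }
    return "Gesture_Idle";  // neutral fallback when nothing matches
}
```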

[004...

Abstract

The invention discloses a realistic virtual human multi-modal interaction implementation method based on UE4. The method comprises the steps of resource production, resource assembly and function production. The system comprises a resource production module used for character model production, facial expression BlendShape production, skeletal skinning and binding, action production, texture map production and material adjustment; a resource assembly module used for scene building, lighting design and UI interface building; and a function production module used for recognizing the user's voice input, giving intelligent answers according to that input, and playing voice, lip animation, expression animation and body movements, thereby reflecting the multi-modality of the interaction. The function production module specifically comprises a speech recognition module, an intelligent question-and-answer module, a Chinese natural language processing module, a speech synthesis module, a lip animation module, an expression animation module and a body movement module. The system has an affinity similar to that of a real person and is more easily accepted by users; it better matches human interaction habits, so the application can be popularized more widely; the application is therefore truly intelligent, and its responses are more in line with human logic.
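To make the division of labour among the function-production modules easier to follow, here is a minimal sketch of one interaction turn. All interface, class and method names are hypothetical, not taken from the patent; they only illustrate the speech-in, answer, multi-modal-playback flow described in the abstract.

```cpp
// Minimal sketch (hypothetical interfaces): chaining the function-production
// modules listed in the abstract for a single interaction turn.
#include <string>

struct IVirtualHumanModules
{
    virtual std::string RecognizeSpeech(const std::string& AudioPath) = 0;   // speech recognition
    virtual std::string AnswerQuestion(const std::string& QuestionText) = 0; // intelligent Q&A + Chinese NLP
    virtual std::string SynthesizeVoice(const std::string& AnswerText) = 0;  // speech synthesis, returns audio path
    virtual void PlayLipAnimation(const std::string& AudioPath) = 0;         // lip animation
    virtual void PlayExpression(const std::string& AnswerText) = 0;          // expression animation
    virtual void PlayBodyMotion(const std::string& AnswerText) = 0;          // body movement
    virtual ~IVirtualHumanModules() = default;
};

// One interaction turn: the user's voice input drives voice, lip, expression
// and body output together, which is what "multi-modal" refers to here.
void RunInteractionTurn(IVirtualHumanModules& Modules, const std::string& UserAudioPath)
{
    const std::string Question = Modules.RecognizeSpeech(UserAudioPath);
    const std::string Answer   = Modules.AnswerQuestion(Question);
    const std::string Voice    = Modules.SynthesizeVoice(Answer);

    Modules.PlayLipAnimation(Voice);
    Modules.PlayExpression(Answer);
    Modules.PlayBodyMotion(Answer);
}
```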

Description

Technical field

[0001] The invention relates to the field of computer software, in particular to a UE4-based method and system for realizing multi-modal interaction of realistic virtual humans.

Background technique

[0002] Research on digital virtual human technology is currently very active. As the next generation of intelligent human-computer interaction, digital virtual humans have a very wide range of commercial application scenarios, such as virtual idols, virtual actors and virtual anchors, and are being deployed in industries such as games, film and television, financial services, education and medical care.

[0003] However, most digital virtual humans are presented as cartoon characters and have only simple facial expressions or lip movements, so they cannot be applied in most fields. Existing game engines, especially the next-generation engines represented by UE4, can render characters and scenes so realistically that the human eye can hardly tell real from fake, but they lack voice input and professional-domain or general-purpose question-answering systems, so they cannot give answers and responses that conform to human behavioral habits.

Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T13/40
CPC: G06T13/40
Inventors: 郭松睿, 贺志武, 高春鸣
Owner: 长沙千博信息技术有限公司