Multimodal fusion human-computer interaction method, device, storage medium, terminal and system

A multi-modal human-computer interaction technology applied in the computer field. It addresses the problems of rigid interaction and monotonous feedback forms, and achieves more reasonable and humanized feedback, richer feedback forms, and an improved human-computer interaction experience.

Active Publication Date: 2021-07-02
SUZHOU BOZHON ROBOT CO LTD

AI Technical Summary

Problems solved by technology

[0004] However, when current interactive robots interact with humans, the feedback form is monotonous and the interaction is rigid, which needs to be improved.



Examples


Embodiment 2

[0047] Figure 2 is a schematic flowchart of a multi-modal fusion human-computer interaction method provided by Embodiment 2 of the present invention. The method is optimized on the basis of the above embodiment: the preset multi-modal fusion model is divided into multiple sub-models.

[0048] Exemplarily, the preset multi-modal fusion model includes multiple sub-models. Inputting the interaction data into the preset multi-modal fusion model and determining the interaction feedback data of the robot according to its output result includes: extracting, from the interaction data, the sub-sample data corresponding to each of the sub-models; inputting each piece of sub-sample data into its corresponding sub-model to obtain multiple sub-output results; and determining the interaction feedback data of the robot according to the sub-output results. The advantage of this arrangement is that the output of each sub-model is more targeted, and, due to th...
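As one possible reading of this step, the sketch below dispatches each modality's sub-sample data to its own sub-model and then fuses the sub-outputs. All names (`run_submodels`, `fuse`, the weighted-vote fusion rule) are illustrative assumptions, not taken from the patent.

```python
def run_submodels(interaction_data, submodels):
    """Extract the sub-sample data for each sub-model and run each
    sub-model on its own modality, collecting the sub-output results."""
    sub_outputs = {}
    for modality, model in submodels.items():
        # e.g. modality is "audio", "micro_expression", "distance", "posture"
        sub_sample = interaction_data[modality]
        sub_outputs[modality] = model(sub_sample)
    return sub_outputs

def fuse(sub_outputs, weights):
    """Combine the sub-outputs into one feedback decision. A simple
    weighted vote over candidate feedback actions is used here as one
    possible fusion rule (an assumption for illustration)."""
    fused = {}
    for modality, scores in sub_outputs.items():
        w = weights.get(modality, 1.0)
        for action, score in scores.items():
            fused[action] = fused.get(action, 0.0) + w * score
    # Return the feedback action with the highest fused score.
    return max(fused, key=fused.get)
```

With toy sub-models that score candidate actions per modality, `run_submodels` collects per-modality scores and `fuse` picks the overall winner; weighting lets one modality (say, micro-expression) dominate when desired.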

Embodiment 3

[0068] Figure 3 is a schematic flowchart of a multi-modal fusion human-computer interaction method provided by Embodiment 3 of the present invention. The method is optimized on the basis of the above embodiment, with content related to model training added.

[0069] Exemplarily, before acquiring the interaction data corresponding to the target interactive object collected by the robot, the method further includes: obtaining training-sample interaction data collected by the robot, and determining the sample label corresponding to each training sample based on a preset expert system; and inputting the training-sample interaction data and the corresponding sample labels into a preset initial model for training, to obtain the preset multi-modal fusion model. The advantage of this arrangement is that the training sample set can be constructed more reasonably by using the preset expert system.
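The expert-system labeling step could be sketched as follows. The rules, thresholds, and field names here are assumptions for demonstration only; the patent does not specify the expert system's contents.

```python
def expert_label(sample):
    """Assign a feedback label to one training sample using
    hand-written expert rules (illustrative rules, not the patent's)."""
    if sample["distance"] < 0.5:          # target object very close (metres)
        return "step_back"
    if sample["micro_expression"] == "smile":
        return "greet_warmly"
    if sample["audio_volume"] > 0.8:      # loud speech, normalized 0..1
        return "lower_voice"
    return "neutral_reply"

def build_training_set(samples):
    """Pair each collected interaction sample with its expert label,
    yielding (features, label) tuples for training the initial model."""
    return [(s, expert_label(s)) for s in samples]
```

The labeled pairs would then be fed to the preset initial model for supervised training, so the expert rules effectively bootstrap the multi-modal fusion model.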

[0070] Further, the inputting the training sample int...

Embodiment 4

[0090] Figure 7 is a structural block diagram of a multi-modal fusion human-computer interaction device provided in Embodiment 4 of the present invention. The device can be realized by software and/or hardware, can generally be integrated in a terminal, and performs human-computer interaction by executing the multi-modal fusion human-computer interaction method. As shown in Figure 7, the device includes:

[0091] The interaction data acquisition module 701 is configured to acquire the interaction data corresponding to the target interactive object collected by the robot, wherein the interaction data includes audio data, micro-expression data, distance data and posture data;

[0092] An interaction feedback data determination module 702, configured to determine the interaction feedback data of the robot based on preset rules according to the interaction data, wherein the interaction feedback data includes voice feedback data including tone information, mic...



Abstract

The embodiments of the invention disclose a multi-modal fusion human-computer interaction method, device, storage medium, terminal and system. The method includes: acquiring interaction data corresponding to a target interactive object collected by the robot, where the interaction data includes audio data, micro-expression data, distance data and posture data; determining the interaction feedback data of the robot based on preset rules according to the interaction data, where the interaction feedback data includes voice feedback data containing tone information, micro-expression feedback data and action feedback data; and using the interaction feedback data to control the robot to perform the corresponding interactive feedback operations. By adopting the above technical solution, the embodiments of the present invention can make the robot's feedback more reasonable and humanized, enrich the forms of feedback, and help to improve the human-computer interaction experience.
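To make the claimed flow concrete, here is a minimal rule-based sketch mapping one frame of multi-modal interaction data to the three feedback channels. The specific rules and feedback values are illustrative assumptions, not the patent's preset rules.

```python
def determine_feedback(interaction):
    """Map multimodal interaction data (micro-expression, distance,
    audio, posture) to voice feedback with tone information,
    micro-expression feedback, and action feedback."""
    smiling = interaction["micro_expression"] == "smile"
    # Tone of the spoken reply follows the detected micro-expression.
    tone = "cheerful" if smiling else "calm"
    # Choose a gesture based on how far away the target object is.
    action = "wave" if interaction["distance"] > 1.0 else "nod"
    return {
        "voice": {"text": "Hello!", "tone": tone},
        "micro_expression": "smile" if smiling else "neutral",
        "action": action,
    }
```

The returned dictionary is the "interaction feedback data"; a controller would then drive the robot's speaker, face display, and actuators from its three fields.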

Description

Technical Field

[0001] The embodiments of the present invention relate to the field of computer technology, and in particular to a multi-modal fusion human-computer interaction method, device, storage medium, terminal and system.

Background Technique

[0002] A robot is a machine device that performs work automatically. It can accept human commands, run pre-programmed programs, and act according to principles and programs formulated with artificial intelligence technology. Its task is to assist or replace human work, and it can be applied in manufacturing, construction, and other hazardous industries.

[0003] At present, interactive robots that involve human-computer interaction, such as service robots, are emerging as a brand-new industry, and the market and demand for service robots are also growing rapidly. According to the latest market forecast of the International Federation of Robotics, by 2020 the global total market size of service ...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): B25J9/16
CPC: B25J9/16; B25J9/1689
Inventors: 孙骋, 苏衍宇, 孙斌, 张俊杰, 莫明兴
Owner: SUZHOU BOZHON ROBOT CO LTD