Multi-modal intention reverse active fusion man-machine cooperation method and system

A human-machine collaboration, multi-modal technology, applied in reasoning methods, computer components, character and pattern recognition, etc., which can solve problems such as increased interaction load and achieve the effect of improved accuracy.

Pending Publication Date: 2022-02-25
UNIV OF JINAN

AI Technical Summary

Problems solved by technology

For current service robots, when an elderly user's unclear expression can be interpreted as several different intentions, the robot can determine the final intention only by asking the user about each candidate intention one by one, which greatly increases the interaction load.



Examples


Embodiment 1

[0050] Embodiment 1 of the present invention proposes a human-machine collaboration method for reverse active fusion of multi-modal intentions. The goal of the invention is to correctly understand the intentions expressed by the user and to assign the collaborative interaction tasks accordingly; human-machine collaborative interaction tasks must be completed by the human and the robot together.

[0051] Figure 1 is an overall block diagram of the human-machine collaboration method for reverse active fusion of multi-modal intentions in Embodiment 1 of the present invention. The complete interaction process is divided into six stages: user input, information processing, intention recognition, trust evaluation, task collaborative analysis, and human-robot collaborative interaction.
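To make the ordering of the six stages concrete, the following is a minimal, hypothetical Python sketch of such a pipeline. All function names, data structures, and the example values are illustrative assumptions and are not taken from the patent text.

```python
# Hypothetical sketch of the six interaction stages; names and stub data are illustrative only.

def collect_user_input():
    # Stage 1: user input (voice, gesture, body posture), stubbed with fixed values.
    return {"voice": "I feel unwell", "gesture": "point_at_cabinet", "posture": "seated"}

def process_information(modalities):
    # Stage 2: information processing; here it only normalises the raw strings.
    return {name: value.lower() for name, value in modalities.items()}

def recognize_intentions(processed, knowledge_base):
    # Stage 3: intention recognition; map the direct intention to weighted
    # indirect candidate intentions via a (stub) inference knowledge base.
    direct = "request_help" if "unwell" in processed["voice"] else "unknown"
    return knowledge_base.get(direct, [("clarify", 1.0)])

def evaluate_trust(candidates, threshold=0.6):
    # Stage 4: trust evaluation; accept the most likely intention if it clears a threshold.
    intention, prob = max(candidates, key=lambda c: c[1])
    return intention, prob >= threshold

def analyze_task(intention):
    # Stage 5: task collaborative analysis; split the work between robot and human.
    return {"robot": f"fetch items for '{intention}'", "human": "confirm and take medicine"}

def collaborate(plan):
    # Stage 6: human-robot collaborative interaction, stubbed as printing the plan.
    print(plan)

if __name__ == "__main__":
    kb = {"request_help": [("fetch_medicine", 0.7), ("call_family", 0.3)]}
    processed = process_information(collect_user_input())
    intention, accepted = evaluate_trust(recognize_intentions(processed, kb))
    collaborate(analyze_task(intention) if accepted else {"robot": "ask user to confirm"})
```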

[0052] Figure 2 is a detailed block diagram of the human-machine collaboration method for reverse active fusion of multi-modal intentions in Embodiment 1 of the present invention; the system framework consists...

Embodiment 2

[0086] Based on the human-machine collaboration method for reverse active fusion of multi-modal intentions proposed in Embodiment 1 of the present invention, Embodiment 2 of the present invention further proposes a human-machine collaboration system for reverse active fusion of multi-modal intentions. Figure 5 is a schematic diagram of this system in Embodiment 2; the system includes an acquisition module, an analysis module, an evaluation module, and an assignment module (a sketch of how these modules might fit together is given after paragraph [0088]).

[0087] The acquisition module is used to obtain the user's modal information; the modal information includes voice information, gesture information, and body posture information.

[0088] The analysis module is used to analyze the intention based on the modal information and infer the user's direct intention; from the direct intention, the user's indirect intentions are obtained throug...
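As a rough illustration of how the four modules of Embodiment 2 might be wired together, here is a hypothetical Python skeleton. The class names, method signatures, knowledge-base format, and threshold are assumptions for illustration, not the patent's own implementation.

```python
# Hypothetical skeleton of the acquisition/analysis/evaluation/assignment modules.
from dataclasses import dataclass

@dataclass
class ModalInformation:
    voice: str
    gesture: str
    posture: str

class AcquisitionModule:
    def acquire(self) -> ModalInformation:
        # Stand-in for capturing the three modalities from microphone and camera.
        return ModalInformation(voice="water", gesture="point_at_cup", posture="seated")

class AnalysisModule:
    def __init__(self, knowledge_base):
        # knowledge_base maps a direct intention to weighted indirect intentions.
        self.kb = knowledge_base

    def analyze(self, info: ModalInformation):
        direct = f"wants_{info.voice}"
        candidates = self.kb.get(direct, [("unknown", 1.0)])
        return max(candidates, key=lambda c: c[1])  # most likely indirect intention

class EvaluationModule:
    def evaluate(self, intention, probability, threshold=0.6):
        # Credibility check before the intention is treated as executable.
        return intention if probability >= threshold else None

class AssignmentModule:
    def assign(self, executable_intention):
        return {"robot": f"execute '{executable_intention}'", "user": "confirm result"}

if __name__ == "__main__":
    kb = {"wants_water": [("fetch_water", 0.8), ("wash_cup", 0.2)]}
    info = AcquisitionModule().acquire()
    intention, prob = AnalysisModule(kb).analyze(info)
    executable = EvaluationModule().evaluate(intention, prob)
    if executable:
        print(AssignmentModule().assign(executable))
```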


Abstract

The invention provides a multi-modal intention reverse active fusion man-machine cooperation method and system. The method comprises the steps of: acquiring modal information of a user, wherein the modal information comprises voice, gesture and posture information; performing intention analysis based on the modal information to infer a direct intention of the user, and obtaining indirect intentions of the user from the direct intention through an inference knowledge base; selecting the most probable of the indirect intentions as the real intention; performing credibility evaluation on the modal information of the user under the real intention to obtain an executable intention; and analyzing the executable intention and allocating the cooperative task between the user and the robot. Based on the method, the invention further provides a man-machine cooperation system. The trust degree evaluation integrates a time factor, a historical factor, single-mode information entropy and single-mode recognition credibility, so that wrongly expressed intentions are avoided and the effect of genuine companionship is achieved. An adaptive mechanism takes the user's habits into account as a factor in system decision making, improving the accuracy of intention extraction.
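The abstract names four ingredients of the trust (credibility) evaluation, a time factor, a historical factor, single-mode information entropy, and single-mode recognition credibility, but the page does not show how they are combined. The sketch below is one plausible reading as a weighted sum, written in Python; the weighted-sum form, the weights, and the normalisation are assumptions, not the patent's actual formula.

```python
# Hypothetical trust score combining the four factors named in the abstract.
import math

def information_entropy(probabilities):
    # Shannon entropy of a single modality's class distribution (lower = more certain).
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

def trust_score(time_factor, history_factor, modality_probs, recognition_confidence,
                weights=(0.2, 0.3, 0.2, 0.3)):
    # Normalise entropy to [0, 1] and invert it so that "more certain" scores higher.
    max_entropy = math.log2(len(modality_probs))
    certainty = 1.0 - (information_entropy(modality_probs) / max_entropy if max_entropy else 0.0)
    w_t, w_h, w_e, w_r = weights
    return (w_t * time_factor + w_h * history_factor
            + w_e * certainty + w_r * recognition_confidence)

# Example: recent input, consistent history, peaked distribution, confident recognizer.
score = trust_score(time_factor=0.9, history_factor=0.8,
                    modality_probs=[0.7, 0.2, 0.1], recognition_confidence=0.85)
print(f"trust score = {score:.2f}")  # intention becomes executable if above a threshold
```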

Description

Technical Field

[0001] The invention belongs to the technical field of multi-modal intention fusion, and in particular relates to a human-machine collaboration method and system for reverse active fusion of multi-modal intentions.

Background Technique

[0002] Service robots can bring great convenience to people. While reducing the demand for service-industry personnel, they can give people a higher quality of life. Many robots have now entered the home and become good helpers for the family. However, few service robots are designed with the characteristics of the elderly in mind, and the development of elderly-care robots faces many challenges. A task that is relatively easy for a young person may be difficult for an elderly person affected by various physical declines. For example, a young person who feels unwell can take out the medicine by himself and take it according to the dose, but for the elderly, becau...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06V40/20, G06V40/10, G06K9/62, G06N3/04, G06N5/04, G10L15/22
CPC: G06N5/041, G10L15/22, G06N3/045, G06F18/256, G06F18/295
Inventor: 冯志全, 郎需婕
Owner: UNIV OF JINAN