
Multi-modal intention reverse active fusion human-computer interaction method

A human-computer interaction method using multi-modal technology, applied in the field of human-computer interaction. It addresses the problem of low accuracy in recognizing the real intentions of elderly users and achieves the effect of accurate intention recognition.

Pending Publication Date: 2020-12-18
UNIV OF JINAN

AI Technical Summary

Problems solved by technology

[0004] The multi-modal intention reverse active fusion human-computer interaction method provided by the present invention aims to solve the prior-art problem that, because elderly users express themselves ambiguously during human-computer interaction, the accuracy of identifying their true intentions is low.


Image

Smart Image Click on the blue labels to locate them in the text.
Viewing Examples
Smart Image
  • Multi-modal intention reverse active fusion human-computer interaction method
  • Multi-modal intention reverse active fusion human-computer interaction method
  • Multi-modal intention reverse active fusion human-computer interaction method


Embodiment Construction

[0053] It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit the present invention.

[0054] Referring to Figures 1 and 2, the present invention provides a human-computer interaction method for reverse active fusion of multimodal intentions, comprising:

[0055] S100, acquiring environment data, user gesture data, and user voice data. Specifically, environment data in video format and user gesture data are acquired through an RGB-D depth camera, and voice data in audio format is acquired through a microphone. When collecting environment data, the RGB-D depth camera rotates 360° horizontally, and the moment of collection is recorded.
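Step S100 can be sketched as a simple acquisition record. The container type and the device-read callables (`camera_read`, `mic_read`) below are hypothetical stand-ins assumed only for illustration; the patent does not specify this interface:

```python
from dataclasses import dataclass
import time

@dataclass
class MultimodalFrame:
    """One capture of the three input modalities described in step S100."""
    timestamp: float          # moment of collection, recorded with the data
    environment_video: bytes  # video-format environment data (RGB-D camera)
    gesture_depth: bytes      # user gesture data (RGB-D depth stream)
    voice_audio: bytes        # audio-format voice data (microphone)

def acquire_frame(camera_read, mic_read) -> MultimodalFrame:
    """Collect one frame from hypothetical device-read callables.

    `camera_read()` stands in for the RGB-D camera and is assumed to return
    (video_bytes, depth_bytes) after the 360-degree horizontal sweep;
    `mic_read()` is assumed to return raw audio bytes.
    """
    video, depth = camera_read()
    audio = mic_read()
    return MultimodalFrame(time.time(), video, depth, audio)
```

Recording the timestamp alongside the data matters later: the abstract combines the "environment moment" with the other modalities during fusion.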

[0056] S200, performing scene perception on the environment data to obtain environment information. Specifically, referring to Figures 3 and 4, scene perception is performed on the environment...
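A minimal sketch of step S200, under stated assumptions: `classify` is a placeholder for whatever scene-perception model Figures 3 and 4 describe (any callable mapping video bytes to a scene label), and the coarse time-of-day tag is an illustrative way of folding the recorded collection moment into the environment information:

```python
import datetime

def perceive_scene(environment_video: bytes, timestamp: float, classify) -> dict:
    """S200 sketch: derive environment information from environment data.

    `classify` is a hypothetical scene classifier (bytes -> str). The output
    pairs its scene label with a coarse time-of-day tag derived from the
    timestamp recorded at collection time.
    """
    hour = datetime.datetime.fromtimestamp(timestamp).hour
    if 5 <= hour < 12:
        period = "morning"
    elif 12 <= hour < 18:
        period = "afternoon"
    else:
        period = "evening"
    return {"scene": classify(environment_video), "period": period}
```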



Abstract

The invention discloses a multi-modal intention reverse active fusion human-computer interaction method. The method comprises the steps of obtaining environment data, gesture data of a user and voice data of the user; performing scene perception on the environment data to obtain environment information, performing gesture information extraction on the gesture data to obtain a gesture intention, and performing voice information extraction on the voice data to obtain a voice intention; performing multi-modal intention extraction on the environment information, the gesture intention and the voice intention to obtain a fusion intention; carrying out credibility evaluation on the fusion intention to obtain a target fusion intention; and performing interactive feedback according to the target fusion intention. Because the fusion intention is obtained by multi-modal extraction combining the environment moment, the gesture data and the voice data, intention recognition is more accurate, and elderly users can be prevented from repeating a certain intention due to forgetfulness. Whether a fusion intention with a relatively low occurrence probability is really the user's intention is determined through active inquiry, yielding a target fusion intention that reflects the user's intention.
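The abstract's fusion, credibility-evaluation and active-inquiry steps can be sketched as follows. Everything here is an assumption for illustration: `prior_prob` stands in for a model of how likely an intent is in the current environment, `ask_user` stands in for the robot's active inquiry, and the fusion rule itself is a placeholder, not the patent's actual method:

```python
def fuse_and_confirm(env_info, gesture_intent, voice_intent,
                     prior_prob, ask_user, threshold=0.3):
    """Pipeline sketch under stated assumptions:

    - prior_prob(intent, env_info): hypothetical model returning the
      probability of an intent occurring in the current environment;
    - ask_user(intent): actively queries the user, True if confirmed;
    - the fusion rule (keep the shared intent, else fall back to the
      gesture intent) is a placeholder, not the patent's method.
    """
    # Multi-modal intention extraction / fusion (placeholder rule).
    fused = voice_intent if voice_intent == gesture_intent else gesture_intent
    # Credibility evaluation: a low-probability fusion triggers active inquiry.
    if prior_prob(fused, env_info) < threshold and not ask_user(fused):
        return None  # user rejected the guessed intent
    return fused     # target fusion intention
```

For example, `fuse_and_confirm("kitchen", "drink", "drink", lambda i, e: 0.9, lambda i: True)` skips the inquiry entirely because the fused intent is already probable in that environment.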

Description

Technical field

[0001] The present invention relates to the technical field of human-computer interaction, and in particular to a human-computer interaction method for reverse active fusion of multimodal intentions.

Background technique

[0002] According to surveys, 27.1% of the elderly in China live alone or only with their spouses, and this number is still increasing over time, while the supply of elderly-care workers will fall short of demand in future society. It has therefore become an urgent social need to let robots take care of the elderly in place of young people and become the "nannies" of a new era. Few current robotic systems are designed around the characteristics of the elderly; traits such as ambiguous expression and forgetfulness often make it difficult for escort robots to understand the intentions of elderly users.

[0003] Due to the limited education of the older generation or the decline of expressive ability with age, ...

Claims


Application Information

Patent Type & Authority: Applications (China)
IPC(8): G06F3/01; G06K9/00; G10L15/26; G06F40/30; G06N3/04; G06N3/08; G06F16/587
CPC: G06F3/017; G10L15/26; G06F40/30; G06N3/08; G06F16/587; G06V40/28; G06V20/40; G06V20/46; G06N3/045
Inventor: 冯志全, 郎需婕, 郭庆北, 徐涛, 杨晓晖, 范雪, 田京兰
Owner UNIV OF JINAN