
Human-computer interaction method and intelligent equipment

A smart-device and interactive-object technology applied in the field of the Internet of Things. It addresses problems such as poor user interaction experience and a reduced reach rate of smart-device functional services, and achieves the effect of improving that reach rate.

Pending Publication Date: 2022-02-15
ALIBABA (CHINA) CO LTD
View PDF · 9 Cites · 0 Cited by

AI Technical Summary

Problems solved by technology

[0004] However, the existing interaction methods described above are all actively initiated by the user, and the smart device can only interact passively. This severely reduces the reach rate of the functional services that smart devices provide to users, and results in a poor interactive experience.



Examples


Embodiment 1

[0029] Referring to Figure 1A, a flowchart of the steps of a human-computer interaction method according to Embodiment 1 of the present application is shown.

[0030] The human-computer interaction method of the present embodiment includes the following steps:

[0031] Step S102: Obtain multimodal data collected by the smart device for an interactive object in the spatial environment where the device is located.

[0032] The multimodal data includes at least two of the following modalities: voice data, image data, and touch data directed at the smart device.

[0033] Smart devices with multimodal data collection capabilities are usually equipped with a variety of receiving devices or sensors, such as cameras, microphones, touch screens, pressure sensors, distance sensors, and infrared sensors, to ensure effective collection of the multimodal data.
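One round of collection from such sensors can be sketched as a simple container that enforces the "at least two modalities" requirement stated above. This is a minimal illustration, not the patent's implementation; all class and field names are assumptions.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical container for one round of multimodal collection.
@dataclass
class MultimodalFrame:
    voice: Optional[bytes] = None   # microphone audio
    image: Optional[bytes] = None   # camera frame
    touch: Optional[dict] = None    # touch-screen / pressure-sensor events

    def modalities(self) -> list:
        """List which of the three modalities were actually captured."""
        return [name for name in ("voice", "image", "touch")
                if getattr(self, name) is not None]

    def is_valid(self) -> bool:
        # The method requires at least two of the three modalities.
        return len(self.modalities()) >= 2

frame = MultimodalFrame(voice=b"\x00\x01", image=b"\xff\xd8")
print(frame.modalities())  # ['voice', 'image']
print(frame.is_valid())    # True
```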

[0034] It should be noted that in the embodiment of the present application, the collected multimodal data of the interactive obj...
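The flow of Embodiment 1, as summarized in the abstract (multimodal clustering, attribute extraction, then active interaction), can be sketched as follows. All function names, fields, and the greeting logic are illustrative assumptions, not the patent's implementation.

```python
def cluster_by_object(samples):
    """Group modal samples by the object id their detector assigned.

    A stand-in for the multimodal clustering step; a real system would
    cluster feature embeddings rather than trust pre-assigned ids.
    """
    clusters = {}
    for sample in samples:
        clusters.setdefault(sample["object_id"], []).append(sample)
    return clusters

def derive_attributes(cluster):
    """Collapse a cluster into attribute info (here: observed modalities)."""
    return {"modalities": sorted({s["modality"] for s in cluster})}

def proactive_greeting(attributes):
    """Choose a proactive opening line from the observed modalities."""
    if "voice" in attributes["modalities"]:
        return "I heard you - how can I help?"
    return "Hello! I noticed you nearby."

samples = [
    {"object_id": 1, "modality": "image"},
    {"object_id": 1, "modality": "voice"},
    {"object_id": 2, "modality": "image"},
]
for obj_id, cluster in cluster_by_object(samples).items():
    attrs = derive_attributes(cluster)
    print(obj_id, proactive_greeting(attrs))
```

The point of the clustering step is that data from several sensors must first be attributed to the same interactive object before the device can decide how to approach that object.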

Embodiment 2

[0053] Referring to Figure 2A, a flowchart of the steps of a human-computer interaction method according to Embodiment 2 of the present application is shown.

[0054] In the human-computer interaction method of this embodiment, in addition to performing the operations described in Embodiment 1, the smart device also takes the interactive object's current emotion fully into account and adopts an appropriate interaction style when interacting with it.

[0055] The human-computer interaction method of the present embodiment includes the following steps:

[0056] Step S202: Obtain multimodal data collected by the smart device for an interactive object in the spatial environment where the device is located.

[0057] In this embodiment, in addition to at least two of voice data, image data, and touch data directed at the smart device, the multimodal data also includes current emotion data of the interactive object, which can represent the current state of the interacti...
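The emotion-aware behaviour of Embodiment 2 amounts to selecting an interaction style from the detected emotion. A minimal sketch follows; the emotion labels and style names are assumptions for illustration, not from the patent.

```python
# Illustrative mapping from a detected emotion to an interaction style.
STYLE_BY_EMOTION = {
    "happy": "playful",
    "sad": "gentle",
    "angry": "calm",
    "neutral": "standard",
}

def choose_style(emotion: str) -> str:
    # Fall back to a neutral style for emotions we cannot classify.
    return STYLE_BY_EMOTION.get(emotion, "standard")

print(choose_style("sad"))      # gentle
print(choose_style("unknown"))  # standard
```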

Embodiment 3

[0076] Referring to Figure 3A, a flowchart of the steps of a human-computer interaction method according to Embodiment 3 of the present application is shown.

[0077] Unlike the foregoing embodiments, the smart device in this embodiment can adjust its own state in response to the motion state of the interactive object and the object's positional relationship with the smart device.

[0078] The human-computer interaction method of this embodiment includes:

[0079] Step S302: Obtain multimodal data collected by the smart device for an interactive object in the spatial environment where the device is located.

[0080] The multimodal data includes at least two of the following modalities: voice data, image data, and touch data directed at the smart device.

[0081] Step S304: Perform behavior detection of the interactive object based on the multimodal data.

[0082] The behavior detection includes at least one of the following: pedestrian detection, distance detection, face...
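The state adjustment of Embodiment 3 can be sketched as a rule over the detected distance and motion of the interactive object. The thresholds and state names below are illustrative assumptions, not values from the patent.

```python
def next_device_state(distance_m: float, approaching: bool) -> str:
    """Pick a device state from a simple distance/motion rule."""
    if distance_m < 1.0:
        return "active"        # close enough to interact directly
    if approaching and distance_m < 3.0:
        return "wake"          # wake the screen as the person approaches
    return "idle"              # too far away, or moving away

print(next_device_state(0.5, False))  # active
print(next_device_state(2.0, True))   # wake
print(next_device_state(5.0, True))   # idle
```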



Abstract

The invention provides a human-computer interaction method and a smart device. The method comprises: obtaining multimodal data collected by the smart device for an interactive object in the spatial environment where the device is located, wherein the multimodal data comprises at least two of the following modalities: voice data, image data, and touch data directed at the smart device; performing multimodal clustering on the multimodal data, and obtaining attribute information and behavioral modal data of the interactive object from the clustering result; and actively interacting with the interactive object according to the attribute information and the behavioral modal data. Through the embodiments of the invention, the smart device can actively initiate interaction with the interactive object, which improves the reach rate of the functional services that the smart device provides to users and improves the user's interactive experience.

Description

Technical Field

[0001] The embodiments of the present application relate to the technical field of the Internet of Things, and in particular to a human-computer interaction method and a smart device.

Background

[0002] With the development of artificial intelligence and terminal technology, smart devices are increasingly used in people's work and life.

[0003] In most cases, users interact with smart devices through voice, and voice interaction can be regarded as the key, core interaction method for smart devices. The quality of voice interaction therefore directly determines the degree of user participation in interaction with smart devices. Taking smart speakers as an example, on the one hand users can interact with the smart speaker itself through voice; on the other hand, users can also interact with other devices bridged by the smart speaker (such as smart TVs, smart refrigerators, smart air conditioners, etc.)...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/16, G06F3/01, G06K9/62, G06V10/762, G06N3/02, G06F40/35, G06F40/289
CPC: G06F3/167, G06F3/011, G06F3/0488, G06N3/02, G06F40/35, G06F40/289, G06F18/23
Inventor: 朱益, 鲍懋, 钱能锋, 张文杰
Owner ALIBABA (CHINA) CO LTD