Human-computer interaction method and intelligent equipment
A technology involving smart devices and interactive objects, applied in the field of the Internet of Things, which addresses problems such as poor user interaction experience and a reduced reach rate of smart-device function services, achieving the effect of improving that reach rate.
Examples
Embodiment 1
[0029] Refer to Figure 1A, which shows a flowchart of the steps of a human-computer interaction method according to Embodiment 1 of the present application.
[0030] The human-computer interaction method of this embodiment includes the following steps:
[0031] Step S102: Obtain the multimodal data collected by the smart device for the interactive object in the spatial environment where the device is located.
[0032] Here, the multimodal data includes at least two of the following types of modal data: voice data, image data, and touch data directed at the smart device.
[0033] Smart devices with multimodal data collection capabilities are usually equipped with a variety of receiving devices or sensors, such as cameras, microphones, touch screens, pressure sensors, distance sensors, and infrared sensors, to ensure effective collection of the multimodal data.
[0034] It should be noted that in the embodiments of the present application, the collected multimodal data of the interactive obj...
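The "at least two modalities" requirement of Step S102 can be sketched as a simple data container with a validity check. This is an illustrative sketch only; the field names, types, and the `is_valid` helper are assumptions not taken from the patent text.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class MultimodalData:
    """Container for modal data collected from an interactive object.

    Per the described method, at least two of voice, image, and touch
    data must be present. All names here are illustrative.
    """
    voice: Optional[bytes] = None   # e.g. raw audio from a microphone
    image: Optional[bytes] = None   # e.g. a camera frame
    touch: Optional[dict] = None    # e.g. touch-screen or pressure events

    def modality_count(self) -> int:
        """Count how many modalities were actually collected."""
        return sum(x is not None for x in (self.voice, self.image, self.touch))

    def is_valid(self) -> bool:
        """True when the at-least-two-modalities condition is met."""
        return self.modality_count() >= 2
```

A device driver could build one of these objects per collection cycle and discard cycles where `is_valid()` is false before further processing.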
Embodiment 2
[0053] Refer to Figure 2A, which shows a flowchart of the steps of a human-computer interaction method according to Embodiment 2 of the present application.
[0054] In the human-computer interaction method of this embodiment, in addition to performing the operations described in Embodiment 1, the smart device can also take the current emotion of the interactive object fully into account when interacting, adopting an interaction style appropriate to that object.
[0055] The human-computer interaction method of this embodiment includes the following steps:
[0056] Step S202: Obtain the multimodal data collected by the smart device for the interactive object in the spatial environment where the device is located.
[0057] In this embodiment, in addition to including at least two of voice data, image data, and touch data directed at the smart device, the multimodal data also includes current emotion data of the interactive object, which can represent the current state of the interacti...
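Embodiment 2's emotion-aware interaction can be sketched as a lookup from a detected emotion label to an interaction style. The labels, style names, and fallback behavior below are assumptions for illustration; the patent does not enumerate them.

```python
# Hypothetical mapping from a detected emotion label to an interaction
# style; neither the labels nor the styles come from the patent text.
EMOTION_TO_STYLE = {
    "happy": "lively",
    "sad": "soothing",
    "angry": "calm",
    "neutral": "standard",
}


def select_interaction_style(emotion: str) -> str:
    """Pick an interaction style matching the object's current emotion,
    falling back to a standard style for unrecognized labels."""
    return EMOTION_TO_STYLE.get(emotion, "standard")
```

In practice the emotion label would be produced by a classifier over the emotion data mentioned in paragraph [0057], and the chosen style would parameterize the device's voice, pacing, and response content.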
Embodiment 3
[0076] Refer to Figure 3A, which shows a flowchart of the steps of a human-computer interaction method according to Embodiment 3 of the present application.
[0077] Unlike the foregoing embodiments, the smart device in this embodiment can adjust its own state to respond according to the motion state of the interactive object and that object's positional relationship with the smart device.
[0078] The human-computer interaction method of this embodiment includes:
[0079] Step S302: Obtain the multimodal data collected by the smart device for the interactive object in the spatial environment where the device is located.
[0080] Here, the multimodal data includes at least two of the following types of modal data: voice data, image data, and touch data directed at the smart device.
[0081] Step S304: Perform behavior detection on the interactive object based on the multimodal data.
[0082] Here, the behavior detection includes at least one of the following: pedestrian detection, distance detection, face...
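Embodiment 3's state adjustment, driven by the object's motion and distance, can be sketched as a small decision function. The thresholds and state names are illustrative assumptions; the patent specifies neither.

```python
def adjust_device_state(distance_m: float, moving_toward: bool) -> str:
    """Decide a response state from the interactive object's detected
    distance and motion direction.

    The 1.0 m and 3.0 m thresholds and the state names are hypothetical
    values chosen for illustration only.
    """
    if distance_m < 1.0:
        return "engage"    # object is close: begin active interaction
    if moving_toward and distance_m < 3.0:
        return "wake"      # object is approaching: wake from standby
    return "standby"       # object is far or moving away: stay idle
```

The inputs would come from the distance and pedestrian detection steps of S304, letting the device wake proactively as a person approaches rather than waiting for explicit input.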