Human-computer interaction method, control device, controlled device and storage medium

A human-computer interaction and control technology, applied to human-computer interaction methods, control devices, controlled devices and storage media, that addresses problems such as control commands being falsely triggered and a voice assistant turning on a TV by mistake.

Active Publication Date: 2020-02-28
SHANGHAI PATEO INTERNET TECH SERVICE CO LTD

AI Technical Summary

Problems solved by technology

[0003] However, current voice assistants trigger voice control accurately only when a wake-up word is present. In the natural-language mode without a wake-up word, they cannot distinguish the intended recipient of the speech, so control commands are easily triggered by mistake. For example, when a user says "watch TV", there are two possible situations: either the user really wants to turn on the TV at home, or the phrase "watch TV" simply occurs while chatting with other people. In the second situation, the voice assistant can easily turn on the TV by mistake.

Method used

Figure 1 is a schematic flowchart of the human-computer interaction method provided by Embodiment 1 of the present invention; Figure 2 is a schematic flowchart of the human-computer interaction method provided by Embodiment 2; Figure 3 is a schematic structural diagram of the control device provided by Embodiment 3.


Examples


Embodiment 1

[0027] Figure 1 is a schematic flowchart of the human-computer interaction method provided in Embodiment 1 of the present invention. For a clear description of the method of Embodiment 1, please refer to Figure 1.

[0028] The human-computer interaction method provided by Embodiment 1 of the present invention includes the following steps:

[0029] S101: Receive a voice signal.

[0030] In one embodiment, the device or apparatus applying the human-computer interaction method provided in this embodiment is in a silence detection state before receiving the voice signal. In this state its power consumption is extremely low, which allows the device or apparatus to remain operational for an extended period of time.

[0031] In an embodiment, step S101 may further include: when the volume of the received voice signal reaches a certain threshold, proceeding to step S102.
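
As a concrete illustration of paragraphs [0030] and [0031], the following minimal Python sketch keeps the device in a cheap silence-detection loop and only hands a frame on to the next step once its volume crosses a threshold. The 16-bit PCM format, the frame handling and the threshold value are illustrative assumptions, not details fixed by the patent.

```python
import array
import math

SAMPLE_WIDTH = 2          # assumed 16-bit little-endian PCM input
VOLUME_THRESHOLD = 500.0  # illustrative RMS level that ends silence detection

def frame_rms(pcm_frame: bytes) -> float:
    """Root-mean-square level of a 16-bit PCM frame."""
    samples = array.array("h", pcm_frame)
    if not samples:
        return 0.0
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def silence_detection_loop(frames):
    """Stay in low-cost silence detection (S101) until a frame crosses the
    volume threshold, then return that frame for the next step (S102)."""
    for frame in frames:
        if frame_rms(frame) >= VOLUME_THRESHOLD:
            return frame
    return None
```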

[0032] S10...

Embodiment 2

[0052] Figure 2 is a schematic flowchart of the human-computer interaction method provided by Embodiment 2 of the present invention. For a clear description of the method of Embodiment 2, please refer to Figure 2.

[0053] The human-computer interaction method provided by Embodiment 2 of the present invention is applied to a control device, and includes the following steps:

[0054] S201: Receive a voice signal.

[0055] S202: Detect the characteristics of the voice signal source.

[0056] Specifically, the feature of the voice signal source may include the face orientation of the user who uttered the voice signal, or the relative orientation between that user and the controlled device; the voice signal source includes, but is not limited to, the user who uttered the voice signal. Specifically, after the voice signal is received, the feature of the voice signal source is detected immed...
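
A minimal sketch of the S202 feature check, under the assumption that the "feature of the voice signal source" is reduced to two quantities: a face yaw angle relative to the controlled device and a flag for whether the user stands in front of it. The camera, face-pose estimation or positioning pipeline that would populate these values is not specified in the patent and is not shown; the tolerance value is illustrative.

```python
from dataclasses import dataclass

FACING_TOLERANCE_DEG = 20.0  # illustrative tolerance for "facing the device"

@dataclass
class SourceFeatures:
    face_yaw_deg: float       # 0.0 means looking straight at the controlled device
    in_front_of_device: bool  # relative orientation between the user and the device

def matches_preset_features(features: SourceFeatures) -> bool:
    """True when the detected features fit the preset ones: the user's face is
    turned toward the front of the device, or the user is located in front of it."""
    facing = abs(features.face_yaw_deg) <= FACING_TOLERANCE_DEG
    return facing or features.in_front_of_device
```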

Embodiment 3

[0082] Figure 3 is a schematic structural diagram of the control device provided by Embodiment 3 of the present invention. For a clear description of the control device 1 of Embodiment 3, please refer to Figure 3.

[0083] Referring to Figure 3, the control device 1 provided by Embodiment 3 of the present invention includes: a voice signal receiving module 101, a feature detection module 102, a wake-up identification module 103 and a voice command acquisition module 104.

[0084] Specifically, the voice signal receiving module 101 is used for receiving voice signals.

[0085] In one embodiment, before the voice signal receiving module 101 receives the voice signal, it is in a silence detection state; the power consumption of the control device 1 is extremely low at this time, which allows the control device 1 to remain operational for a long time.

[0086] Specifically, the feature detect...
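
To make the module structure of paragraph [0083] concrete, here is a hypothetical Python composition of control device 1, with modules 101 to 104 represented as injected callables. This is a sketch of one possible wiring, not an implementation prescribed by the patent.

```python
from typing import Callable, Optional

class ControlDevice:
    """Structural sketch of control device 1: modules 101-104 wired together."""

    def __init__(self,
                 receive_signal: Callable[[], bytes],               # voice signal receiving module 101
                 features_match_preset: Callable[[bytes], bool],    # feature detection module 102
                 contains_wake_word: Callable[[bytes], bool],       # wake-up identification module 103
                 recognize_command: Callable[[bytes], Optional[str]]):  # voice command acquisition module 104
        self.receive_signal = receive_signal
        self.features_match_preset = features_match_preset
        self.contains_wake_word = contains_wake_word
        self.recognize_command = recognize_command

    def handle_once(self) -> Optional[str]:
        """One interaction pass: command recognition runs only after the wake-word
        test or the preset-feature test passes (modules 103 -> 104)."""
        signal = self.receive_signal()
        if self.contains_wake_word(signal) or self.features_match_preset(signal):
            return self.recognize_command(signal)
        return None
```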



Abstract

The invention belongs to the technical field of intelligent control, and relates to a human-computer interaction method, a control device, a controlled device and a storage medium. The human-computer interaction method comprises the steps of: receiving a voice signal; detecting the characteristics of a voice signal source, wherein the characteristics of the voice signal source comprise the face orientation of the user sending out the voice signal or the relative orientation of the user and the controlled device; judging whether the voice signal comprises a wake-up word; if the voice signal comprises the wake-up word, performing voice instruction recognition on the voice signal to acquire a voice instruction; and if the voice signal does not comprise the wake-up word, entering the step of performing voice instruction recognition on the voice signal to acquire the voice instruction when the characteristics of the voice signal source accord with preset characteristics, wherein the preset characteristics comprise that the user's face faces the front of the controlled device or the control device, or that the user is positioned in front of the controlled device. Therefore, false triggering of the controlled device can be effectively avoided, so that the accuracy of the human-computer interaction method is improved.
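
The decision flow stated in the abstract can be restated as a compact, self-contained sketch. The wake-word test and the feature check below are deliberately toy stand-ins (a keyword match and two boolean flags); the wake-up word "hello tv" and the example utterances are illustrative assumptions only.

```python
WAKE_WORDS = ("hello tv",)  # illustrative wake-up word, not from the patent

def contains_wake_word(text: str) -> bool:
    return any(word in text.lower() for word in WAKE_WORDS)

def matches_preset(face_toward_device: bool, user_in_front: bool) -> bool:
    # Preset features: face toward the front of the device, or user in front of it.
    return face_toward_device or user_in_front

def acquire_command(text: str, face_toward_device: bool, user_in_front: bool):
    """Return the utterance as a command only when a wake-up word is present or
    the source features match the preset ones; otherwise ignore it."""
    if contains_wake_word(text) or matches_preset(face_toward_device, user_in_front):
        return text  # stand-in for real voice-command recognition
    return None      # neither condition met: no false trigger

# Example: the chat case from the background does not switch on the TV,
# while the same words spoken while facing the TV do.
assert acquire_command("we were just watching TV yesterday", False, False) is None
assert acquire_command("watch TV", True, False) == "watch TV"
```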

Description

Technical field

[0001] The invention belongs to the technical field of intelligent control, and in particular relates to a human-computer interaction method, a control device, a controlled device and a storage medium.

Background technique

[0002] With the popularity of smart terminals and the emergence of more and more smart devices and smart homes, human-computer interaction has become a core function. With the development of voice recognition technology, more and more smart devices use voice control to realize human-computer interaction. When an existing voice terminal detects a voice control command, it can respond, according to a preset mapping relationship, with the control code corresponding to the detected voice control command; this belongs to the voice assistant function in human-computer interaction. At present, most smart terminals have a voice assistant function and generally require a specific voice input (such as a wake-up word) to complete the trigger...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G10L15/02; G10L15/22; G10L17/22; H04L12/28
CPC: G10L15/02; G10L15/22; G10L17/22; H04L12/282; G10L2015/223; G10L2015/226
Inventor: 郭涛, 杨春阳
Owner: SHANGHAI PATEO INTERNET TECH SERVICE CO LTD