
Human-computer interaction method based on user behaviors and sound box

A human-computer interaction technology based on user behavior, applied to the input/output of user/computer interaction, computer components, mechanical mode conversion, etc. It can avoid aesthetic fatigue, enrich the expressions or actions a virtual character can present, and improve the human-computer interaction experience.

Pending Publication Date: 2020-05-19
GUANGDONG XIAOTIANCAI TECH CO LTD

AI Technical Summary

Problems solved by technology

However, the expressions or actions that virtual characters in traditional smart devices can present are all set by default when the device leaves the factory, so they are limited and monotonous, which degrades the user's human-computer interaction experience.

Method used

figure 1 is a schematic flowchart of a user-behavior-based human-computer interaction method; figure 2 is a schematic flowchart of another user-behavior-based human-computer interaction method; figure 3 is a schematic structural diagram of a sound box.


Examples


Embodiment 1

[0069] Referring to figure 1, figure 1 is a schematic flowchart of a user-behavior-based human-computer interaction method disclosed in an embodiment of the present invention. As shown in figure 1, the user-behavior-based human-computer interaction method may include the following steps:

[0070] 101. Collect the user expression and/or user action made by the user.

[0071] In the embodiment of the present invention, the execution subject of the user-behavior-based human-computer interaction method may be a sound box, a control center that communicates with the sound box, a tablet computer, or a learning machine. These examples are provided for description only and should not be construed as limiting the embodiments of the present invention.

[0072] It should be noted that the sound box disclosed in the embodiment of the present invention may include a speaker module, a camera module, a display screen, a light projection module...
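
As a rough illustration of step 101, the sketch below shows how a sound box's camera module might capture frames and detect a face region for later expression analysis. This is a minimal sketch only, assuming OpenCV is available; the function name, parameters, and the Haar-cascade detection approach are illustrative assumptions, not taken from the patent.

# Hypothetical sketch of step 101: grab frames from the camera module and
# keep those containing a detected face, for later expression analysis.
import cv2

def collect_user_frames(camera_index: int = 0, max_frames: int = 30):
    """Capture frames and return those in which a face was detected."""
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    capture = cv2.VideoCapture(camera_index)
    frames_with_faces = []
    try:
        for _ in range(max_frames):
            ok, frame = capture.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
            if len(faces) > 0:
                frames_with_faces.append(frame)
    finally:
        capture.release()
    return frames_with_faces

In practice the retained frames would be passed to an expression- or action-recognition model; the patent text does not commit to any particular detector or model.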

Embodiment 2

[0094] Referring to figure 2, figure 2 is a schematic flowchart of another user-behavior-based human-computer interaction method disclosed in an embodiment of the present invention. As shown in figure 2, the user-behavior-based human-computer interaction method may include the following steps:

[0095] 201. Collect the user expression and/or user action made by the user.

[0096] 202. Determine the virtual character that the user likes, and generate a virtual expression and/or virtual action for the virtual character, where the virtual expression and/or virtual action matches the user expression and/or user action.

[0097] 203. When it is detected that the user makes a certain target user expression and/or target user action, obtain, through matching, a target virtual expression and/or target virtual action that matches the target user expression and/or target user action.

[0098] 204. Determine whether the target virtual expression and/or the target virtual action is a request-for-help expression or a request for ...
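
The sketch below illustrates one way steps 202-204 could be realized: a lookup table maps a recognized user behavior to a virtual behavior for the chosen virtual character, and a second check flags behaviors treated as help requests. The table contents, the HELP_REQUESTS set, and all names here are hypothetical; the patent does not specify the matching mechanism.

# Illustrative mapping from recognized user expressions/actions to the
# virtual character's expressions/actions (steps 202-203), plus the
# help-request check of step 204. All entries are assumed examples.
VIRTUAL_BEHAVIOR_TABLE = {
    "smile": "smile_back",
    "wave": "wave_back",
    "frown": "comforting_look",
    "raise_hand": "ask_what_is_wrong",
}
HELP_REQUESTS = {"ask_what_is_wrong"}  # virtual behaviors treated as help requests

def match_virtual_behavior(user_behavior: str) -> tuple[str, bool]:
    """Return the matched virtual behavior and whether it signals a help request."""
    virtual_behavior = VIRTUAL_BEHAVIOR_TABLE.get(user_behavior, "idle")
    return virtual_behavior, virtual_behavior in HELP_REQUESTS

# Example: match_virtual_behavior("raise_hand") -> ("ask_what_is_wrong", True)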

Embodiment 3

[0118] Referring to figure 3, figure 3 is a schematic structural diagram of a sound box disclosed in an embodiment of the present invention. As shown in figure 3, the sound box may include:

[0119] a first collection unit 301, configured to collect the user expression and/or user action made by the user;

[0120] a first determining unit 302, configured to determine the virtual character that the user likes and generate a virtual expression and/or virtual action for the virtual character, where the virtual expression and/or virtual action matches the user expression and/or user action;

[0121] a matching unit 303, configured to, when it is detected that the user makes a certain target user expression and/or target user action, match a target virtual expression and/or target virtual action that matches the target user expression and/or target user action;

[0122] a control unit 304, configured to control the virtual character to make the target virtual expression and/or target virtual action to interact w...
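
To make the unit structure of Embodiment 3 concrete, here is a minimal structural sketch that mirrors units 301-304 as one class each, wired together in the order of the method steps. The class names, stubbed method bodies, and example behaviors are assumptions for illustration only, not the patent's implementation.

# Structural sketch of the sound box: one class per unit (301-304).
# Real units would wrap the camera, a user-preference store, the
# matching table, and the character renderer, respectively.
class FirstCollectionUnit:      # unit 301
    def collect(self) -> str:
        """Collect the user expression and/or user action (stubbed)."""
        return "smile"

class FirstDeterminingUnit:     # unit 302
    def determine_character(self, user_id: str) -> str:
        """Determine the virtual character the user likes (stubbed)."""
        return "default_character"

class MatchingUnit:             # unit 303
    def match(self, user_behavior: str) -> str:
        """Match a target virtual expression/action to the user behavior."""
        return {"smile": "smile_back"}.get(user_behavior, "idle")

class ControlUnit:              # unit 304
    def interact(self, character: str, virtual_behavior: str) -> None:
        """Control the virtual character to perform the matched behavior."""
        print(f"{character} performs {virtual_behavior}")

# Wiring the units together, mirroring the flow of steps 201-204:
if __name__ == "__main__":
    behavior = FirstCollectionUnit().collect()
    character = FirstDeterminingUnit().determine_character("user-1")
    target = MatchingUnit().match(behavior)
    ControlUnit().interact(character, target)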



Abstract

The embodiment of the invention relates to the technical field of electronic equipment, and discloses a human-computer interaction method based on user behaviors and a sound box. The method comprises the steps of: collecting a user expression and/or a user action made by a user; determining a virtual character that the user is interested in, and generating a virtual expression and/or a virtual action for the virtual character, the virtual expression and/or the virtual action being matched with the user expression and/or the user action; when it is detected that the user makes a certain target user expression and/or a target user action, obtaining, through matching, a target virtual expression and/or a target virtual action matched with the target user expression and/or the target user action; and controlling the virtual character to make the target virtual expression and/or the target virtual action to interact with the user. By implementing the embodiment of the invention, the expressions or actions of the virtual character in the intelligent equipment can be enriched, and the human-computer interaction experience of the user can be improved.

Description

Technical field

[0001] The invention relates to the technical field of electronic equipment, and in particular to a user-behavior-based human-computer interaction method and a sound box.

Background technique

[0002] With the development of artificial intelligence technology, more and more smart devices are equipped with virtual characters developed on the basis of artificial intelligence technology. Because they can show human-like expressions or actions, virtual characters are used to represent smart devices in communicating with users; such interaction can shorten the distance between the user and the device and improve the user's human-computer interaction experience. However, the expressions or actions that virtual characters in traditional smart devices can present are all set by default when the device leaves the factory, so they are limited and monotonous, which degrades the user's human-computer interaction experience.

Contents of the invention

[0003] The embodiment of the present invention discloses a human-computer interaction method b...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01, G06K9/00
CPC: G06F3/017, G06F2203/012, G06V40/174
Inventor: 尚宇翔
Owner: GUANGDONG XIAOTIANCAI TECH CO LTD