Action simulation interaction method and device for an intelligent device, and intelligent device

A smart-device action simulation technology, applied in user/computer interaction input/output, mechanical mode conversion, and character and pattern recognition. It addresses the problems of a missing action simulation process, low information dissemination efficiency, and limited information volume, and achieves vivid interactive forms, accurate information interaction, and improved accuracy.

Active Publication Date: 2017-04-05
北京如布科技有限公司



Examples


Embodiment 1

[0120] The principle and steps of the action simulation interaction method for smart devices of the present invention will be described below with reference to Embodiment 1.

[0121] Step 1: Analyze and extract the selected action samples to obtain corresponding action instructions

[0122] First, action videos of various animals are collected, either by searching the Internet or by capturing them in real time with the intelligent robot's camera. These may include action videos of cats, action videos of dogs, and the like. More fine-grained action samples can also be defined, such as a cat sleeping, a cat walking, or a dog barking. These action videos serve as the action samples to be imitated.

[0123] Then, feature extraction is performed on the acquired action samples frame by frame, and recognition is performed based on the extracted features to obtain action instructions corresponding to the action samples. Then, split...
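The frame-by-frame extraction and recognition described in [0123] can be sketched as follows. This is a minimal illustrative stand-in, not the patent's actual implementation: the per-frame feature, the prototype table, and the instruction lookup (`ACTION_INSTRUCTIONS`) are all invented for the example.

```python
# Hypothetical sketch of Step 1: extract a feature per frame, classify the
# clip against labeled prototypes, and map the recognized action to its
# decomposition and scheduling instructions. All names are assumptions.

def extract_features(frame):
    """Toy per-frame feature: mean intensity and intensity range."""
    return (sum(frame) / len(frame), max(frame) - min(frame))

# Assumed lookup: action label -> (decomposition instrs, scheduling instrs).
ACTION_INSTRUCTIONS = {
    "dog_barking": (["open_mouth", "close_mouth"], ["play_bark_sound"]),
    "cat_walking": (["lift_leg", "shift_weight"], ["loop_gait"]),
}

def recognize_action(frames, prototypes):
    """Classify a clip by nearest prototype over averaged frame features."""
    feats = [extract_features(f) for f in frames]
    avg = tuple(sum(v) / len(v) for v in zip(*feats))

    def dist(label):
        return sum((a - b) ** 2 for a, b in zip(avg, prototypes[label]))

    label = min(prototypes, key=dist)
    return label, ACTION_INSTRUCTIONS[label]
```

A real system would replace the toy feature with learned visual features, but the shape of the step (frames in, action label and instruction sets out) is the same.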

Embodiment 2

[0187] The principle and steps of the action simulation interaction method for smart devices of the present invention will be further described with reference to Embodiment 2 below.

[0188] In this embodiment, steps 1 and 2 are the same as in embodiment 1.

[0189] In step 3, the voice command issued by the user is "Imitate a dog barking", and the voice command is converted into the target text as "Imitate a dog barking". The verb phrase (that is, "imitate") and the object (that is, "a dog's barking") in the target text can be obtained through semantic analysis. Therefore, the target event corresponding to the target text is "imitating a dog's barking".
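The verb-phrase/object split described above can be sketched with a rule-based parse. A production system would use a proper semantic analyzer; the verb list and the article-stripping rule here are purely illustrative assumptions.

```python
# Minimal sketch of the Step 3 semantic analysis: split a command like
# "imitate a dog barking" into a verb phrase and its object, and form the
# target event. Rule-based and illustrative only.

IMITATION_VERBS = {"imitate", "mimic", "simulate"}  # assumed verb set

def parse_command(target_text):
    words = target_text.lower().split()
    if not words or words[0] not in IMITATION_VERBS:
        return None  # not an action-simulation request
    verb, obj = words[0], " ".join(words[1:])
    # Drop a leading article from the object for a cleaner event name.
    for article in ("a ", "an ", "the "):
        if obj.startswith(article):
            obj = obj[len(article):]
            break
    return {"verb": verb, "object": obj, "event": f"{verb} {obj}"}
```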

[0190] In step 4, since the event matching model established in step 2 cannot identify "dogs", the probability for all animals is 0:

[0191] p(dog barking | dog) = 0, p(dog barking | tiger) = 0, p(dog barking | cat) = 0

[0192] In this case, the smart device asks the user: "What kind of animal is a dog?", and the user replies: "Dog"...
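The fallback behavior in [0190]–[0192] — all match probabilities zero, so the device queries the user and remembers the answer — can be sketched like this. The probability table and the `ask_user` callback are invented stand-ins, not the patent's event matching model.

```python
# Sketch of Step 4: look up the target event in an assumed event-matching
# table; when the event is unknown (every probability 0), ask the user
# and record the answer for next time. Illustrative only.

EVENT_MODEL = {
    # Assumed P(event | animal) for events the model was trained on.
    "cat meowing": {"cat": 0.9, "tiger": 0.1},
    "tiger roaring": {"cat": 0.05, "tiger": 0.95},
}

def match_event(event, ask_user):
    probs = EVENT_MODEL.get(event, {})
    if probs:
        return max(probs, key=probs.get)
    # Unknown event: query the user, then remember the mapping.
    animal = ask_user(f"What kind of animal makes the sound '{event}'?")
    EVENT_MODEL[event] = {animal: 1.0}
    return animal
```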



Abstract

The invention provides an action simulation interaction method and device for an intelligent device, and the intelligent device. The method includes the following steps: analyzing and extracting a selected action sample to obtain corresponding action instructions, wherein the action instructions comprise an action decomposition instruction and an action scheduling instruction; conducting training based on a set corpus and the action instructions, and building an event matching model; converting a received voice signal into a target text, conducting semantic analysis of the target text, and determining a target event corresponding to the target text; based on the event matching model, obtaining a corresponding target action according to the target event; and calling the action decomposition instruction and the action scheduling instruction corresponding to the target action, and, based on these instructions, driving a corresponding hardware component of the intelligent device.
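The five claimed steps can be wired together as a skeleton pipeline. Every component here (ASR, semantic analyzer, event model, instruction table, hardware driver) is a stand-in callable; the names and wiring are assumptions for illustration, not the patented implementation.

```python
# End-to-end skeleton of the claimed method: voice -> text -> event ->
# action -> instruction sets -> hardware drive. All components injected.

def simulate_action(voice_signal, asr, analyze, event_model, instructions, drive):
    text = asr(voice_signal)              # Step 3: voice signal -> target text
    event = analyze(text)                 # Step 3: semantic analysis -> target event
    action = event_model(event)           # Step 4: event matching -> target action
    decomp, sched = instructions[action]  # Step 5: fetch the two instruction sets
    return drive(decomp, sched)           # Step 5: drive hardware components
```

A usage sketch with dummy components: `simulate_action(signal, asr=..., analyze=..., event_model=..., instructions={...}, drive=...)` returns whatever the driver reports, making each stage independently testable.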

Description

Technical field

[0001] The present disclosure relates to the field of smart devices, and in particular, to an action simulation interaction method and device for smart devices, and a smart device including the action simulation interaction device.

Background technique

[0002] The interaction between traditional smart devices and users is limited to text, sound, and images; in general, such devices accept input only in the form of text and produce output only in the form of sound, text, or images. For example, to query content such as animal encyclopedias or cartoon images on a traditional smart device, users must input text for the query and receive the results as sound, text, or images.

[0003] Most newer smart devices have network modules, and the accepted input forms extend to sound. After receiving a user's voice question, the smart device can upload the voice data through the network module, and after performing voice ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01, G06K9/00
CPC: G06F3/011, G06V40/20
Inventor: 吴芷莹, 叶菲梓, 郭祥
Owner 北京如布科技有限公司