
Method for recognizing intention in cooking process and intelligent cooking equipment

A technology relating to cooking equipment and the cooking process, applied in voice recognition, cooking utensils, instruments, etc., which can solve problems such as a blocked cooking flow, inability to support device context scenarios, and lack of...

Active Publication Date: 2021-10-08
JOYOUNG CO LTD
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0002] While the user is cooking, the robot guides the cooking according to a preset workflow. If the user asks the robot a question at this point, the existing robot, lacking an intention-judgment mechanism, does not know whether the question relates to the current cooking scene. It therefore cannot accurately decide whether to provide content related to the current cooking scene or to provide general encyclopedic knowledge. In this case it may need to conduct multiple rounds of inquiry with the user to fill in slot data, which blocks the cooking process and makes question answering inefficient.
[0003] The shortcomings of current intelligent question-answering solutions are mainly: intent processing is still serial, with no support for handling multiple intent questions from the same user in parallel; device context scenarios are not supported; and the number of slots in multi-round question answering is fixed, so intent nodes cannot be processed flexibly and intent state cannot be tracked based on device data. Such solutions therefore struggle with time-consuming, clearly time-ordered scenarios such as cooking guidance.

Method used



Examples


Embodiment 1

[0095] As shown in Figure 2, an embodiment of the present invention further provides an intelligent cooking device, comprising:

[0096] a semantic parsing module, configured to receive the user's voice information and obtain dictionary data corresponding to the voice information through semantic analysis;

[0097] a status module, configured to acquire status parameters from the cooking device's sensors and determine the working status data of the cooking device;

[0098] an intention module, configured to determine the user's current intention according to the dictionary data and the working status data.

[0099] In this embodiment of the invention, the intention module's determination of the user's current intention according to the dictionary data and the working status data means:

[0100] performing discretization and missing-value processing on the dictionary data and the working status data, and determining the support S and...
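The discretization and support computation referred to in [0100] might be sketched as follows. This is a minimal illustration only: the bucket boundaries, feature names, and record layout are assumptions, and the truncated paragraph does not disclose the patent's full rule-mining procedure beyond "support S".

```python
def discretize_temp(celsius):
    # Bucket a continuous sensor reading into a coarse working-state label
    # (the boundaries here are illustrative assumptions, not from the patent).
    if celsius < 40:
        return "idle"
    if celsius < 90:
        return "heating"
    return "boiling"

def support(records, antecedent, consequent):
    # Support S of the rule antecedent -> consequent over historical
    # interaction records: the fraction of records that contain all
    # antecedent features AND carry the consequent intent label.
    both = sum(1 for r in records
               if antecedent <= r["features"] and r["intent"] == consequent)
    return both / len(records)

# Hypothetical interaction history: dictionary-data keywords plus a
# discretized device working state, each labeled with the resolved intent.
records = [
    {"features": {"keyword:time", "state:boiling"}, "intent": "cooking_guidance"},
    {"features": {"keyword:time", "state:idle"},    "intent": "encyclopedia"},
    {"features": {"keyword:time", "state:boiling"}, "intent": "cooking_guidance"},
    {"features": {"keyword:recipe", "state:heating"}, "intent": "cooking_guidance"},
]

s = support(records, {"keyword:time", "state:boiling"}, "cooking_guidance")
print(s)  # 0.5
```

The same question ("keyword:time") maps to different intents depending on the working state, which is the point of combining dictionary data with device status.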

Embodiment 2

[0104] As shown in Figure 3, this embodiment illustrates the process of intention recognition during cooking:

[0105] S11. Voice recognition: recognize the user's voice through a single microphone or a microphone array, and convert the speech into the voice text actually expressed by the user;

[0106] S12. Sensor collection: collect data through the sensors carried by the cooking equipment or the sensors of an associated robot;

[0107] S13. Semantic understanding: perform semantic understanding on the voice text, or encapsulate the sensor data and voice text into a unified data format for semantic understanding; semantically parse the voice text in the user's expression and extract dictionary data, i.e. dictionaries and their corresponding values. The user-expression data includes, but is not limited to: voice text (field text), sensor type (field sensorType), sensor data (field data), device model (field devType), device ID (field devId), us...
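Paragraph [0107] names the fields of the unified user-expression format (text, sensorType, data, devType, devId). A minimal sketch of such an envelope, using only the field names the paragraph actually lists, with invented example values:

```python
import json

# Unified user-expression envelope with the field names from [0107];
# the concrete values below are illustrative assumptions only.
user_expression = {
    "text": "how much longer until the soup is done",  # recognized voice text
    "sensorType": "temperature",                       # type of sensor reading
    "data": 96.5,                                      # sensor reading value
    "devType": "soup-cooker",                          # device model
    "devId": "JY-0001",                                # device ID
}

# Serialize for transport to the semantic-understanding stage.
payload = json.dumps(user_expression)
print(json.loads(payload)["sensorType"])  # temperature
```

Packing the sensor reading alongside the voice text is what lets the later intention stage see both signals in a single request.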

Embodiment 3

[0113] As shown in Figures 4 and 5, this embodiment takes a smart device collecting voice information and a cloud server performing intent recognition as an example to illustrate the process of intent recognition during cooking:

[0114] If the device the user interacts with is a smart device, then when the user performs voice interaction, the smart device's client simultaneously collects the sensor data of the cooking device it carries; the sensor data and voice data are then packaged together as user data and uploaded to the cloud server. In addition, depending on the device type, the client also periodically collects the sensor data of the cooking device, encapsulates it, and uploads it to the server.
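The client-side packaging step described in [0114] could be sketched like this. The field layout, the audio encoding, and the endpoint constant are assumptions for illustration; the patent does not specify a wire format.

```python
import json
import time

# Placeholder endpoint; the patent does not disclose a real URL.
CLOUD_ENDPOINT = "https://example.invalid/intent"

def package_user_data(voice_bytes, sensor_readings, dev_type, dev_id):
    # Bundle the voice capture and the co-collected sensor data into one
    # upload unit, per [0114] (field names are illustrative assumptions).
    return {
        "devType": dev_type,
        "devId": dev_id,
        "timestamp": int(time.time()),
        "voice": voice_bytes.hex(),   # encode raw audio bytes for JSON transport
        "sensors": sensor_readings,   # e.g. {"temperature": 96.5}
    }

pkg = package_user_data(b"\x00\x01", {"temperature": 96.5}, "soup-cooker", "JY-0001")
print(json.dumps(pkg)[0])  # {
```

A periodic background task would call the same packaging function on a timer to implement the regular sensor-only uploads the paragraph mentions.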

[0115] A smart device is a device that supports voice interaction, such as a smart kitchen appliance or a smart robot, and also includes smart terminals such as mobile phones. ...


PUM

No PUM data available.

Abstract

The present application proposes an intention recognition method for the cooking process and an intelligent cooking device. The method includes: receiving the user's voice information and acquiring dictionary data corresponding to the voice information through semantic analysis; acquiring the status parameters of the cooking device's sensors and determining the working status data of the cooking device; and determining the user's current intention according to the dictionary data and the working status data. By semantically analyzing the user's voice information and combining it with the working state determined from the state parameters collected by the cooking device, the invention judges the user's intention and improves the question-answering efficiency of the cooking device.

Description

Technical Field

[0001] The invention relates to the field of intelligent equipment control, and in particular to an intention recognition method in a cooking process and an intelligent cooking device.

Background

[0002] While the user is cooking, the robot guides the cooking according to a preset workflow. If the user asks the robot a question at this point, the existing robot, lacking an intention-judgment mechanism, does not know whether the question relates to the current cooking scene. It therefore cannot accurately decide whether to provide content related to the current cooking scene or to provide general encyclopedic knowledge. In this case it may need to conduct multiple rounds of inquiry with the user to fill in slot data, which blocks the cooking process and makes question answering inefficient.

[0003] The shortcomings of current intelligent question-answering solutions are mainly reflected in...

Claims


Application Information

Patent Timeline
Patent Type & Authority: Patent (China)
IPC (8): A47J36/00; A47J27/00; G06F40/30; G06F16/332; G10L15/26
CPC: A47J27/00; A47J36/00; G10L15/26
Inventor: 朱泽春, 苗忠良, 王忠
Owner: JOYOUNG CO LTD