Man-machine voice interaction system and method for shopping bin
What is the AI technical title?
The AI technical title is generated by the PatSnap AI team. It summarizes the key technical points of the patent document.
A man-machine voice interaction technology, applied in the field of intelligent interaction, which addresses problems such as the lack of intelligent automated service, reduced shopping efficiency, and high transaction management costs.
Inactive Publication Date: 2020-09-11
Tianjin Yuandian Brand Management Co., Ltd. (天津原点品牌管理有限公司)
Cites: 0 | Cited by: 1
AI Technical Summary
Problems solved by technology
[0011] The human-computer interaction functions of existing shopping warehouses are simple and lack intelligent automated service, which reduces shopping efficiency. Moreover, the transaction management costs of existing shopping warehouses are high.
[0012] In the prior art, the training sets of smart terminals are usually small-sample sets, so directly applying face recognition algorithms performs poorly.
Method used
Examples
Embodiment 1
[0095] As shown in Figure 1, the present invention provides a human-computer voice interaction method for a shopping warehouse, comprising:
[0096] S101. Divide the pronunciation features of different users entering and exiting the shopping warehouse along multiple dimensions, and sample the data;
[0097] S102. Obtain multi-dimensional information from the user's voice and user ID, and determine a recognition model;
[0098] S103. Collect a frontal photo of the user through a camera device;
[0099] S104. Detect the collected image with the face detection device and perform human eye positioning;
[0100] S105. Crop the face area from the image with preprocessing software and perform grayscale processing on it; save the processed image information as training sample data to the data storage center;
[0101] S106. Extract the SIFT features in the image with feature extraction software and save them to the data storage center as training sample ...
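The preprocessing steps S103–S106 can be sketched as follows. This is a minimal illustration in plain Python, not the patent's implementation: the function names, the tiny stand-in frame, and the hard-coded crop coordinates are all assumptions, and a real system would use an actual camera, face detector, and SIFT extractor.

```python
# Illustrative sketch of the Embodiment 1 preprocessing pipeline (S103-S106),
# using plain Python lists instead of a real camera / face-detection stack.
# All names and values here are hypothetical, not from the patent.

def to_grayscale(rgb_image):
    """S105 grayscale processing: convert an RGB image (rows of
    (r, g, b) tuples) using standard ITU-R BT.601 luminance weights."""
    return [
        [round(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
        for row in rgb_image
    ]

def crop_region(image, top, left, height, width):
    """S105: crop the face area found by the (hypothetical) detector."""
    return [row[left:left + width] for row in image[top:top + height]]

# Tiny 2x2 stand-in for a captured frontal frame (S103).
frame = [
    [(255, 0, 0), (0, 255, 0)],
    [(0, 0, 255), (255, 255, 255)],
]

gray = to_grayscale(frame)             # S105: grayscale processing
face = crop_region(gray, 0, 0, 2, 2)   # S105: intercept the face area

training_store = []                    # stand-in for the data storage center
training_store.append(face)            # S105/S106: save as a training sample
```

A real SIFT extraction step (S106) would follow the same pattern: compute features from `face` and append them to the storage center alongside the grayscale sample.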
Embodiment 2
[0160] The system of the present invention includes: a dialect layer, a platform layer, an age layer, a gender layer, and a domain layer;
[0161] Through accent discrimination, the user ID and user voice are respectively sampled for the gender, age, platform, and dialect layers;
[0162] The gender layer includes: male and female;
[0163] The platform layer includes: iOS, Android, and Windows;
[0164] The dialect layer includes Mandarin, a dialect, or a combination thereof.
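The layer taxonomy above can be modeled as a small data structure. This is a hedged sketch: the class name, field names, and age bands are illustrative assumptions, while the enumerated gender, platform, and dialect values mirror the text.

```python
# Hypothetical data model for the recognition layers described in
# [0160]-[0164]: gender, platform, and dialect values follow the text;
# the profile class and age bands are assumptions for illustration.
from dataclasses import dataclass

GENDERS = {"male", "female"}
PLATFORMS = {"iOS", "Android", "Windows"}
DIALECTS = {"Mandarin", "dialect"}  # one or a combination

@dataclass
class UserProfile:
    user_id: str
    gender: str          # gender layer
    platform: str        # platform layer
    age_band: str        # age layer; banding scheme is an assumption
    dialects: frozenset  # dialect layer: Mandarin and/or dialects

    def __post_init__(self):
        # Validate against the enumerations from the patent text.
        assert self.gender in GENDERS
        assert self.platform in PLATFORMS
        assert self.dialects <= frozenset(DIALECTS)

profile = UserProfile("u001", "female", "Android", "18-30",
                      frozenset({"Mandarin"}))
```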
[0166] (1) Divide the pronunciation characteristics of different users who enter and exit the shopping warehouse in multiple dimensions;
[0167] (2) Perform dynamic updates;
[0168] (3) Architecting multiple domains.
[0169] In (1), the data is sampled according to six dimensions: geographical distribution, accent distribution, noise distribution, age distribution, male-to-female ratio, and device platform.
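Sampling along the six dimensions of [0169] can be illustrated with a small grouping helper. Only the dimension names come from the text; the record format, helper name, and sample values are assumptions.

```python
# Illustrative bucketing of voice samples along the six sampling
# dimensions listed in [0169]. Record fields and values are hypothetical.
from collections import defaultdict

DIMENSIONS = ("region", "accent", "noise", "age", "gender", "platform")

def group_samples(samples, dimension):
    """Bucket sample records by one of the six sampling dimensions."""
    assert dimension in DIMENSIONS
    buckets = defaultdict(list)
    for sample in samples:
        buckets[sample[dimension]].append(sample)
    return dict(buckets)

samples = [
    {"region": "Tianjin", "accent": "northern", "noise": "low",
     "age": "18-30", "gender": "female", "platform": "Android"},
    {"region": "Tianjin", "accent": "northern", "noise": "high",
     "age": "31-45", "gender": "male", "platform": "iOS"},
]

by_platform = group_samples(samples, "platform")  # buckets: Android, iOS
```

Grouping by each dimension in turn gives the per-dimension distributions that the sampling step needs before the recognition model is trained.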
[0170] The geographical...
Abstract
The invention discloses a man-machine voice interaction system and method for a shopping bin, and relates to the technical field of intelligent interaction. The method comprises the steps of sampling data; acquiring multi-dimensional information by using the user's voice and a user identifier, and determining a recognition model; allowing a user to collect a frontal image through photographing equipment; detecting the acquired image through face detection equipment, and carrying out human eye positioning; intercepting a face area in the image, and performing graying processing; storing the processed data; extracting SIFT features in the image to serve as training sample data to be stored in a data storage center; dynamically updating recognition resources according to the division; and determining the information of different users entering and exiting the shopping bin by using voice-recognized text. According to the man-machine voice interaction method for the shopping bin, the functions are diversified, the working efficiency is high, and the maintenance cost is low; new modules can be added during later maintenance and updating to meet the requirements of the later system, making the functions of the method more complete.
Description
Technical Field
[0001] The present invention relates to the field of intelligent interaction technology, and in particular to a human-machine voice interaction system and method for a shopping warehouse.
Background
[0002] At the present stage, a manually staffed shopping warehouse requires manual counting and checking of goods as shoppers enter and exit, and users may enter or leave the shopping warehouse only after all information on the goods is confirmed to be correct. The efficiency of entering and leaving is therefore too low to support today's high-traffic warehouse business. Furthermore, improving this efficiency requires additional staff to handle shopping in and out of the warehouse, which increases operating costs. ...
Claims
Application Information
Patent Timeline
Application Date: The date an application was filed.
Publication Date: The date a patent or application was officially published.
First Publication Date: The earliest publication date of a patent with the same application number.
Issue Date: The publication date of the patent grant document.
PCT Entry Date: The entry date of the PCT national phase.
Estimated Expiry Date: The statutory expiry date of the patent right under the Patent Law; this is the longest term of protection the patent can achieve, assuming the right is not terminated earlier for other reasons (term extension factors have been taken into account).
Invalid Date: The actual expiry date, based on the effective date or the publication date of the legal-transaction data for an invalidated patent.