Entrance-control-free unmanned store checkout method based on pure vision

A store and vision technology, applied in the field of pure-vision-based checkout for entrance-control-free unmanned stores, which can solve problems such as the large amount of manpower required, and achieve the effects of extending business hours, reducing the financial resources needed and saving time.

Pending Publication Date: 2021-06-04
HARBIN INST OF TECH

AI Technical Summary

Problems solved by technology

[0008] The present invention solves the problem that existing shopping stores require a large amount of manpower for checkout. To this end, the present invention provides a pure-vision-based checkout method for an entrance-control-free unmanned store, with the following technical solution:


Examples


Specific Embodiment 1

[0038] As shown in Figures 1-4, the present invention provides a pure-vision-based checkout method for an entrance-control-free unmanned store, comprising the following steps:

[0039] A checkout method for an entrance-control-free unmanned store based on pure vision comprises the following steps:

[0040] Step 1: Train an action discrimination model to determine, for a scene, the customer's take or put-back actions;

[0041] Step 1 specifically comprises:

[0042] Step 1.1: Train an action discrimination model. Obtain the continuous RGB video frame stream and optical flow information from the video recorded by the camera, extract features with a neural network, and detect actions from the extracted features, judging for each frame of the video whether a pick-or-put action occurs; each frame in which such an action occurs is recorded as a key frame;
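The patent does not disclose the network architecture, but Step 1.1 describes the classic two-stream idea: appearance features from RGB frames and motion features from stacked optical flow. A minimal sketch of that idea in PyTorch, with all layer sizes, the flow-stack depth and the 0.5 threshold assumed for illustration rather than taken from the patent:

```python
# Hypothetical two-stream sketch: one branch sees an RGB frame, the other a
# stack of optical-flow fields; fused features give a per-frame probability
# that a pick-or-put action occurs in that frame.
import torch
import torch.nn as nn

class TwoStreamActionDetector(nn.Module):
    def __init__(self, flow_stack=10):
        super().__init__()
        def branch(in_ch):
            return nn.Sequential(
                nn.Conv2d(in_ch, 32, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.rgb = branch(3)                # one RGB frame
        self.flow = branch(2 * flow_stack)  # x/y flow for the last N frames
        self.head = nn.Linear(64 + 64, 1)   # action vs. no action

    def forward(self, rgb_frame, flow_frames):
        feats = torch.cat([self.rgb(rgb_frame), self.flow(flow_frames)], dim=1)
        return torch.sigmoid(self.head(feats))  # P(pick-or-put in this frame)

# Frames whose predicted probability exceeds the threshold become key frames.
model = TwoStreamActionDetector()
p = model(torch.randn(1, 3, 224, 224), torch.randn(1, 20, 224, 224))
is_key_frame = p.item() > 0.5
```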

[0043] Step 1.2: For a scene, respectively determine the time stamps of the key frames at which items are taken or put bac...
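Step 1.2 is truncated above, but together with Embodiment 2 it evidently pools per-frame key-frame detections into per-scene timestamps. A hedged sketch of one way to do this; the frame rate and the merging gap are assumptions, not values from the patent:

```python
# Hypothetical sketch: collapse runs of consecutive key-frame detections into
# one timestamp per action, then pool the timestamps for the whole scene.
def scene_timestamps(key_frame_flags, fps=25.0, min_gap_frames=5):
    """key_frame_flags: per-frame booleans from the action discrimination model."""
    timestamps, last_idx = [], None
    for idx, is_key in enumerate(key_frame_flags):
        if is_key and (last_idx is None or idx - last_idx > min_gap_frames):
            timestamps.append(idx / fps)  # seconds into the video
        if is_key:
            last_idx = idx
    return timestamps  # one entry per detected take or put-back action

flags = [False] * 100
flags[40:43] = [True] * 3       # one action spanning frames 40-42
print(scene_timestamps(flags))  # -> [1.6]
```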

Specific Embodiment 2

[0058] Take and put-back judgment: Train an action discrimination model to judge, for each frame of the video, whether a take or put-back action occurs; each frame in which such an action occurs is recorded as a key frame. For a scene, find the timestamps of the key frames at which items are taken or put back, and pool these timestamps as the timestamps of the entire scene. Then, from each of the 12 videos, take the frames close to these timestamps: 3 frames before and 10 frames after each timestamp. The full set of timestamps represents the number of take or put-back actions, so that as many actions as possible are found. The purpose of taking 3 frames before and 10 frames after the same timestamp in all 12 videos is to better detect the goods in the hand. It must then be judged whether the action is a take or a put-back: if there is a product in the hand in the few frames before the key frame, and no product in the hand in the few frames after, it is a put-back operation; in the few frames before th...
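A sketch of the take/put-back rule this paragraph describes, applied to one of the 12 videos; the product_in_hand predicate stands in for the commodity-in-hand detector and is hypothetical, while the window sizes (3 before, 10 after) come from the paragraph itself:

```python
# Hypothetical sketch of the take/put-back rule: sample 3 frames before and
# 10 frames after a key frame and compare whether a product is in the hand.
def classify_action(video, key_idx, product_in_hand, before=3, after=10):
    """product_in_hand(frame) -> bool comes from the hand-commodity detector."""
    pre = [product_in_hand(video[i])
           for i in range(max(0, key_idx - before), key_idx)]
    post = [product_in_hand(video[i])
            for i in range(key_idx + 1, min(len(video), key_idx + 1 + after))]
    if any(pre) and not any(post):
        return "put_back"  # product held before, hand empty after
    if not any(pre) and any(post):
        return "take"      # hand empty before, product held after
    return "uncertain"     # defer to the other cameras at this timestamp
```

When one camera's view is occluded and returns "uncertain", the same windows sampled from the remaining videos can break the tie, which is presumably why the paragraph samples identical windows from all 12 cameras.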



Abstract

The invention relates to an entrance-control-free unmanned store checkout method based on pure vision, in the technical field of artificial intelligence management. The method specifically comprises the steps of: training an action discrimination model and determining, for a scene, a customer's take or put-back actions; building a convolutional neural network model, training it on product pictures, predicting, classifying and testing the pictures, and detecting the commodity type in the hand; establishing the relative relationship between the commodity and the customers, and selecting the customer closest to the commodity as the initiator of the action; and carrying out person re-identification and face recognition on the customer to determine the customer's identity and perform checkout. The method solves the problem that existing shopping stores need a large amount of manpower for checkout, overcomes the defects of the prior art, and achieves the technical effects of flexible site selection, improved sales efficiency, extended business hours, and a large reduction in the financial resources required.
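As a hedged illustration of the "customer closest to the commodity" step in the abstract, the sketch below picks the action's initiator by bounding-box center distance; the (x1, y1, x2, y2) detection format and the 2-D distance are assumptions:

```python
# Hypothetical sketch: attribute the action to the customer whose bounding-box
# center is nearest to the commodity's center, as the abstract describes.
import math

def center(box):  # box = (x1, y1, x2, y2) in pixels
    return ((box[0] + box[2]) / 2, (box[1] + box[3]) / 2)

def action_initiator(commodity_box, customer_boxes):
    cx, cy = center(commodity_box)
    return min(customer_boxes, key=lambda b: math.dist((cx, cy), center(b)))

customers = {"A": (0, 0, 50, 120), "B": (200, 10, 260, 130)}
nearest = action_initiator((40, 60, 70, 90), list(customers.values()))
print([name for name, box in customers.items() if box == nearest])  # -> ['A']
```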

Description

technical field

[0001] The invention relates to the technical field of artificial intelligence management, and is a pure-vision-based checkout method for an unmanned store without entrance control.

Background technique

[0002] At present, general stores have salespersons or cashiers, and the goods purchased by users are settled through the salespersons or cashiers. However, when many users are purchasing goods, they often need to queue, so checkout takes a long time. In addition, the labor cost of the salesperson or cashier has to be paid.

[0003] Therefore, in order to solve the above-mentioned problems, unmanned stores have appeared in the prior art; their site selection is flexible, which can improve sales efficiency and extend business hours. At present, unattended vending boxes generally use RFID tags attached to commodities as scanning marks. However, pasting RFID tags has several disadvantages: (1) The cost of using ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62; G06K9/00; G06N3/04
CPC: G06V40/173; G06V40/107; G06N3/045; G06F18/2155; G06F18/2411
Inventors: 李治军, 张倩倩
Owner: HARBIN INST OF TECH