
Intelligent goods pickup identification method based on depth vision

A recognition method based on deep vision technology, applicable to neural learning methods, character and pattern recognition, instruments, etc. It addresses the high cost of RFID tags, high labor costs, and the low intelligence of conventional smart containers.

Pending Publication Date: 2020-09-22
无锡雪浪数制科技有限公司

AI Technical Summary

Problems solved by technology

However, the operating cost of RFID-based smart containers is very high. On one hand, RFID tags must be manually affixed to every commodity, which incurs high labor costs; the tags themselves are also expensive, becoming an additional expense that suppliers and consumers have to bear.
On the other hand, the intelligence of traditional smart containers is relatively low: the number of commodity types they can supply is usually limited by the number of partitioned areas in the container, each hard partition can hold only one item or one type of item, and storage flexibility is extremely low.



Examples


Embodiment Construction

[0066] The specific embodiments of the present invention will be further described below with reference to the accompanying drawings.

[0067] The present application discloses a deep-vision-based intelligent pick-up recognition method. The method flow chart is shown in Figure 1; the recognition method includes the following steps:

[0068] Step 1: Obtain the user operation image through the camera set in the smart container, as shown in Figure 2.

[0069] Step 2: Extract the foreground image of the hot spot area from the user operation image by the frame difference method.

[0070] Step 201: Load the user operation image, and acquire the previous frame image and the next frame image of the current user operation image.

[0071] Step 202: Convert the current user operation image, the previous frame image, and the next frame image into grayscale images, respectively obtain the two-way grayscale difference between the current user operation image, the...
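Steps 201–202 describe the classic three-frame difference method: the current frame is differenced against both its neighbors, and only pixels that change in both directions are kept as moving foreground. A minimal NumPy sketch of that idea follows; the threshold value and function name are illustrative assumptions, not the patent's exact implementation:

```python
import numpy as np

def three_frame_difference(prev_gray, curr_gray, next_gray, thresh=25):
    """Extract a foreground mask via the three-frame difference method.

    Each input is a 2-D uint8 grayscale image of identical shape.
    The threshold (25) is an assumed value for illustration.
    """
    # Two-way absolute grayscale differences around the current frame.
    # Cast to int16 first so the subtraction cannot wrap around.
    d1 = np.abs(curr_gray.astype(np.int16) - prev_gray.astype(np.int16))
    d2 = np.abs(next_gray.astype(np.int16) - curr_gray.astype(np.int16))
    # Binarize each difference, then AND them: only pixels that changed
    # relative to BOTH neighboring frames count as moving foreground.
    mask = (d1 > thresh) & (d2 > thresh)
    return mask.astype(np.uint8) * 255
```

In the patent's setting, the resulting mask would be used to crop the hot-spot-region foreground image that is fed to the downstream detection network.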



Abstract

The invention discloses an intelligent goods pickup recognition method based on depth vision, and relates to the technical field of machine vision. The method comprises the steps of: obtaining a user operation image and extracting a hot-spot-region foreground image from it; inputting the foreground image into a hand-held-region target detection convolutional neural network and outputting a hand-held article region screenshot; inputting that screenshot into an article target segmentation convolutional neural network and outputting an article segmentation image group; inputting the segmentation image group into an article target classification convolutional neural network and outputting an article classification result, which comprises the hand region and the category and number of each article; and performing target tracking on the user operation actions acquired by the camera, outputting each user operation time-sequence detection result, performing comprehensive analysis, and outputting the category and quantity of the articles taken out by the operation. The user's operation behavior is intelligently detected and recognized through deep-vision video image analysis technology, improving object recognition precision.
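The abstract's final "comprehensive analysis" step combines per-frame, time-sequence detection results into one answer for the whole pickup operation. The patent does not specify the aggregation rule; a simple majority vote over the per-frame classifications is one plausible sketch (function and type names here are hypothetical):

```python
from collections import Counter
from typing import List, Tuple

# One per-frame detection: (article category, quantity seen in that frame).
Detection = Tuple[str, int]

def aggregate_over_time(per_frame: List[Detection]) -> Detection:
    """Comprehensive analysis of a time sequence of per-frame detections:
    here, simply the most frequent (category, quantity) pair wins."""
    (category, count), _votes = Counter(per_frame).most_common(1)[0]
    return category, count

# A noisy frame ("chips") is outvoted by the consistent majority.
print(aggregate_over_time([("cola", 1), ("cola", 1), ("chips", 1)]))  # → ('cola', 1)
```

In practice this step would also incorporate the target-tracking trajectory of the user's hand, so that only articles actually carried out of the container are counted.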

Description

Technical Field

[0001] The invention relates to the technical field of machine vision, in particular to an intelligent pick-up recognition method based on depth vision.

Background Technique

[0002] In material management, store retailing, and similar settings, the container is an essential item display device. In the 1990s, vending machines were introduced to China from Europe, the United States, and Japan. Traditional vending machines are mainly hardware-driven: users pay with banknotes and coins, and the machine dispenses goods through springs. However, traditional vending machine companies have failed to effectively solve cost, quality, operation, and many other problems, so not only is the number of vending machines in the domestic market small, but the variety is also very limited, consisting mainly of bottled beverage machines.

[0003] In recent years, under the new retail trend, unmanned shelves have surged in popularity. Most of the unmanned shelves ar...

Claims


Application Information

IPC(8): G06K9/00, G06K9/20, G06K9/34, G06N3/04, G06N3/08
CPC: G06N3/08, G06V40/28, G06V20/48, G06V20/41, G06V10/22, G06V10/267, G06N3/045
Inventor: 丁发展, 姜鹏
Owner 无锡雪浪数制科技有限公司